Updated, 4:40 p.m. Eastern.
Facebook’s News Feed—the main list of status updates, messages, and photos you see when you open Facebook on your computer or phone—is not a perfect mirror of the world.
But few users expect Facebook to change their News Feeds in order to manipulate their emotional state.
We now know that’s exactly what happened two years ago. For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves.
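The paper behind the study describes the manipulation in simple terms: a post counted as positive or negative if it contained at least one positive or negative word, as judged by the LIWC word-counting software, and matching posts then had some chance of being omitted from a user’s feed. The Python sketch below is only a rough illustration of that kind of word-list filtering; the word lists, function names, and omission rate are invented for the example and bear no relation to Facebook’s actual code.

```python
import random

# Toy stand-ins for the LIWC positive/negative word categories the
# study reportedly used; the real lists contain thousands of terms.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible"}

def contains_any(post, words):
    """True if the post contains at least one word from the given set."""
    return any(token in words for token in post.lower().split())

def skew_feed(posts, suppress, omit_chance=0.5):
    """Return a feed that randomly omits some posts matching the
    suppressed sentiment and keeps everything else."""
    return [
        post for post in posts
        if not (contains_any(post, suppress) and random.random() < omit_chance)
    ]

posts = [
    "What a wonderful day at the beach!",
    "Feeling sad about the news today.",
    "Lunch was fine, nothing special.",
]

# A "sadder than average" feed: drop some of the positive posts.
sadder_feed = skew_feed(posts, suppress=POSITIVE_WORDS)
```

In the actual experiment, the omission probability reportedly varied from user to user, and control groups had posts withheld at random, without regard to their emotional content.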
This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences. Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. The difference: where earlier studies merely observed user data, this one set out to manipulate it.
The experiment is almost certainly legal. In the company’s current terms of service, Facebook users relinquish the use of their data for “data analysis, testing, [and] research.” Is it ethical, though? Since news of the study first emerged, I’ve seen and heard both privacy advocates and casual users express surprise at the audacity of the experiment.
In the wake of both the Snowden stuff and the Cuba twitter stuff, the Facebook "transmission of anger" experiment is terrifying.
— Clay Johnson (@cjoh) June 28, 2014
Get off Facebook. Get your family off Facebook. If you work there, quit. They're fucking awful.
— Erin Kissane (@kissane) June 28, 2014
We’re tracking the ethical, legal, and philosophical response to this Facebook experiment here.
Did an institutional review board—an independent ethics committee that vets research that involves humans—approve the experiment?
Yes, according to Susan Fiske, the Princeton University psychology professor who edited the study for publication.
“I was concerned,” Fiske told The Atlantic, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time.”
Fiske added that she didn’t want “the originality of the research” to be lost, but called the experiment “an open ethical question.”
“It's ethically okay from the regulations perspective, but ethics are kind of social decisions. There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done...I'm still thinking about it and I'm a little creeped out, too.”
For more, read Atlantic editor Adrienne LaFrance’s full interview with Fiske.
