Facebook Feed Manipulation Study Draws Ire


FILE - Facebook logo on a computer screen is seen through glasses held by a woman.

Social media giant Facebook says it conducted a study that altered some users’ feeds because the company cares about “the emotional impact of Facebook.”

There has been widespread criticism of the experiment, which was conducted without users’ knowledge.

For the study, Facebook, along with researchers at Cornell University and the University of California, San Francisco, manipulated some users’ feeds, favoring more positive posts for some users and more negative posts for others, to see whether emotions are contagious online.

Adam Kramer, a Facebook data scientist and co-author of the study, responded to the outcry on his Facebook page.

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” he wrote. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.”

According to Kramer, only 1 in 2,500 users, more than 689,000 in total, had their feeds manipulated for one week in 2012.

“I can tell you that our goal was never to upset anyone,” he wrote. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

Kramer said the study did show that positive stories in a person’s news feed encouraged positivity, while negative stories had the opposite effect.

“People who had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates,” Jeff Hancock, professor of communication at Cornell’s College of Agriculture and Life Sciences and co-director of its Social Media Lab, said in a statement. “When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in people’s status updates.”

The findings, according to Hancock, could extend into the real world.

“Online messages influence our experience of emotions, which may affect a variety of offline behaviors,” Hancock said.

The study has been called “psychological manipulation” by some.

“The unwitting participants in the Facebook study were told (seemingly by their friends) for a week either that the world was a dark and cheerless place or that it was a saccharine paradise,” wrote University of Maryland law professor James Grimmelmann in a blog post. “That’s psychological manipulation, even when it’s carried out automatically.”

University of Texas psychologist Tal Yarkoni disagreed.

“It’s not clear what the notion that Facebook users’ experience is being ‘manipulated’ really even means, because the Facebook news feed is, and has always been, a completely contrived environment,” he wrote in a blog post. “I hope that people who are concerned about Facebook ‘manipulating’ user experience in support of research realize that Facebook is constantly manipulating its users’ experience. In fact, by definition, every single change Facebook makes to the site alters the user experience, since there simply isn’t any experience to be had on Facebook that isn’t entirely constructed by Facebook.”

Patricia Wallace, senior director of CTYOnline and Information Technology at Johns Hopkins University, has a different opinion.

“I don’t think people deeply involved consider it a contrived environment,” she said. “They keep in touch with family and friends. Think of unfriending. It’s not trivial.”

The study also raised ethical questions, which, according to The Atlantic, remain unresolved.

Wallace said she thought the study was “unethical.”

“The process for getting [Institutional Review Board] approval is very onerous and goes deeply into what kind of permissions you’re giving people and the risks and benefits of the study,” she told VOA. “The sharing of that information was not done with those people. They never had a chance to agree. Some of them may have been minors.”

She added that people generally understand that companies conduct studies like this, but that in most cases they do not publish the results.

Wallace suggested the study may have been pushback against a 2013 study that found Facebook use made people depressed.

“People should be up in arms,” she said. “A good thing that’s going to come out of this is that it caught people’s attention. A lot of people are spending hours with this technology, and they don’t know how it’s influencing them.”
