Emotional Backlash Unlikely to Unsettle Facebook
It's likely Facebook will weather the severe criticism over its manipulation of users' emotions, but there are those who believe its research crossed the line. "This isn't collecting data and employing it for marketing purposes," said John Carroll, a professor at Boston University. "This was more like a lab experiment. It goes beyond the standard digital vacuuming of people's data."
Jul 2, 2014 11:47 AM PT
It's unlikely that Facebook's psych experiment that turned some 700,000 of its users into involuntary lab rats will hurt its brand or advertising revenue.
However, Facebook's research on the emotional impact of content in the News Feeds of its members has unleashed a torrent of criticism.
Facebook has faced this kind of outrage before, though, and the results have almost always been the same.
"All these things blow up for a day or two, and they blow over amazingly quickly, and everyone goes back to normal afterwards," said Jan Dawson, chief analyst with Jackdaw Research.
"If you look at Google trends for Facebook and privacy as a trend, you'll see these massive spikes from time to time, but then it goes back to a baseline level. It hardly moves the needle at all," he told the E-Commerce Times.
"At a macro level, I think these things are temporary bad news for Facebook, but ultimately don't do it any damage in the long term," Dawson said.
Researchers Are Sorry
The uproar over Facebook's research erupted this week after an article about it appeared in the June issue of the Proceedings of the National Academy of Sciences, a prominent scientific journal. The article describes a research project conducted two and a half years ago in which the News Feeds of nearly 700,000 Facebook users were manipulated without their knowledge. The goal was to determine how positive and negative content in their feeds influenced their emotional states.
"When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred," wrote the researchers from Facebook, the University of California at San Francisco and Cornell University.
Following the outpouring of outrage over the PNAS piece, the researchers professed innocent intentions.
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone," researcher Adam D. I. Kramer wrote on his Facebook page.
"I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," he added. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
What's more, Facebook has improved its internal review practices on research since the data for the paper was gathered, Kramer said.
Facebook declined to comment for this story.
Digital Cover Charge Increased
Much of the umbrage over the Facebook research has come from academic circles, where ethical standards are higher than those used in marketing research. For example, any research receiving federal funding must abide by the "Common Rule," which requires that research subjects give informed consent before they're included in a study. They also must be informed of any reasonably foreseeable risks or discomforts to them.
Facebook takes a much looser approach to its research. Its data use policy says only that it might use its members' data for "internal operations, including troubleshooting, data analysis, testing, research, and service improvement." That language, by the way, was added to the data use policy after the data for the emotional impact study was gathered.
As a market research project, the research is a legitimate exercise, contended Larry Chiagouris, a marketing professor at Pace University.
"The News Feed is an important product offering they provide people. What responsible marketer doesn't do experiments to find out what people like about their products? Most do," he told the E-Commerce Times.
"This is a media-created flap," Chiagouris added. "It's going to blow over."
However, there are those who believe Facebook's research went beyond mere market analysis.
"This isn't collecting data and employing it for marketing purposes," said John Carroll, a mass communications professor at Boston University.
"This was more like a lab experiment. It goes beyond the standard digital vacuuming of people's data," he told the E-Commerce Times.
"This was not a test to improve service. This was a test to heighten the attachment of Facebook users to the site," Carroll pointed out.
"The cover charge for the digital world is that your data is going to be scooped up and used in some way. Most people understand that," he continued. However, "this goes beyond that, and it could result in less sharing on Facebook, which means it will be getting less and less value from its users."
For now, it doesn't seem that Facebook users share the indignation about the research that's emanating from some quarters.
"There hasn't been any massive user outcry," Greg Sterling, an analyst with Internet2Go, told the E-Commerce Times. "People in the press and the privacy and social science worlds are talking about it, but ordinary users aren't very upset about it."
Facebook members who commented to the E-Commerce Times about the research expressed indignation, but didn't seem likely to abandon the network.
"I don't think you can expect any privacy when you join and use social media. That doesn't mean I wouldn't like some, but you can't really get angry about it," said Marylu Anterni Medeiros, of Seekonk, Mass.
"However, I don't expect them to manipulate the posts and stories that I see, even if it is for a study," she added.
Rebecca Claeys, of Arena, Wisc., also was upset by the study. "Does it annoy the crap out of me?" she asked. "Yes. Is it unethical as hell? Absolutely. Will I change my Facebook usage based on it? Probably not."