In case you missed it, on June 17 the Proceedings of the National Academy of Sciences (PNAS) published the results of a January 2012 experiment in which Facebook Data Scientist Adam Kramer, along with Jamie Guillory of the University of California, San Francisco, and Jeff Hancock of Cornell University, manipulated Facebook’s news feed. Some users were shown more emotionally positive content and others more emotionally negative content, to see whether users’ own posts shifted to match. Kramer, in a Facebook post, explained the research, saying, “we care about the emotional impact of Facebook and the people that use our product.”
Facebook Founder and CEO Mark Zuckerberg has so far been silent on the matter, but today the Wall Street Journal reported an apology from COO Sheryl Sandberg.
The apology is a bit “I’m sorry if I offended you,” but if my own Facebook feed is any indicator, the social media giant will be OK. At any given moment you can read the results of my friends’ latest BuzzFeed quiz on their spirit animal or where they should go in a time machine. (Guess what? BuzzFeed is using your data too.)
I don’t know if it’s indicative of my generation–I’m firmly in the middle ground between Gen X and Gen Y–but I typically don’t get bent out of shape about how my data is used. Data is a part of my daily life, and I know the old Economics 101 credo–“There’s no such thing as a free lunch”. Facebook has to benefit from its user base. And data is really, really valuable.
But Facebook wasn’t transparent about how the data was used.
In the paper, the authors claim to have obtained informed consent from users. But Susan Fiske, a professor of psychology at Princeton University, told The Atlantic: “People are supposed to be, under most circumstances, told that they’re going to be participants in research and then agree to it and have the option not to agree to it without penalty.”
Facebook doesn’t have to follow the same rules as academia, but at some point in this little exercise, I suspect Facebook knew it was in a grey area. Four months after the research was completed, Facebook updated its terms of service to include research (a Facebook spokesman contends that the previous terms of service implied research studies to improve services). Is anyone going to sue Facebook over this? Probably not, but the company sure wasn’t transparent about what it was up to. Fiske said:
I had not seen before, personally, something in which the researchers had the cooperation of Facebook to manipulate people… Who knows what other research they’re doing.
Kramer admits that, considering the fervor that resulted from studying the emotional reactions of just a sliver of Facebook’s user base, “the research benefits of the paper may not have justified all of this anxiety.” A British regulator is looking into potential violations of the country’s data protection laws, which could result in a fine. And the reaction from the public and the media has been largely negative, with many saying that the company’s manipulation of users’ news feeds was, for lack of a better word, creepy.
What Facebook’s mishandling of the situation underscores is consumers’ shifting comfort level with how companies use their data. As more companies look to put data to work, you have to understand where to draw the line. Your reputation might be at stake.