Developers should be paying close attention to the unfolding controversy surrounding Facebook and the 2012 study in which it let a group of data scientists try to manipulate the emotions of its users. The fallout now includes reports that British regulators are investigating, and it even prompted Facebook Chief Operating Officer Sheryl Sandberg to apologize – sort of.
“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg told the Wall Street Journal. “And for that communication we apologize. We never meant to upset you.”
It’s a cautionary tale about when and how to give access to your APIs – and for what purpose. In this case, Facebook allowed a group of data scientists in 2012 to alter the News Feeds of more than 600,000 users to see if they could influence their emotional state. According to the researchers, they were successful.
“In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed,” they wrote in a study. “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.”
Translation: If people saw more posts with happy content, they responded by posting positive content themselves. And if they saw more posts with depressing content, they were more likely to post negative content themselves.
Although the study was published in March, it was just discovered and passed around by social media users in the last few weeks. The backlash was swift.
“The study highlights the ease at which web companies can now collect and analyze an unprecedented amount of behavioral data (something we already knew they could do) and then, apparently, use it to affect how we behave,” wrote Derrick Harris at Gigaom. “Not whether we stay on the site longer or click through to another piece of content, but the literal words we choose to type or thoughts we choose to share.”
“The whole experiment feels like a violation,” wrote blogger and data scientist John Foreman. “Facebook emotionally manipulated people. And to add insult to injury, Facebook used user-generated, supposedly perspective-free content (at least free of Facebook’s perspective).”
Foreman says that platforms like Facebook and users of Facebook data (i.e., data scientists and developers) need to go above and beyond what is legal to maintain the confidence of customers.
“What is allowable in data science is only partially governed by TOSs and precedence,” Foreman wrote. “There’s an inherent creepiness to data science, so it’s important that a company always ask itself, ‘What do our users expect from us?’ Not ‘what is legal?’ or ‘can we point to what we’ve already been doing that’s similar?’”
Facebook says it understands and wants to remain a neutral platform, and that it doesn’t intend to make a habit of trying to shape the way people think and feel.
“We take privacy and security at Facebook really seriously because that is something that allows people to share” opinions and emotions, Sandberg told the Journal.
That neutrality, and a sense from users that they have some control, is critical to making the Facebook platform a valuable one for developers. While the issue hurts Facebook for the moment, time will tell whether it makes users more suspicious of applications written by developers. Nobody wants users of their application to worry that it might be secretly messing with their minds.
Image credit: Spencer E. Holtaway