While Facebook (FB) is known for running seemingly endless data analyses, FB is coming under heavy fire for a recently disclosed one that some say is unethical, if not outright sinister. The researchers concluded:
“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
The study — whose authors included researchers from Cornell, the University of California, San Francisco and Facebook — was recently published in the June edition of the journal Proceedings of the National Academy of Sciences (PNAS).
However, the research has created user unrest, as Facebook failed to obtain consent from — or even notify — the subjects of the study.
FB Mood Study Strikes a Nerve
Facebook is no stranger to privacy flubs. For instance, Beacon — which showed users’ actions on third-party partner sites — caused a huge uproar, forcing CEO Mark Zuckerberg to swiftly shut down the program and apologize in a blog post.
But the mood study appears to be a deeper violation.
Facebook, Twitter (TWTR) and other social media sites were full of posts decrying Facebook’s participation in the study, with many unsurprisingly urging others to delete their Facebook accounts.
And even the journal’s editor, Susan Fiske, expressed concerns about the mood study, saying she was a “little creeped out” by it.
“So, I think it’s an open ethical question,” she said. “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. There’s not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn’t have been done.”
Consent from participants has long been considered vital to the ethics of scientific studies, and was even laid out in the post-World War II Nuremberg Code.
Still, Facebook’s participation in the mood research is almost certainly legal thanks to its expansive terms of service, which give FB great latitude with user data. Moreover, because there was no federal funding for the study, federal informed-consent regulations likely don’t even apply.
Legal cover or not, Facebook’s participation goes against FB’s idealistic mission. Facebook is supposedly here to help “you connect and share with the people in your life” — not to treat users as virtual lab rats.
And it’s more worrisome that the study occurred before Facebook truly became a mobile juggernaut, a shift that has since given FB access to even more data, such as user location. The emergence of wearables should cause additional concern. After all, Apple (AAPL), Google (GOOG) and yes, Facebook (with its Oculus technology) are pushing aggressively to create devices that will measure health information — which means they’ll have access to data such as heart rate, weight, drug use and blood pressure.
Naturally, users have plenty to weigh. But what about FB stockholders?
The brouhaha over the Facebook mood study probably won’t have much impact on the user base, if only because — while FB does have competition — there’s no meaningful, one-for-one alternative.
But trouble could lie down the road in the form of more regulation, which would make it tougher for Facebook to leverage user data to generate more revenue.
Sentiment could well have its say in FB shares in the coming days, but there’s little to worry about unless something emerges that could actually shake Facebook’s fundamentals.
Tom Taulli runs the InvestorPlace blog IPO Playbook. He is also the author of High-Profit IPO Strategies, All About Commodities and All About Short Selling. Follow him on Twitter at @ttaulli. As of this writing, he did not hold a position in any of the aforementioned securities.