This article was originally published in the Honolulu Star-Advertiser on July 15, 2014. When news broke in late June about Facebook running unsanctioned psychological experiments on its user population in the name of science, the reaction was overwhelmingly negative.
During one week in January 2012, Facebook manipulated the news feeds of roughly 700,000 users to see how it affected their moods. Facebook removed positive posts from some users' feeds and negative posts from others'. Then its team of data scientists stood back and watched what happened. If there was a week in January 2012 during which your friends seemed abnormally fixated on cute babies and rainbows, you may have been part of the study. Alternatively, if Facebook selected you to receive only negative posts, all the cute baby and rainbow posts would have been stripped away, and you would have read only about house fires, car accidents, crime and other similarly depressing fare.
Facebook wanted to know whether emotions were contagious (they are). The team at Facebook wrote in their resulting paper that "emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks." They also learned that people who received only negative posts tended to post less often on their own walls.
The truth is that Facebook experiments on its user base frequently, probably constantly. You consent to this in the terms of use you agree to when you sign up for the free service (that wasn't always the case, which is part of the public's angst over Facebook's handling of research along these lines). If you use Facebook, and no doubt most other free online services, you can consider yourself in a petri dish, with some unknown number of data scientists studying you and, as we learned in this case, manipulating you "for science."
However egregious Facebook's emotional contagion experiment was, some may be more disturbed by its other experiments. In 2010, Facebook conducted an experiment on more than 61 million of its users to determine whether it could influence voter turnout in a U.S. election. It presented "I Voted" buttons to some of its users and withheld them from others, to determine whether seeing the button would influence friends to vote. While you can applaud Facebook's effort to get more people voting, it treads dangerously close to controlling who votes and who doesn't. If friends of "I Voted" people are more prone to vote, then couldn't Facebook, led by an increasingly politically active Mark Zuckerberg, theoretically choose who gets the buttons and who doesn't, depending on how they might vote on any given political issue? There is evidence Facebook conducted a similar experiment during the 2012 elections, but the paper hasn't been published yet.
Facebook isn't alone here: Other social media sites conduct research on their users or allow third parties (like the government) to conduct their own. In a U.S. Department of Defense program unveiled in 2011, the government spent nearly $20 million analyzing data from Facebook, Twitter, Pinterest and other social media sites, specifically to study (and potentially influence) the online behavior of Occupy Wall Street and Middle East protesters.
So, what can we learn from these revelations? For one thing, nothing is ever free. Social networks are free to use because the companies that run them, often at great expense, are offering their users up for advertising and research such as this. For another, even if you have privacy controls set to hide your content from strangers, you generally can't hide it from the service provider itself. Be mindful of that whenever you use a social media service.
Hawaiian Telcom Information Security Director Beau Monday is a local cybersecurity expert. Reach him at Beau.Monday@hawaiiantel.com.
Copyright (c) Honolulu Star-Advertiser