Facebook psychology studies and privacy

In the last 24 hours, there have been a lot of stories about Facebook running a psychology experiment on its users. Reactions have ranged from arguments that it was bad science, to claims that it breached ethical guidelines, to suggestions that it may even have been illegal (though its legality is at least disputed). In general, many people seem to believe that Facebook “totally screwed with a bunch of people in the name of science.” Admittedly, not everyone agrees, but there is no denying that the study has caused much genuine outrage.

So what exactly happened? In short, Facebook picked 689,000 of its users and manipulated their News Feeds during the week of January 11–18, 2012. The manipulation involved reducing the amount of positive or negative content (classified by keywords) displayed in a user’s News Feed. The study found that “emotional contagion” resulted from the manipulation; that is, users who saw less positive content in their News Feeds expressed more negativity in their status updates, and vice versa for users who saw less negative content. If you’re interested in the details, the results of the study are published in the journal PNAS under the title “Experimental evidence of massive-scale emotional contagion through social networks.”
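To make the mechanism concrete, the keyword-based filtering can be sketched roughly like this. The word lists, function names, and omission probability below are purely illustrative; the actual study used the LIWC word-counting software to classify posts as positive or negative, and its real omission rates varied by experimental condition.

```python
import random

# Illustrative word lists only -- the real study used LIWC, a much
# larger dictionary-based text-analysis tool, not tiny sets like these.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def has_words(post, words):
    """True if any whitespace-separated token of the post is in `words`."""
    return any(token in words for token in post.lower().split())

def filter_feed(posts, suppress, omit_prob, rng=random.random):
    """Return a feed with each post containing a suppressed-category
    keyword omitted with probability `omit_prob`; other posts pass
    through unchanged."""
    kept = []
    for post in posts:
        if has_words(post, suppress) and rng() < omit_prob:
            continue  # this post is withheld from the user's feed
        kept.append(post)
    return kept

feed = ["I love this!", "What an awful day", "Meeting at noon"]
# A "reduced negativity" condition; omit_prob=1.0 makes it deterministic.
print(filter_feed(feed, NEGATIVE, omit_prob=1.0))
# → ['I love this!', 'Meeting at noon']
```

The point of the sketch is how little machinery such a manipulation requires: a word list, a coin flip per post, and control of the feed.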

About 23 hours ago, Adam D. I. Kramer, one of the three coauthors of the study, posted an apology on Facebook. The apology, however, seems limited to “the way the paper described the research and any anxiety it caused”; it does not admit any kind of wrongdoing. Curiously, the paper published in PNAS describes the experiment in a completely detached, objective tone, as any scientific paper might. The apology thus sidesteps the real issue: it is one thing to express regret for any “anxiety” caused by the research, but quite another to address whether it was legal, or even ethical, to have run the experiment in the first place.

I’ll leave the legal questions to the lawyers. The question we should be asking ourselves is how we personally feel about Facebook running experiments on us, whether or not the experiment was technically allowed by some arcane passage buried deep inside its Data Use Policy. If we are among the many who feel genuine outrage, then the next question is what we personally can do about it.

From the perspective of privacy, the ideal solution is of course to simply quit Facebook. Unfortunately, privacy is not the only consideration. With 1.15 billion active users, Facebook is the largest social network in the world, and a social network’s value to each user grows with the number of other users on it. After all, the whole point of a “social” networking site is to connect with others; if there’s nobody on a site, where is the social aspect?

Fortunately, quitting is not the only option, so let’s first look at a short-term solution. At least at the moment, Facebook only has as much information as you give it. Some pieces of information, such as your real name and birth date, are required in order to sign up for Facebook in the first place. Most other information, however, is completely optional: your geographical location, what you “Like,” what you search for on Facebook, the private messages you send, and your status updates (including the ones used in the study). If you care about your privacy but must use Facebook to keep in touch with certain people, then minimize the information you provide to Facebook to only what is absolutely required.

Many have suggested learning to use Facebook’s privacy settings. This is obviously one way to restrict the amount of data you share, but remember that it is ultimately Facebook who built those privacy controls, and as long as you are a user on the site, you only have as much “control” as Facebook chooses to give you. Remember, too, where Facebook’s incentives lie: in the fourth quarter of 2013, advertising made up more than 90% of its total revenue.

So how do you make money from online advertising? It probably comes as no surprise that consumers respond far better to targeted ads than to completely random ones; targeted ads appeal directly to their interests. Facebook, in control of so much user information, is thus perfectly positioned to sell advertising tailored to its users. Facebook may profess that it cares deeply about user privacy. This may even be true, though given its track record, some may be inclined to disagree.

Therefore, if you wish to have more control over your privacy, don’t rely on Facebook’s privacy controls, which are subject to change. Instead, give Facebook so little information in the first place that it hardly matters what your privacy settings are.

As I mentioned, restricting information (or removing it, if you have already put a lot on Facebook) is only a short-term solution. As the largest social network in the world, Facebook is in a dominant position on the Web. With this dominance comes power: Facebook knows that the more dependent users are on its site, the more freedom it has to do whatever it wants. Whatever privacy you feel you have on Facebook at the moment, there is no guarantee that you will still have that level of privacy in the future.

The only surefire way to be rid of Facebook’s privacy problems is to quit Facebook. Even in an age in which many view Facebooking as synonymous with socializing, there are alternatives. Diaspora, for example, is a decentralized network that is not controlled by any single company. Instead, users own and control their own data on servers called “pods,” and they thus have far more control over their data than they would on Facebook. You are allowed, for example, to use a pseudonym instead of your real name, something Facebook does not currently allow. Social networks like Diaspora may well represent the proper balance between privacy and the human need to socialize. Caveat: I have not tried Diaspora myself.

Although Facebook is not going anywhere anytime soon, it is important to realize that its power derives from the apathy of many users towards privacy. The moment we as a society begin to care about privacy, the power will once again return to where it rightfully belongs: in our hands.
