Was the Department of Defense Behind Facebook’s Controversial Manipulation Study?

I’ve spent pretty much all day reading as much as possible about the extremely controversial Facebook “emotional contagion” study, in which the company intentionally altered its news feed algorithm to see if it could manipulate its users’ emotions. In case you weren’t aware, Facebook is always altering your news feed on the assumption that there’s no way it could show you all of your “friends’” pointless, self-absorbed, dull updates (there’s just too much garbage).

As such, Facebook filters your news feed all the time, something advertisers must find particularly convenient. In any event, the alteration in question occurred during one week in January 2012, when the company filled some people’s feeds with more positive posts, while others were fed more negative ones.

Once the data was compiled, academics from the University of California, San Francisco and Cornell University were brought in to analyze the results. Their findings were then published in the prestigious Proceedings of the National Academy of Sciences. They found that:

For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.

You probably know most of this already, but here is where it starts to get really strange. Initially, the press release from Cornell highlighting the study said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.” Once people started asking questions about this, Cornell claimed it had made a mistake, and that there was no outside funding. Jay Rosen, Journalism Professor at NYU, seems to find this highly questionable. He wrote on his Facebook page that:

Strange little turn in the story of the Facebook “emotional contagion” study. Last month’s press release from Cornell highlighting the study had said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.”

Why would the military be interested? I wanted to know. So I asked Adam D.I. Kramer, the Facebook researcher, that question on his Facebook page, where he has posted what he called a public explanation. (He didn’t reply to my or anyone else’s questions.) See: https://www.facebook.com/akramer/posts/10152987150867796

Now it turns out Cornell was wrong! Or it says it was wrong. The press release now reads: “Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.”

Why do I call this strange? Any time my work has been featured in an NYU press release, the PR officers involved show me drafts and coordinate closely with me, for the simple reason that they don’t want to mischaracterize scholarly work. So now we have to believe that Cornell’s Professor of Communication and Information Science, Jeffrey Hancock, wasn’t shown or didn’t read the press release in which he is quoted about the study’s results (weird) or he did read it but somehow failed to notice that it said his study was funded by the Army when it actually wasn’t (weirder).

I think I would notice if my university was falsely telling the world that my research was partially funded by the Pentagon… but, hey, maybe there’s an innocent and boring explanation that I am overlooking.

It gets even more interesting from here. Jeffrey Hancock, the Professor of Communication and Information Science whom Mr. Rosen mentions above, has a history of working with the U.S. military, specifically through the Minerva Initiative. In case you forgot what that is, the Guardian reported on it earlier this year. It explained:

A US Department of Defense (DoD) research program is funding universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world, under the supervision of various US military agencies. The multi-million dollar program is designed to develop immediate and long-term “warfighter-relevant insights” for senior officials and decision makers in “the defense policy community,” and to inform policy implemented by “combatant commands.”

Launched in 2008 – the year of the global banking crisis – the DoD ‘Minerva Research Initiative’ partners with universities “to improve DoD’s basic understanding of the social, cultural, behavioral, and political forces that shape regions of the world of strategic importance to the US.”

SCG News has written one of the best articles I have seen yet on the links between the Facebook study and the Department of Defense. It notes:

In the official credits for the study conducted by Facebook you’ll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva initiative website you’ll find that Jeffery Hancock received funding from the Department of Defense for a study called “Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes”. If you go to the project site for that study you’ll find a visualization program that models the spread of beliefs and disease.

Cornell University is currently being funded for another DoD study right now called “Cornell: Tracking Critical-Mass Outbreaks in Social Contagions” (you’ll find the description for this project on the Minerva Initiative’s funding page).

So I went ahead and looked at the study mentioned above, and sure enough I found this:

[Screenshot: Jeff Hancock listed among the researchers on the Minerva-funded project]

There he is, Jeff Hancock, the same guy who analyzed the Facebook data for Cornell, which initially claimed funding from the Pentagon and then denied it.

I call bullshit. Stinking bullshit.

So it seems that Facebook and the U.S. military may well be working together to study civil unrest and to develop ways of manipulating the masses into apathy, or into misguided feelings of contentment, in the face of continued banker and oligarch theft. That possibility is extremely disturbing, but even setting it aside, this whole affair is highly troubling.

For one thing, although governments and universities must take certain precautions when conducting such “research,” private companies like Facebook apparently do not. All they have to do is get people to click “I accept” on a terms of service agreement they never read, which allows companies to do almost anything they want to you, your data and your emotions. What we basically need to do as a society is update our laws. For starters, if a private corporation is going to, let’s say, totally violate your most basic civil liberties as defined under the Bill of Rights, a simple terms of service agreement should not be sufficient. For more invasive violations of such rights, perhaps a one-page, simple-to-read document explaining clearly which of your basic civil liberties you are giving away should be mandatory.

For example, had Facebook not partnered at the university level to analyze this data, we wouldn’t even know this happened at all. So what sort of invasive, mind-fucking behavior do you think all these large corporations with access to your personal data are up to? Every. Single. Day.

The Faculty Lounge blog put it perfectly when it stated:

Academic researchers’ status as academics already makes it more burdensome for them to engage in exactly the same kinds of studies that corporations like Facebook can engage in at will. If, on top of that, IRBs didn’t recognize our society’s shifting expectations of privacy (and manipulation) and incorporate those evolving expectations into their minimal risk analysis, that would make academic research still harder, and would only serve to help ensure that those who are most likely to study the effects of a manipulative practice and share those results with the rest of us have reduced incentives to do so. Would we have ever known the extent to which Facebook manipulates its News Feed algorithms had Facebook not collaborated with academics incentivized to publish their findings?

We can certainly have a conversation about the appropriateness of Facebook-like manipulations, data mining, and other 21st-century practices. But so long as we allow private entities freely to engage in these practices, we ought not unduly restrain academics trying to determine their effects. Recall those fear appeals I mentioned above. As one social psychology doctoral candidate noted on Twitter, IRBs make it impossible to study the effects of appeals that carry the same intensity of fear as real-world appeals to which people are exposed routinely, and on a mass scale, with unknown consequences. That doesn’t make a lot of sense. What corporations can do at will to serve their bottom line, and non-profits can do to serve their cause, we shouldn’t make (even) harder—or impossible—for those seeking to produce generalizable knowledge to do.

If you read Liberty Blitzkrieg, you know I strongly dislike Facebook as a company. However, this is much bigger than just one experiment by Facebook with what appears to be military ties. What this is really about is the frightening reality that these sorts of things are happening every single day, and we have no idea it’s happening. We need to draw the lines as far as to what extent we as a society wish to be data-mined and experimented on by corporations with access to all of our private data. Until we do this, we will continue to be violated and manipulated at will.

For some of my Facebook-critical articles from earlier this year, read:

The Chief Operating Officer of Facebook Wants to Ban the Word “Bossy”

How UK Prime Minister David Cameron Paid Thousands of Dollars for Facebook “Likes”

How Facebook Exploits Underage Girls in its Quest for Ad Revenue

This Man’s $600,000 Facebook Disaster is a Warning For All Small Businesses

*The awesome thumbnail photo at the top of this post is by Polish illustrator Pawel Kuczynski. Check out his website here.

In Liberty,
Michael Krieger

  1. Universities are total BS. They lie about all kinds of things. They are too embedded into the system to NOT lie. The world is how you see it. How you see it for 99% of us is how they decide you should see and think about it. Think about that if you can.

  2. What Zuckerberg did with his clusterskullfuck was no different than Dr. Mengele, but today’s Nuremberg war criminals are media pop stars now.

  3. you are giving facebook too much credit. the collaboration between In-Q-Tel (the CIA’s venture capital arm) and the rest of silicon valley is well documented.

    now, consider that facebook is a quasi-government operation, and then ask yourself WHY facebook bought Oculus Rift for 400 million dollars when it never sold a single dime of anything profitable.

    because facebook, with its giant bags of money, is a convenient cover for massively funding r&d into weaponizing virtual reality.

    watch this video, https://www.youtube.com/watch?v=bcFQ5sQIwR

    the best way to develop these weapons, perhaps the only way, is with a massive, scalable user audience that provides generations of feedback for testing and then redevelopment. you need this kind of ‘capitalist’ model for entertainment-based gaming to develop the perfect weapons platform. the military itself is simply not big enough to do this on its own in a ‘defense’ setting. you need a self-selecting massive audience of gamers to provide the necessary beta platform users, and a company to respond to their needs, continuously upgrading the system until, in 10 or 20 years, you finally have something, a control system, uniquely capable of uniting drone operators with their drones… almost, but not quite, like the interface control systems used in the movie AVATAR.

    keep in mind that the ‘entertainment’ sector of gaming is perhaps a test bed for virtualized weaponization systems.

    with oculus rift being a giant test bed. personally, i find it remarkable. science fiction come reality.

  4. The question is, why would anyone want to be on FB? Can’t blame FB if millions have VOLUNTARILY signed up – I can see a FB logo on this website for logging in. Why hand over your most private details to one company? I don’t get it.
