There's this cartoon you probably know. It features a big-headed, bald kid named Charlie and a girl named Lucy. Again and again in this amusing comic, Lucy promises not to move the football she's holding while the Charlie kid runs to kick it. And, again and again, to great bittersweet hilarity, she does, and little Charlie goes flying, landing on his back, chagrined but, we understand, still trusting in Lucy's innate goodness. We love that adorable sap, Charlie Brown.
And, when it comes to Facebook, aren't we all just a dumb pack of Charlie Browns? Again and again Mark Zuckerberg yanks the ball of privacy rules, data gathering or social graph algorithms away from us, promises not to do it next time, and then, whoops, we're on our backs, chagrined but willing to give the good ol' social network one more try.
But, this time, it's a bit different.
This time Facebook conducted a psychology experiment on nearly 700,000 unsuspecting users. The Cornell researchers, who undertook the study with Facebook, wanted to see if emotional contagion could spread in a social network the way it spreads in real life. In other words, can being subjected to positive or negative status updates affect your mood the way being around positive and negative people can? In order to test that, Facebook diddled the newsfeeds of the unsuspecting users to show more positive or negative posts. Then the researchers monitored those naive users' posts to see if they, in turn, reflected the mood of the jiggered newsfeeds. All this without the explicit and informed consent of those 700,000 users.
In other words, to use another Peanuts reference, the doctor was in, but the patients didn't know they were being examined.
Facebook argues that its long-winded terms of service allow for the use of user data for research. But the experimental protocol for this kind of research requires that researchers "obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person." Come on: people don't even read terms of service, and if they did, they'd find no mention of this sort of experiment, one that doesn't merely study data but actively alters feeds. And, if a subject is being duped, it is the researcher's duty to inform the subject as soon as "reasonably possible." In this case, there seems to be no evidence that users were told that they were part of an experiment or that their feeds had been manipulated.
Facebook also argues that the newsfeed is manipulated all the time by a "secret sauce" algorithm. And, besides, what they did was no different than standard A/B testing that websites do all the time. The latter argument is just willful stupidity. If you can't tell a Pepsi Challenge from a psych test, you shouldn't be allowed sharp objects.
And, what did the research uncover? A tiny bit of evidence that emotional contagion could happen on Facebook and that when users got more bland posts in their newsfeed they were disinclined to post much.
So, what's the problem with all this?
To go back to Charlie Brown and Lucy, it's just more evidence that Facebook doesn't give a flying, flubbed football kick about its users. It cynically toys with privacy, always to its advantage, lies about doing it, buries privacy controls and generally acts, always, in the best interests of its real customers, advertisers. We are just so much product.
None of that is new. But here we see Facebook secretly toying with people's emotions (albeit with little effect, which is really beside the point, since the outcome wasn't known in advance). And we see them probing how best to get folks to post more often. Plus, unless we really are trusting bald-headed kids, we have to ask what other experiments Facebook is carrying out, and for whom? Mood control is nothing to take lightly.
The lead researcher on the study, Adam Kramer, now suggests maybe the research wasn't such a good idea after all. Yeah, I think that's about right. Because sometime Charlie Brown's foot is going to bloody Lucy's nose, and nobody will care about the football anymore.
Or, maybe I'm just the bald-headed kid who thinks that this time people really will catch on.
Wayne MacPhail has been a print and online journalist for 25 years, and is a long-time writer for rabble.ca on technology and the Internet.
Photo: Flickr/James Truepenny