There’s this cartoon you probably know. It features a big-headed, bald kid named Charlie and a girl named Lucy. Again and again in this amusing comic, Lucy promises not to move the football she’s holding while Charlie runs to kick it. And, again and again, to great bittersweet hilarity, she does, and little Charlie goes flying, landing on his back, chagrined but, we understand, still trusting in Lucy’s innate goodness. We love that adorable sap, Charlie Brown.
And, when it comes to Facebook, aren’t we all just a dumb pack of Charlie Browns? Again and again Mark Zuckerberg yanks the ball of privacy rules, data gathering or social graph algorithms away from us, promises not to next time, and then, whoops, we’re on our backs, chagrined but willing to give the good ol’ social network one more try.
But, this time, it’s a bit different.
This time Facebook conducted a psychology experiment on nearly 700,000 unsuspecting users. The academic researchers who undertook the study with Facebook wanted to see if emotional contagion could spread in a social network the way it spreads in real life. In other words, can being subjected to positive or negative status updates affect your mood the way being around positive and negative people can? To test that, Facebook diddled the newsfeeds of the unsuspecting users to show more positive or negative posts. Then the researchers monitored those naive users’ posts to see if they, in turn, reflected the mood of the jiggered newsfeeds. All this without the explicit and informed consent of the 700,000 users.
In other words, to use another Peanuts reference, the doctor was in, but the patients didn’t know they were being examined.
Facebook argues that its long-winded terms of service allow for the use of user data for research. But the experimental protocol for this kind of research requires that researchers “obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person.” Come on. People don’t even read terms of service, and if they did, they’d find no mention of this sort of experiment, which doesn’t merely study data; it alters feeds. And if a subject is being duped, it is the researcher’s duty to inform the subject as soon as “reasonably possible.” In this case, there seems to be no evidence that users were told that they were part of an experiment or that their feeds had been manipulated.
Facebook also argues that the newsfeed is manipulated all the time by a “secret sauce” algorithm. And, besides, what they did was no different than standard A/B testing that websites do all the time. The latter argument is just willful stupidity. If you can't tell a Pepsi Challenge from a psych test, you shouldn't be allowed sharp objects.
And what did the research uncover? A tiny bit of evidence that emotional contagion could happen on Facebook, and that when users got blander posts in their newsfeed they were disinclined to post much themselves.
So, what’s the problem with all this?
To go back to Charlie Brown and Lucy, it’s just more evidence that Facebook doesn’t give a flying, flubbed football kick about its users. It cynically toys with privacy, always to its advantage, lies about doing it, buries privacy controls and generally acts, always, in the best interests of its real customers, advertisers. We’re just so much product.
None of that is new. But here we see Facebook secretly toying with people’s emotions (albeit with little effect, which is really beside the point, since the outcome wasn’t known in advance). And we see them probing how best to get folks to post more often. Plus, unless we really are trusting bald-headed kids, we have to ask what other experiments Facebook is carrying out, and for whom. Mood control is nothing to take lightly.
The lead researcher on the study, Adam Kramer, now suggests maybe the research wasn’t such a good idea after all. Yeah, I think that’s about right. Because sometime Charlie Brown’s foot is going to bloody Lucy’s nose, and nobody will care about the football anymore.
Or, maybe I’m just the bald-headed kid who thinks that this time people really will catch on.