Facebook Experiments & Why We Shouldn’t Be Surprised

21st August 2014

The bods at Facebook were recently berated in the news for ‘experimenting’ on us like a troop of rhesus monkeys. The media ‘shock horror’ reaction caused quite a few ripples, with the aeonian use of words like ‘MASSIVE SCALE MANIPULATION’, ‘PSYCHOLOGICAL EXPERIMENTS’, and ‘ATTEMPT TO CONTROL YOUR EMOTIONS’. Suddenly even ‘respectable’ newspapers and sites were using these dramatic phrases like The Sun on steroids. Was this unexpected, unforeseen? If so, are we naive or stupid?


What Did They Do?

Here’s the abstract from the experiment itself:

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.”

Towards the end of June it was reported that Facebook had engaged in a large-scale test in which it controlled (as if it doesn’t anyway) the emotional content of the news feeds of nearly 700,000 users – without consent. That’s the important bit.

News feeds were filtered to show less ‘positive emotional content’ or ‘negative emotional content’ to study the behaviour of users exposed to these emotionally crafted news feeds. Personally I find this sort of thing fascinating, and I was pleased to see the results of such a large-scale experiment (don’t flagellate me just yet, read the rest and make your mind up after).

The whole thing sounds way creepier than it actually is.
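In practice, the filtering described above boils down to something fairly mundane. Here is a toy sketch of the idea in Python – not Facebook’s actual code; the word lists, names and probabilities are invented for illustration, though the published study does describe classifying posts by the presence of emotional words (via the LIWC tool) and withholding a proportion of them from the News Feed:

import random

# Toy word lists – the real study reportedly used the much larger LIWC
# dictionaries; these stand-ins are purely illustrative.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def post_sentiment(text):
    """Crudely label a post by the presence of emotional words."""
    words = set(text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, target="positive", omit_probability=0.3, seed=42):
    """Return a feed with a proportion of posts of the targeted emotion withheld."""
    rng = random.Random(seed)
    kept = []
    for post in posts:
        if post_sentiment(post) == target and rng.random() < omit_probability:
            continue  # withhold this post from the user's news feed
        kept.append(post)
    return kept

feed = [
    "Had a wonderful day at the beach!",
    "Traffic was awful this morning.",
    "Just made a cup of tea.",
]
print(filter_feed(feed, target="positive"))

Scale that up to a few hundred thousand users and you have the experiment – and the headlines.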

Why Did They Do It?

Back in 2012 Facebook worked with academics from the University of California and from Cornell on this experiment. The universities were involved to help ensure ethically correct and viable results were gathered, and the study was published online for all to see, not secreted away in some vault hidden from the public eye as some would have you believe. You can read it here if you like.

Essentially, they wanted to observe the practical nature of this ‘emotional contagion’ and the effect it had on Facebook users, who have no access to non-verbal cues while interacting. It has been drummed into us for years that anything from 55% to 89% of all communication is non-verbal, and it makes me shudder when I hear it. The world is different now, and we are adapting – not very well, but we are.

Texts, emails and messages get misread all the time, causing panic, paranoia and confusion among the masses because someone didn’t punctuate as they meant to (see Eats, Shoots & Leaves) or forgot to add a kiss at the end of a message. All of this communication is created with no access to the non-verbal cues on which we rely so heavily during face-to-face interactions. Even on the telephone there is inflection and tonality to give us verbal cues where physical behaviour is absent.

But on a screen we have very few cues at all, hence the invention of emoticons, overuse of question marks and exclamation marks, gifs and tons of other annoying-but-sometimes-useful tools to assist in relaying your intended tone.


Facebook wanted to see how people reacted in this environment to positive and negative emotions, and also to see if an abundance of positivity left users feeling detached, left out, and therefore more negative. Adam Kramer, who co-authored the report, was also interested to see if an abundance of negativity meant users visited Facebook less often.

What Were The Results?

I studied Psychology, and a girl called Katie in my class once noted that she had intentionally let a chap out in front of her on the road that morning to see the impact it had on his behaviour. She observed that he then let a car out in front of him shortly after, and that that car did the same. This could be a physical example of emotional contagion. Either that or they were all just being very ‘British’ and overly polite that day.

Unsurprisingly, users were more likely to imitate the emotions they had been exposed to in the same way as Katie’s car drivers. Those who saw more positive items in their news feed were more likely to post positive content themselves, and those who saw more negative results were more likely to post negatively.

In a nice, concise summing-up, the BBC quoted psychology professor Katherine Sledge Moore as saying: “The results are not even that alarming or exciting.”

I’m pretty sure this is still going on, as a number of friends and I have noticed a trend for particularly dismal news feeds on occasion recently; even to the point where people have commented on it on Facebook itself…

The Backlash & Subsequent Apology

Soooooo… The media jumped on this, salivating and gleefully rubbing together their blood- and dirt-stained hands. An abundance of capital letters, quotes from pseudo-experts and scare-mongering began. Talk about a huge audience – this story was ripe for the picking.

Facebook were, of course, forced to apologise. You can read Sheryl Sandberg’s apology in TIME, or Adam Kramer’s apology on Facebook itself.

Some of my favourite media quotes are:

In a series of Twitter posts, Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama’s online campaign for the presidency in 2008, said: “The Facebook ‘transmission of anger’ experiment is terrifying.” – The Guardian

Labour MP Jim Sheridan, a member of the Commons media select committee, has called for an investigation into the matter… “This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people” – The Guardian

…and my favourite…

“Get off the grid if you think this stuff is truly creepy. Delete your Facebook account.” – Roman Kniahynyckyj

 Who actually deleted their Facebook account after this experiment was revealed?

The Point

As is often the case, there is no smoke without fire. While to a point I tease and become exasperated with this subject, there is a fundamental argument that users did not know they were going to be part of this experiment, and feel that their privacy has been abused and that they have been manipulated.

They have a point. Or do they?

Facebook is free. Their terms and conditions are there for all to see. These tests are carried out to create a better user experience… or are they just there to help greedy marketers creep around in the background and better manipulate their unwitting target audience?

It’s a quandary all right.

What right do they have? What rights do we have?

[Image: Two ladies finally read the small print after seeing the story in the Daily Mail.]

Why I’m Not Upset About Facebook Experiments

Facebook is a business. When we signed up, as 99% of us did, without reading the Terms & Conditions because they were long and there was a LOT of small print, we sold our souls to the Facebook Marketing Overlord.

When we merrily jotted down every last piece of personal and private information and clicked ‘Next’ to confirm each one, did we give a single thought as to how it would be used?

We signed up to chat with our friends, share photos, and generally be entertained – but this is not a charity. WHY did we think Facebook was created? What were they going to do with all that information?

Facebook is a business, nothing more, nothing less. We get what we want for free, and so does Facebook: our information, which they use and capitalise upon, locking us in through habit and socialisation as they go.

Mark Zuckerberg admitted years ago that he referred to the first users of Facebook as ‘dumb ****s’ for trusting him with personal information. I do not believe it is quite that simple for him or us these days, but fundamentally it shows that the seeds of privacy issues were there from the start. But when we signed up we allowed this, we agreed to it. And for all the bleating in the world, I will always come back to the point of ‘personal responsibility’. You are responsible for your own choices, no one else. If you are concerned, read the Terms & Conditions before you sign up. If you genuinely feel this was an invasion of your privacy or just don’t like it, leave Facebook.

But everyone else is experimenting on you too, so who cares, or even noticed until they were told?

Who Else is Experimenting on Me?

OKCupid came straight out when the Facebook ‘scandal’ broke and basically said ‘yeah, what of it?’, even publishing their own post titled ‘We Experiment On Human Beings!’ It opens with this paragraph from Christian Rudder:

“I’m the first to admit it: we might be popular, we might create a lot of great relationships, we might blah blah blah. But OkCupid doesn’t really know what it’s doing. Neither does any other website. It’s not like people have been building these things for very long, or you can go look up a blueprint or something. Most ideas are bad. Even good ideas could be better. Experiments are how you sort all this out…”

In 2007 Marks & Spencer were accused of using ‘slimming mirrors’ in their changing rooms, which were said to be ‘misleading’. M&S denied the allegations, but nevertheless, thanks to Robert Kilroy-Silk (remember him?), the case made it all the way to the European Parliament. I mean, PLEASE.

We do it all day long, us digital marketing types, except we call it A/B testing and other such things. We create new PPC ads, replace content, change colour schemes. We use Analytics to see who is using what device, where they are, how old they are, what gender they are, what else they might be interested in – even, until recently, what search terms they used – and no one bats an eyelid.
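For the curious, here is a minimal sketch of what one of those everyday A/B tests boils down to – the visitor and conversion numbers are made up, and the calculation is just the standard two-proportion z-test used to check whether variant B’s uplift is bigger than chance:

from math import sqrt
from statistics import NormalDist

# Hypothetical results: variant A is the existing landing page,
# variant B has a new headline.
visitors_a, conversions_a = 5000, 240
visitors_b, conversions_b = 5000, 285

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Two-proportion z-test: is the difference bigger than chance would explain?
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
standard_error = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / standard_error
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")

No consent forms, no headlines – just a marketer quietly deciding which version of a page you will see next week.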

Give a Daily Mail journalist a full understanding of Google Analytics and they’d probably have you thinking that 1984 was due within 5 years.

How can things improve if they are not tested? How can the results be reliable if the participants are aware of the experiment itself?


What’s Next?

It is unlikely that companies are going to stop ‘experimenting’ on us anytime soon, if ever.

Clearly Facebook are far from perfecting their algorithm. I rarely see anything in my news feed from my best mates, who I message often, but an abundance of posts about my ex-boyfriend, who I very rarely speak to on Facebook these days. As I was ‘in’, then ‘out’, of a relationship with him on Facebook itself, I kind of expected that after a few months of minimal contact between us they would realise that I didn’t want to know every time he posted something, but maybe I would like to see what my besties are up to today. No? Well, OK then.

Without experimenting, how will the bods learn which way to tweak the algorithm? Without studying our behaviour, how will they improve relevance?

Equally, without dissecting our behaviour, how will they best sell advertising space and know where to put it to generate bazillions of bucks each year?

Final Thought:

“We noticed recently that people didn’t like it when Facebook ‘experimented’ with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”
– Christian Rudder, OKCupid


Written By
Laura is a Digital Marketing Consultant with Aira Digital & Advance Promotions. With search experience in a large number of industries both in-house and agency side, Laura has a strong interest in conversion optimisation and web psychology.