Back at university, my research partner and I were conducting an experiment in which participants had to listen intently to sounds with small gaps missing and press one key if the sound matched a previously set standard and another if it didn't. It was fun. What was less fun was the sensory deprivation we accidentally inflicted on our classmates and friends. A few reported hallucinations. The room they sat in had recently been painted, and the walls were stark white. With so little visual stimulation, some started to see things, and some felt like they were floating. But we were safe, and our study was allowed to continue. One of the reasons was that we had obtained "informed consent" and participants could choose to leave at any time they wished. Those are a few of the basics you have to cover to make sure your experiments and studies are ethical and legal. Guess which parts Facebook ignored recently? Oh, how about all of them?
During one week in 2012, Facebook manipulated the news feeds of 689,000 users, giving preference to negative stories and comments for half of the unwitting participants and to positive stories and comments for the other half. All the information was still there; Facebook simply manipulated which stories appeared in each news feed. It did this to measure what those 689,000 people subsequently posted on Facebook, in order to investigate emotional contagion on social media. Emotional contagion is a well-established social psychology theory which, in a nutshell, describes how the emotions of other members of your social group(s) will influence your own mood and emotions.
The results of this study were pretty interesting, demonstrating that a statistically significant number of "participants" were susceptible to emotional contagion. I say "participants" because nobody knew they were participating in an experiment at all! Even more interestingly, the UK's Information Commissioner's Office (ICO) has taken umbrage at this breach of research ethics and plans to pull Facebook up on it in the near future. This comes only a short time after Europe's top court decided that Google had to de-index search results on request from Europeans about whom outdated or defamatory information was still available through the search engine. It has taken the law a relatively long time to catch up to the speed at which digital technology moves, but there's nothing like a good old-fashioned manipulation of emotions to get people riled up. What do you think? Was there anything wrong with what Facebook did? Let us know in the comments below!