Facebook’s emotion experiment: too far or just a social network norm?
The social network's methodology raises serious ethical questions, and the team may have bent research standards too far, possibly overstepping criteria enshrined in law and human rights declarations.
"If you are exposing people to something that causes changes in psychological status, that’s experimentation.
"This is the kind of thing that would require informed consent."
- James Grimmelmann, professor of technology and the law at the University of Maryland
But given that Facebook has over half a billion users, is it not a foregone conclusion that every change it makes to the news feed - or to any other part of its site - induces a change in millions of people's emotions? Yet nobody seems to complain about this reality, presumably because, put this way, it seems silly to suggest that a company whose business model depends on getting users to use its product more would do anything other than try to manipulate them into using it more.
Haranguing Facebook and other companies for publicly disclosing scientifically interesting results of experiments they are already constantly conducting – experiments directly responsible for many of the positive aspects of the user experience – is not likely to accomplish anything useful. If anything, it will only ensure that all of Facebook's experimental research is done in the dark, where nobody outside the company can ever find out about it.