Yet again, Facebook gives us its own reasons why we should not be using it. This time, we discover it has been deliberately manipulating users' emotions to conduct an unethical social experiment.
They tweaked the algorithm by which Facebook sweeps posts into members’ news feeds, using a program to analyze whether any given textual snippet contained positive or negative words. Some people were fed primarily neutral to happy information from their friends; others, primarily neutral to sad. Then everyone’s subsequent posts were evaluated for affective meanings …
Facebook’s methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. “If you are exposing people to something that causes changes in psychological status, that’s experimentation,” says James Grimmelmann, a professor of technology and the law at the University of Maryland. “This is the kind of thing that would require informed consent.”
The Slate article concludes:
Over the course of the study, it appears, the social network made some of us happier or sadder than we would otherwise have been. Now it’s made all of us more mistrustful.
Of course, just thinking about all those millions of people being sucked into Facebook’s maw makes me sadder every day.
Previous reasons not to use Facebook.