Facebook admits it screwed up, but its proposed research guidelines are meaningless
Facebook is no stranger to controversy, nor is the social network unfamiliar with upsetting its users. Zuckerberg's baby has been hitting the headlines for all the wrong reasons lately, and it's not all that long since users vented their fury after it was revealed that their newsfeeds had been manipulated in the name of research. Now the social network says it was "unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism", and it is implementing new user research guidelines.
"There are things we should have done differently" may seem like something of a half-hearted admission that mistakes were made, but it's the second semi-apology from Facebook this week. Research into how people use the social network will still continue, but Facebook now says "we want to do it in the most responsible way." So what does this actually mean?
On the face of it, very little. Although Facebook's Chief Technology Officer, Mike Schroepfer, says that researchers will be given clearer guidelines, details are very thin on the ground. There is a promise that if research involves looking at "deeply personal" content, or studies particular groups of people, there will be "an enhanced review process before research can begin". There is absolutely no explanation of what this means, no details of what review process -- if any -- has been used up until now, and no word on who will be responsible for conducting the review. A "panel" has been set up to oversee things, but we don't know whether it is independent, whether its members were elected to their posts, and so on.
Facebook also promises that its six-week boot camp program will include education about research practices, but without details this is hollow and meaningless. Research will be published online, but it is not made clear whether users chosen to take part in research will be notified, whether they are able to opt out, or whether individuals can request details of the information that has been gathered about them -- this in spite of a general trend for web-based firms to become more transparent.
Schroepfer's post ends with the words: "we want to do this research in a way that honors the trust you put in us by using Facebook every day. We will continue to learn and improve as we work toward this goal." It seems there is still a lot to learn.
Photo credit: 1000 Words / Shutterstock