Thursday, May 28, 2015

When You Are the Product and The Product Is Research

By Tim Rubbelke 

In mid-2014, researchers from Facebook and Cornell University published a study titled “Experimental evidence of massive-scale emotional contagion through social networks.”  The general idea of the study was that manipulating users’ Facebook feeds to display primarily positive or primarily negative posts caused the subjects’ moods to shift accordingly.[1]  The study involved over 600,000 individuals.

Soon after the study was published, tech websites took notice.[2]  One of the obvious questions was whether the participants had consented to the study and whether they even knew they were part of it.

Facebook pointed to a broad clause in its “Data Use Policy” (DUP) as implying consent (although it seems no one in the study was aware of their participation).  Some sites questioned whether the clause in question was even in the DUP at the start of the experiment.[3]

In light of the criticism, the Proceedings of the National Academy of Sciences (PNAS), the journal that published the research, issued an Expression of Concern about the study.[4]  In it, the editors noted that Cornell’s IRB chose not to weigh in on the research because the data its researchers received was anonymized, and thus the work did not qualify as human subjects research under the regulations.

While Cornell’s decision likely fits the technical definition in the human subjects research regulations, it highlights a very interesting question: when researchers (and institutions) receive data, should they be concerned about how that data was collected?

One hopes that Facebook carried out the experiment in an ethical manner, but without oversight we cannot be sure.  It certainly seems that, at the very least, many participants were subjected to a form of harm: having their moods unknowingly altered to be sadder.  Perhaps more disturbingly, the study gives no indication of what would have happened if a participant’s emotions had swung far beyond the expected range.[5]

The rules and regulations surrounding academic research are designed to hold it to a high ethical standard, especially regarding the treatment of human subjects.  When data is shared between academic institutions, the research remains subject to the oversight of at least one institution.  In the case of data from private sources (such as Facebook), things are substantially murkier.  As the PNAS Expression of Concern noted, privately funded organizations like Facebook do not have to follow the Common Rule.  This is compounded by the fact that, as the saying on the internet goes, “If you’re not paying for it, you are the product.”[6] Still, as one expert interviewed by CNet pointed out, the experiment Facebook carried out went beyond the normal sorts of platform and advertising enhancement most websites engage in.[7] Furthermore, if academic researchers are going to accept data from privately funded entities and wish to maintain high ethical standards, they should be cognizant of how that data was collected.

As Kahn et al. note, it is inadvisable to try to shoehorn a 20th-century research regulation apparatus onto a 21st-century world.[8]  And yet, as we bring research oversight into the modern world, we should be careful not to discard bedrock principles either. After all, public trust in academia is what fosters participation in research.

Tim Rubbelke is a PhD Candidate at the Saint Louis University Albert Gnaegi Center for Health Care Ethics. He contributes regular pieces on research ethics. 



[1] http://www.pnas.org/content/111/24/8788.full.pdf
[3] http://www.cnet.com/news/facebooks-emotion-manipulation-study-faces-added-scrutiny/
[5] Additionally, there is no information about the ages of the subjects.
[6] http://lifehacker.com/5697167/if-youre-not-paying-for-it-youre-the-product
[8] http://www.pnas.org/content/111/38/13677.full
