Facebook Research Dances Around Informed Consent

[Image: Milgram Experiment]

The title of the research paper is certainly scholarly. In “Experimental evidence of massive-scale emotional contagion through social networks,” published in the Proceedings of the National Academy of Sciences, researchers reported the results of a “massive experiment on Facebook.” By exposing some users to more positive posts in their News Feeds than usual, and others to more negative ones, they showed that moods can spread across the network like a disease.

“For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive,” the paper notes. “When negativity was reduced, the opposite pattern occurred.”
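The outcome measure behind that finding is straightforward word counting: the paper classified words in status updates as positive or negative using the LIWC word lists, then compared the percentages across conditions. Here is a minimal sketch of that calculation, with tiny made-up word lists standing in for LIWC:

```python
# Illustration of the paper's outcome measure: the percentage of positive
# and negative words across a user's status updates. The real study used
# the LIWC word lists; these tiny lists are made-up stand-ins.
POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def emotion_word_rates(status_updates):
    """Return (percent positive, percent negative) over all words posted."""
    words = [w.strip(".,!?").lower()
             for update in status_updates
             for w in update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100.0 * pos / len(words), 100.0 * neg / len(words)

# e.g. (20.0, 10.0): 2 of 10 words positive, 1 of 10 negative
print(emotion_word_rates(["What a wonderful, fun day!",
                          "Feeling kind of sad tonight."]))
```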

The mainstream media had fun with the story. “Facebook emotions are contagious!”

But as the story spread online, and notably after this report in The A.V. Club, actual researchers took notice. And many are upset.

The problem is “informed consent,” a fundamental principle of research involving human subjects. While it can get complicated, it basically means researchers must meet three requirements:

  1. Disclosing to potential research subjects information needed to make an informed decision;
  2. Facilitating the understanding of what has been disclosed; and
  3. Promoting the voluntariness of the decision about whether or not to participate in the research.

The study summarily dismisses this critical issue, asserting that all Facebook users agree to be studied simply by using Facebook. It addresses the matter in a single sentence fragment: “…It was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

And indeed, Facebook’s Data Use Policy does mention research: “…In addition to helping people see and find things that you do and share, we may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

But many say that’s not enough.

“This study is in violation of laws regarding Human Subject protocols in research,” writes Gwynne Ash in a comment on The A.V. Club story. Ash, a professor at Texas State University, goes on to say:

“In this study there was no disclosure to participants that they were members of a research study, even though the purpose of the study was to produce negative emotional states, such as depression, through specific manipulation of data provided to participants (i.e., this was not a naturalistic study). The blanket research permission that is part of the Facebook TOS in no way approaches ethical appropriateness for human subject research of this type. There was also no debriefing of study participants. The publication of this study breaches all accepted protocols for the protection of human subjects in experimental research…”

Aimee Giese, someone I’ve followed online since 2009, put it much more succinctly: “there is NO WAY Facebook did not violate human subjects rules.”

Apparently, informed consent rules don’t officially apply to private companies conducting their own research. But while a Facebook employee was the lead researcher, the co-authors were affiliated with institutions of higher education — the University of California, San Francisco and Cornell University — that most certainly do adhere to the requirement.

At the University of Hawaii, the Human Computer Interaction Lab leads a lot of research into social computing. Lab director Scott Robertson, who is also an Associate Professor in the Department of Information and Computer Sciences, shared his initial thoughts with me.

“My opinion is that what Facebook did here is unethical, but it is a fuzzy boundary,” he said.

“For example, Facebook (and others) conduct so-called A/B studies all the time where they present different interfaces, or different ads, or use different algorithms to different customers and measure things like time spent on the page, click rate, buying, et cetera,” Robertson explained. “If you think about it, they are purposely manipulating the experience and emotions of users in these situations as well, but somehow this seems OK to me.”

“This is a bit of a new frontier, and we will see a lot of this type of thing in the future,” he added.
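The A/B testing Robertson describes is typically implemented by hashing each user ID into an experiment bucket, so a given user sees the same variant on every visit. A minimal sketch, with a hypothetical experiment name and a made-up 1% treatment share — not anything Facebook has disclosed:

```python
# A minimal sketch of deterministic A/B bucketing, in the spirit of the
# testing Robertson describes. The experiment name and 1% treatment
# share below are illustrative assumptions.
import hashlib

def assign_bucket(user_id, experiment, treatment_share=0.01):
    """Hash (experiment, user) so a user lands in the same bucket on every visit."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    point = int(digest[:8], 16) / 0xFFFFFFFF  # map to [0.0, 1.0]
    return "treatment" if point < treatment_share else "control"

print(assign_bucket("user-12345", "news-feed-emotion-study"))
```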

I don’t know if I was one of the hundreds of thousands of Facebook users included in this study, but I definitely feel manipulated and grumpy. So either way, their test worked.

11 Responses

  1. merzmensch says:

    Very good points, Ryan!
    After all, if accepting Facebook’s guidelines means you are from now on a guinea pig for any possible research without being separately notified, this subconscious terror will influence our online communication, and can influence our offline communication as well.

  2. Bob says:

    I’m totally going to contact Facebook and demand a refund.

  3. Chris says:

    Apparently, informed consent rules don’t officially apply to private companies conducting their own research

    My understanding is that this is not quite right. There are two different issues here:

    PNAS (the journal) has a policy requiring IRB ethics review for all published studies that experiment on humans, regardless of whether academic or corporate[1]. A Cornell press release[2] says this work was also funded by the Army Research Office, which is inside DoD, and DoD *also* requires IRB review before publication for any human subjects research.

    So that’s one question: did this study go through IRB review? If not, how is it not violating PNAS’ own guidelines? Whether an IRB reviewed it (and which IRB it was) is supposed to be disclosed inside the paper in either case, and it isn’t.

    A second question: did the users give informed consent? Either the study was not conducting human subjects research — an incredible claim for a study that manipulated the emotions of ~700k people — or it was, and the arbiter of whether informed consent was received is the review board, *not* the researchers themselves. So the researchers’ own opinion on whether they obtained informed consent shouldn’t be relevant.

    [1]: http://www.pnas.org/site/authors/journal.xhtml (vii)

    [2]: http://www.news.cornell.edu/stories/2014/06/news-feed-emotional-contagion-sweeps-facebook

  4. ssasse says:

    Chris, thanks for putting that together.

  5. Gwynne Ash says:

    The issue is both the “informed” part of informed consent (which a blanket TOS cannot adequately cover), and the nature of the research, which sought to create negative affect in the research subjects through deception (withholding positive posts).

    From Cornell’s own IRB website:
    “Fully informing participants of the risks, benefits, and procedures involved in a study is a standard requirement in research with human participants. Ethically and legally, consent is not considered to be “informed” unless the investigator discloses all the facts, risks, and discomforts that might be expected to influence an individual’s decision to willingly participate in a research protocol. This applies to ALL types of research including surveys, interviews, and observations in which participants are identified, and other experiments, such as diet, drug and exercise studies. For a complete list of the components of informed consent that are considered essential by the Cornell IRB, please refer to SOP #10, Informed Consent Options, Processes, and Documentation.” http://www.irb.cornell.edu/faq/#con1

    Further:
    “Do I always have to obtain the informed consent of research participants?
    In general, yes, but there are some limited exceptions. The Cornell IRB is responsible for ensuring that basic ethical principles are abided by in all research. The expectation that the informed consent of research participants be obtained is based upon the Belmont principle of respect for persons, and regarded as extremely important in conducting ethical research. The IRB has the authority to waive some or all of the federal requirements for informed consent in certain extenuating circumstances. A request for waiver of informed consent must be specifically justified by the researcher in the proposal to the IRB.”

    When using deception in human subjects research you have to meet very specific protocols for using deception, none of which were met by this research. At the very least when you use deception, you are required to disclose your deception to participants after the research, and explain why it was used. As people still do not know if they were used as research subjects, there is no way for them to be debriefed, and there was no attempt to debrief them.

    From the American Psychological Association’s guidelines on ethical research, using deception:

    “After the data are collected, the investigator provides the participant with information about the nature of the study and attempts to remove any misconceptions that may have arisen. Where scientific or humane values justify delaying or withholding this information, the investigator incurs a special responsibility to monitor the research and to ensure that there are no damaging consequences for the participant.”

    These violations are especially concerning because federal research funds, from the Army, were used to underwrite the research.

    I believe that if the Cornell IRB granted a waiver of informed consent in this research, they violated their responsibility to protect the rights of human subjects. It appears likely that only the Cornell IRB reviewed the study, as the UCSF faculty member was a post-doc at Cornell when the research was conducted (according to Cornell’s own press release).

  6. Chris says:

    This story sure is exciting. The PNAS editor has come forward to say that she checked that the authors had passed university IRB review, but Forbes is now reporting that the editor is incorrect, that there has been a misunderstanding, and that only Facebook internal review happened:

    http://www.forbes.com/sites/kashmirhill/2014/06/29/facebook-doesnt-understand-the-fuss-about-its-emotion-manipulation-study/

    I suspect there are still more details we’re missing, but if the events happened as described above, PNAS would have little choice but to retract the paper for failing to obtain the required IRB review.

  7. Gwynne Ash says:

    Unfortunately, I think the only lesson that FB has learned is that they can no longer publish their research as science (and co-author with university researchers in those endeavors).

    It’s a step. But, as you and the Forbes author point out, it still means that they miss the point.

  8. Gwynne Ash says:

    I’d also add that, as you only have to be 13 to have an FB page, they likely also used as research participants, without informed consent, a high-risk population of minor children. THAT is a big no-no.

    99-to-1 they did not exclude data from minor children from their analysis.

  9. Gwynne Ash says:

    http://mediarelations.cornell.edu/2014/06/30/media-statement-on-cornell-universitys-role-in-facebook-emotional-contagion-research/

    Cornell is saying it didn’t know anything about the lack of informed consent for data collection. It only approved data analysis of a pre-existing data set. Which means it is just ducking and covering.

    I predict PNAS will withdraw the paper.

  10. Chris says:

    Cornell’s position here — that its researchers can be involved in the design of a study, hand it off to a company with no oversight to perform the experiment, and then come back to interpret and publish the results — seems incredible. (As in, “not credible”.)

    Their position is inconsistent with the paper, too. Cornell’s press release suggests that the researchers were only involved in the data analysis and not the design, yet the “author contributions” section of the paper says the Cornell researchers *did* design the research and *did not* perform the data analysis:

    Author contributions: A.D.I.K., J.E.G., and J.T.H. designed research; A.D.I.K. performed research; A.D.I.K. analyzed data; and A.D.I.K., J.E.G., and J.T.H. wrote the paper.

  11. Gwynne Ash says:

    Absolutely, Chris. It smells to high heaven.

