Scott Monty - Strategic Communications & Leadership Advisor

Repeat offender

We're all aware that Facebook doesn't allow us to see all updates from all of our friends or the brand pages we like in our News Feed. It's simply not feasible, given the amount of content that is shared by those to whom we're connected. The decline of organic reach of brand pages is a result of this phenomenon.

But to some, Facebook may have overstepped the boundary with the revelation of certain research practices. This weekend, the public learned of a Facebook research study published in the Proceedings of the National Academy of Sciences as "Experimental evidence of massive-scale emotional contagion through social networks" by Facebook data scientist Adam Kramer, Jamie Guillory of UCSF and Jeffrey Hancock of Cornell University.

The study analyzed the News Feeds of more than 600,000 Facebook users — feeds that had been tinkered with to show either more positive or more negative emotional content. The goal was to determine whether emotions shared over a social network are powerful enough to cause similar emotions in those who saw them. In other words, the researchers were testing whether happy posts made users feel happy and post happy things in response, and likewise with negative posts. You can read more about this on Forbes, The Atlantic, the A.V. Club and elsewhere.

The problem is, no one knew this was going on.

Read Facebook's Terms

"This kind of thing is covered in Facebook's Terms," I hear you saying. You're correct; it is covered. In fact, here's the specific section - although it's not in the Terms themselves, but two clicks down, in the Data Use Policy > Information we receive and how it is used section. [You'll have to forgive me; the roughly 9,000 words in the combined documents are a little difficult to sort through, even for this professional.]

Here's what the relevant section says:
How we use the information we receive
We use the information we receive about you in connection with the services and features we provide to you and other users like your friends, our partners, the advertisers that purchase ads on the site, and the developers that build the games, applications, and websites you use. For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you:
  • for internal operations, including troubleshooting, data analysis, testing, research and service improvement. [Emphasis added]
[UPDATE: the word "research" was added to the Data Use Policy in April 2012, four months after the research study began.]

Interestingly, that very same section also contains this phrase:

Your trust is important to us

We'll get back to the issue of trust in a moment.

The Ethics of Researchers

The 20th century saw some of the greatest advances in science and understanding, but it also holds the distinction of being an era during which many human experiments were conducted illegally or unethically, without the consent or knowledge of the test subjects. One of the most notorious is the Tuskegee syphilis experiment conducted by the U.S. Public Health Service from 1932-1972.

Tuskegee syphilis experiment blood sample

The ethics of researchers in modern medicine, public health, psychology, marketing and the like have come a long way since those comparatively barbaric times.

Whether we regard the Facebook study as one in the field of psychology or one in market research, there are ethical standards that each field requires of its members. Let's look at each in turn. For brevity, only the relevant sections have been included.

Marketing Research Association Code of Marketing Research Standards

General Conduct

2. Protect the rights of respondents, including the right to refuse to participate in part or all of the research process.  Researchers must respect the bounds of cooperation set by respondents, who control the parameters under which information is given. In practice, this means all of the following:
  • Respondent agreement to participate in research must be obtained upfront, rather than after the fact.
  • Consent must be granted freely, without coercion.
  • Consent may be withdrawn by the respondent at any point during the contact.
  • Consent must be granted expressly for participation in any subsequent studies.
  • An explicit opt-out request for any future contact or participation at any point during the process will be honored.
  • All reasonable precautions are taken so that respondents are in no way adversely affected as a result of their participation in a marketing research project.
Exceptions: In limited circumstances of passive user data collection, no opportunity may exist for respondents to refuse to participate.

Purpose of Use

6. Obtain consent from respondents prior to utilizing their data in a manner materially different from that to which the respondent has agreed.


8. Make factually correct statements to secure cooperation, including for database/sample development, and honor all promises made to respondents including but not limited to the use of data.

Exceptions: In limited instances, bona fide research projects may require, as part of their design, that respondents remain unaware of specific details such as in message testing. In such cases, upfront instructions to respondents should be truthful and furnish as much information as possible for a respondent to provide informed consent and they should be fully debriefed upon conclusion of contact, when applicable.

One could argue that an up-front agreement to terms - terms in which the research clause is buried - does not give users a meaningful chance to refuse to participate or the ability to opt out. One could also argue that this research constitutes a "subsequent study" and thus requires consent separate from that granted in the Terms/Data Use Policy.

The American Psychological Association has a similar set of standards. If we treat the research as a grand psychological experiment (not much of a stretch, since it studied moods and emotional reactions to stimuli), it's logical to apply the APA's ethical standards to the effort. Incidentally, some psychologists have found a flaw in the research.

APA Ethical Principles of Psychologists and Code of Conduct

Principle A: Beneficence and Nonmaleficence
Psychologists strive to benefit those with whom they work and take care to do no harm.

Principle B: Fidelity and Responsibility
Psychologists establish relationships of trust with those with whom they work.

Principle C: Integrity
Psychologists seek to promote accuracy, honesty, and truthfulness in the science, teaching and practice of psychology.

Principle E: Respect for People's Rights and Dignity
Psychologists respect the dignity and worth of all people, and the rights of individuals to privacy, confidentiality and self-determination. Psychologists are aware that special safeguards may be necessary to protect the rights and welfare of persons or communities whose vulnerabilities impair autonomous decision making. [such as Facebook users?]

While the study may pass the sniff test for legal, regulatory, and perhaps even some ethical purposes, it's particularly instructive to note that Susan Fiske, the editor of the PNAS paper, thought it was "creepy."


When a professional who regularly grapples with ethical issues thinks a practice is creepy, you can imagine what the public thinks.

This is, of course, a public that has lived through a variety of missteps by Facebook - missteps that are typically backtracked after initial public outrage, and then slowly but surely incorporated as users get used to the new world order. The history of criticism of Facebook is long and well documented.

With such a track record with the public, one would think that Facebook would need to be particularly vigilant about anything that would violate the public's trust at this point. But perhaps Facebook is confident that it has the attention and commitment from its 1.2 billion users - so much so that it needn't worry about angering some or losing their trust.

My contention is that most of the trust issues people have with Facebook could have been handled with better communication. 

A number of companies in the digital realm make product changes first and communicate them later - in some cases, only when they're called out by irate users. Of course, the nature of product development in these organizations is that features need to be tweaked and new releases shipped before the competition discovers them, because lead times are very short.

But this cycle of abusive behavior (infraction, emotional response, apology, reaffirmation of commitment) will continue as long as consumers allow it to.

Meanwhile, perhaps some forward-thinking communicators can get out in front of these incidents and offer transparency and leadership in how companies work hand in hand with their users for better experiences.

If they don't, they're at risk of letting their users simply feel used.

UPDATE (7/2/2014): Sheryl Sandberg, COO of Facebook, even admits that "it was communicated poorly." (via WSJ)

Image source: imlsdcc (Flickr)
