In Memoriam: John D. Arras (1945-2015)

By Michelle Meyer

I am deeply saddened to report that bioethicist John D. Arras died on March 9, 2015. John was the Porterfield Professor of Bioethics and Professor of Philosophy at the University of Virginia, where he directed the undergraduate bioethics program, held an additional appointment at the School of Medicine’s Center for Biomedical Ethics and Humanities, and over the years co-taught multiple courses at the Law School. He was a leading figure in the field of bioethics and held several prestigious appointments beyond UVa, including, at the time of his death, service as a Fellow of The Hastings Center and as a commissioner of the Presidential Commission for the Study of Bioethical Issues (about whose recent Ebola report he spoke to a journalist just days ago). He also consulted regularly at the National Institutes of Health and was a founding member of the ethics advisory board of the Centers for Disease Control and Prevention.

John’s scholarly focus in bioethics was two-fold. First, like most bioethicists, John tackled concrete practical ethical problems involving medicine, public health, and the biosciences. His interests in this regard were fairly broad, but he focused on physician-assisted suicide, public health, human subjects research, and what justice requires in the way of access to health care. Read More

Will the Real Evidence-Based Ebola Policy Please Stand Up? Seven Takeaways From Maine DHHS v. Hickox

By Michelle Meyer


The case I mentioned in my last post, Maine Department of Health and Human Services v. Kaci Hickox, is no more. Hickox and public health officials agreed to stipulate to a final court order imposing on Hickox the terms that the court had imposed on her in an earlier, temporary order. Until Nov. 10, when the 21-day incubation period for Ebola ends, Hickox will submit to “direct active monitoring” and coordinate her travel with Maine public health authorities to ensure that such monitoring occurs uninterrupted. She has since said that she will not venture into town or other public places, although she is free to do so.

In a new post at The Faculty Lounge,* I offer a detailed account of the case, which suggests the following lessons:

  1. As Hickox herself described it, the result of her case is a “compromise,” reflecting neither what Hickox nor Maine initially wanted.
  2. That compromise was achieved by the parties availing themselves of the legal process, not through Hickox’s civil disobedience.
  3. The compromise is not easily described, as it has been, as a victory of science-based federal policy over fear-based state demagoguery. By the time the parties got to court, and perhaps even before then, what Maine requested was consistent with U.S. CDC Guidance, albeit a strict application of it. What Hickox had initially offered to do, by contrast, fell below even the most relaxed application of those guidelines, although by the time the parties reached court, she had agreed to comply with that minimum.
  4. The compromise applies only to Hickox, and was based on a stipulation by the parties to agree to the terms that the court had temporarily imposed after reviewing a limited evidentiary record. Additional evidence and legal arguments that the state might have raised in the now-cancelled two-day hearing could have resulted in a different outcome.
  5. A substantially different outcome, however, would have been unlikely under Maine’s public health statute. Indeed, it is not clear that the statute allows public health authorities to compel asymptomatic people at risk of developing Ebola to do anything, including complying with minimum CDC recommendations.
  6. “Quarantine” is a charged, but ambiguous, term. It allows us to talk past one another, to shorthand and needlessly politicize a much-needed debate about appropriate policy, and to miss the fact that the CDC Guidance in some cases recommends what could be fairly described as a “quarantine” for people like Hickox and requires it for asymptomatic people with stronger exposure to Ebola (but who are still probably less likely to get sick than not).
  7. It’s not clear who has bragging rights to Ebola policy “grounded in science,” or what that policy looks like.

* The piece is quite long, and I cannot bear the fight with the WordPress formatting demons that it would require to cross-post it here.

Above the (Public Health) Law: Healthcare Worker Deception and Disobedience in a Time of Distrust

By Michelle Meyer

[Author’s Note: Addendum and updates (latest: 4 pm, 10/31) added below.]

A physician shall… be honest in all professional interactions, and strive to report physicians… engaging in fraud or deception, to appropriate entities.
AMA Principles of Medical Ethics

This is a troubling series of news reports about deception and defiance on the part of some healthcare workers (HCWs) in response to what they believe to be unscientific, unfair, and/or unconstitutional public health measures (to be clear, the text is not mine (until after the jump); it’s cut and pasted, in relevant part, from the linked sources):

(1) Ebola Aide Doc: I’m Not Telling My Team To Tell The Truth

Gavin Macgregor-Skinner, an epidemiologist and Global Projects Manager for the Elizabeth R. Griffin Foundation, who has led teams of doctors to treat Ebola in West Africa, reported that he “can’t tell them [his doctors] to tell the truth [to U.S. officials]” on Monday’s “CNN Newsroom.”

“At the moment these people are so valuable . . . I have to ensure they come back here, they get the rest needed. I can’t tell them to tell the truth at the moment because we’re seeing so much irrational behavior,” he stated. “I’ve come back numerous times between the U.S. and West Africa. If I come back now and say ‘I’ve been in contact with Ebola patients,’ I’m going to be locked in my house for 21 days,” Macgregor-Skinner said, explaining why he was not being truthful with officials. He added, “When I’m back here in the US, I am visiting US hospitals every day helping them get prepared for Ebola. You take me out for three weeks, who’s going to replace me and help US hospitals get ready now? Those gaps can’t be filled.”

He argued that teams of doctors and nurses could be trusted with the responsibility of monitoring themselves, stating, “When I bring my team back we are talking each day on video conferencing, FaceTime, Skype, text messaging, supporting each other. As soon as I feel sick I’m going to stay at home and call for help, but I’m not going to go to a Redskins game here in Washington D.C. That’s irresponsible, but I need to get back to these hospitals and help them be prepared.”

UPDATE: Here is the CNN video of his remarks.

(2) Ebola Doctor ‘Lied’ About NYC Travels

The city’s first Ebola patient initially lied to authorities about his travels around the city following his return from treating disease victims in Africa, law-enforcement sources said. Dr. Craig Spencer at first told officials that he isolated himself in his Harlem apartment — and didn’t admit he rode the subways, dined out and went bowling until cops looked at his MetroCard, the sources said. “He told the authorities that he self-quarantined. Detectives then reviewed his credit-card statement and MetroCard and found that he went over here, over there, up and down and all around,” a source said. Spencer finally ’fessed up when a cop “got on the phone and had to relay questions to him through the Health Department,” a source said. Officials then retraced Spencer’s steps, which included dining at The Meatball Shop in Greenwich Village and bowling at The Gutter in Brooklyn.

Update 11PM, 10/30: A spokesperson for the NYC health department has now disputed the above story, which cites anonymous police officer sources, in a statement provided to CNBC. The spokesperson said: “Dr. Spencer cooperated fully with the Health Department to establish a timeline of his movements in the days following his return to New York from Guinea, providing his MetroCard, credit cards and cellphone.” . . . When CNBC asked again if Spencer had at first lied to authorities or otherwise misled them about his movements in the city, Lewin replied: “Please refer to the statement I just sent. As this states, Dr. Spencer cooperated fully with the Health Department.”

(3) Ebola nurse in Maine rejects home quarantine rules [the WaPo headline better captures the gist: After fight with Chris Christie, nurse Kaci Hickox will defy Ebola quarantine in Maine]

Kaci Hickox, the Ebola nurse who was forcibly held in an isolation tent in New Jersey for three days, says she will not obey instructions to remain at home in Maine for 21 days. “I don’t plan on sticking to the guidelines,” Hickox tells TODAY’s Matt Lauer. “I am not going to sit around and be bullied by politicians and forced to stay in my home when I am not a risk to the American public.”

Maine health officials have said they expect her to agree to be quarantined at her home for a 21-day period, The Bangor Daily News reports. But Hickox, who agreed to stay home for two days, tells TODAY she will pursue legal action if Maine forces her into continued isolation. “If the restrictions placed on me by the state of Maine are not lifted by Thursday morning, I will go to court to fight for my freedom,” she says.

Some thoughts on these reports, after the jump.  Read More

Facebook Rumored To Be Planning Foray Into the Online Health Space

By Michelle Meyer

Reuters broke the story on Friday, citing anonymous sources:

The company is exploring creating online “support communities” that would connect Facebook users suffering from various ailments. . . . Recently, Facebook executives have come to realize that healthcare might work as a tool to increase engagement with the site. One catalyst: the unexpected success of Facebook’s “organ-donor status initiative,” introduced in 2012. The day that Facebook altered profile pages to allow members to specify their organ-donor status, 13,054 people registered to be organ donors online in the United States, a 21-fold increase over the daily average of 616 registrations . . . . Separately, Facebook product teams noticed that people with chronic ailments such as diabetes would search the social networking site for advice, said one former Facebook insider. In addition, the proliferation of patient networks such as PatientsLikeMe demonstrates that people are increasingly comfortable sharing symptoms and treatment experiences online. . . . Facebook may already have a few ideas to alleviate privacy concerns around its health initiatives. The company is considering rolling out its first health application quietly and under a different name, a source said.

I’m quoted in this International Business Times article about Facebook’s rumored plans. After the jump is the full statement I provided to the reporter (links added).  Read More

Facebook Announces New Research Policies


By Michelle Meyer

A WSJ reporter just tipped me off to this news release by Facebook regarding the changes it has made in its research practices in response to public outrage about its emotional contagion experiment, published in PNAS. I had a brief window of time in which to respond with my comments, so these are rushed and a first reaction, but for what they’re worth, here’s what I told her (plus links and less a couple of typos):

There’s a lot to like in this announcement. I’m delighted that, despite the backlash it received, Facebook will continue to publish at least some of its research in peer-reviewed journals and to post reprints of that research on its website, where everyone can benefit from it. It’s also encouraging that the company acknowledges the importance of user trust and has expressed a commitment to better communicate its research goals and results.

As for Facebook’s promise to subject future research to more extensive review by a wider and more senior group of people within the company, with an enhanced review process for research that concerns, say, minors or sensitive topics, it’s impossible to assess whether this is ethically good or bad without knowing a lot more about both the people who comprise the panel and their review process (including but not limited to Facebook’s policy on when, if ever, the default requirements of informed consent may be modified or waived). It’s tempting to conclude that more review is always better. But research ethics committees (IRBs) can and do make mistakes in both directions – by approving research that should not have gone forward and by unreasonably thwarting important research. Do Facebook’s law, privacy, and policy people have any training in research ethics? Is there any sort of appeal process for Facebook’s data scientists if the panel arbitrarily rejects their proposal? These questions are the tip of the iceberg of the challenges that academic IRBs continue to face, and I fear that we are unthinkingly exporting an unhealthy system into the corporate world. Discussion is just beginning among academic scientists, corporate data scientists, and ethicists about the ethics of mass-scale digital experimentation (see, ahem, here and here). It’s theoretically possible, but unlikely, that in its new, but unclear, guidelines and review process Facebook has struck the optimal balance among the competing values and interests that this work involves.  Read More

Conference on Digital Experimentation (CODE) at MIT Sloan

Another stop on my fall Facebook/OKCupid tour: on October 10, I’ll be participating on a panel (previewed in the NYT here) on “Experimentation and Ethical Practice,” along with Harvard Law’s Jonathan Zittrain, Google chief economist Hal Varian, my fellow PersonalGenomes.org board member and start-up investor Esther Dyson, and my friend and Maryland Law prof Leslie Meltzer Henry.

The panel will be moderated by Sinan Aral of the MIT Sloan School of Management, who is also one of the organizers of a two-day Conference on Digital Experimentation (CODE), of which the panel is a part. The conference, which brings together academic researchers and data scientists from Google, Microsoft, and, yes, Facebook, may be of interest to some of our social scientist readers. (I’m told registration space is very limited, so “act soon,” as they say.) From the conference website:

The ability to rapidly deploy micro-level randomized experiments at population scale is, in our view, one of the most significant innovations in modern social science. As more and more social interactions, behaviors, decisions, opinions and transactions are digitized and mediated by online platforms, we can quickly answer nuanced causal questions about the role of social behavior in population-level outcomes such as health, voting, political mobilization, consumer demand, information sharing, product rating and opinion aggregation. When appropriately theorized and rigorously applied, randomized experiments are the gold standard of causal inference and a cornerstone of effective policy. But the scale and complexity of these experiments also create scientific and statistical challenges for design and inference. The purpose of the Conference on Digital Experimentation at MIT (CODE) is to bring together leading researchers conducting and analyzing large scale randomized experiments in digitally mediated social and economic environments, in various scientific disciplines including economics, computer science and sociology, in order to lay the foundation for ongoing relationships and to build a lasting multidisciplinary research community.
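
To make concrete what the simplest of these digitally mediated randomized experiments looks like in practice, here is a minimal, purely illustrative sketch in Python. The data, engagement metric, and effect size are hypothetical and are not drawn from any study or platform discussed at the conference: simulated users fall into a treatment or control condition, and the difference in mean outcomes estimates the average treatment effect.

```python
# A minimal, purely illustrative A/B-test sketch with hypothetical data.
# It is not drawn from any study or platform discussed above.
import random
import statistics

random.seed(0)

def simulate_user(treated: bool) -> float:
    """Return a hypothetical engagement score for one user."""
    baseline = random.gauss(10.0, 2.0)   # hypothetical baseline engagement
    effect = 0.3 if treated else 0.0     # hypothetical true treatment effect
    return baseline + effect

# Simulated outcomes for the treatment and control groups.
treatment = [simulate_user(True) for _ in range(100_000)]
control = [simulate_user(False) for _ in range(100_000)]

# Under randomized assignment, the difference in means is an unbiased
# estimate of the average treatment effect (ATE).
ate = statistics.mean(treatment) - statistics.mean(control)

# Rough standard error for the difference of two independent means.
se = (statistics.variance(treatment) / len(treatment)
      + statistics.variance(control) / len(control)) ** 0.5

print(f"Estimated ATE: {ate:.3f} (SE ~ {se:.3f})")
```

Randomization is what licenses reading that difference causally; the scientific and statistical challenges the conference description mentions arise when experiments involve network interference, many simultaneous treatments, or adaptive assignment, none of which this toy example captures.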

Fall Facebook/OKCupid and Future of Research Tour

Sept. 18 Tweet Chat

By Michelle Meyer

I’m participating in several public events this fall pertaining to research ethics and regulation, most of them arising out of my recent work (in Wired and in Nature and elsewhere) on how to think about corporations conducting behavioral testing (in collaboration with academic researchers or not) on users and their online environments (think the recent Facebook and OKCupid experiments). These issues raise legal and ethical questions at the intersection of research, business, informational privacy, and innovation policy, and the mix of speakers at most of these events reflects that.  Read More

My Slate Article on the Importance of Replicating Science

By Michelle Meyer

I have a long article in Slate (with Chris Chabris) on the importance of replicating science. We use a recent (and especially bitter) dispute over the failure to replicate a social psychology experiment as an occasion for discussing several things of much broader import, including:

  • The facts that replication, despite being a cornerstone of the scientific method, is rarely practiced (and even less frequently published) not only in psychology but across science, and that when such studies are conducted, they frequently fail to replicate the original findings (let this be a warning to those of you who, like me, cite empirical literature in your scholarship);
  • Why replications are so rarely conducted and published, relative to their importance (tl;dr: it’s the incentives, stupid);
  • Why it’s critical that this aspect of the academic research culture change (because academic science doesn’t only affect academic scientists; the rest of us have a stake in science, too, including those who fund it, those who help researchers produce it (i.e., human subjects), those who consume and build on it (other scholars and policy-makers), and all of us who are subject to myriad laws and policies informed by it); and
  • Some better and worse ways of facilitating that cultural change (among other things, we disagree with Daniel Kahneman’s most recent proposal for conducting replications).

Michelle Meyer: Misjudgements Will Drive Social Trials Underground

Michelle Meyer has a new piece in Nature – an open letter on the Facebook study signed by a group of bioethicists (including PFC’s Executive Director Holly Fernandez Lynch) in which she argues that a Facebook study that manipulated news feeds was not definitively unethical and offered valuable insight into social behavior.

From the piece:

“Some bioethicists have said that Facebook’s recent study of user behavior is “scandalous”, “violates accepted research ethics” and “should never have been performed”. I write with 5 co-authors, on behalf of 27 other ethicists, to disagree with these sweeping condemnations (see go.nature.com/XI7szI).

We are making this stand because the vitriolic criticism of this study could have a chilling effect on valuable research. Worse, it perpetuates the presumption that research is dangerous.”

Read the full article.

How an IRB Could Have Legitimately Approved the Facebook Experiment—and Why that May Be a Good Thing

Image courtesy Flickr

By Michelle Meyer

By now, most of you have probably heard—perhaps via your Facebook feed itself—that for one week in January of 2012, Facebook altered the algorithms it uses to determine which status updates appeared in the News Feed of 689,003 randomly selected users (about 1 of every 2500 Facebook users). The results of this study—conducted by Adam Kramer of Facebook, Jamie Guillory of the University of California, San Francisco, and Jeffrey Hancock of Cornell—were just published in the Proceedings of the National Academy of Sciences (PNAS).

Although some have defended the study, most have criticized it as unethical, primarily because the closest that these 689,003 users came to giving voluntary, informed consent to participate was when they—and the rest of us—created a Facebook account and thereby agreed to Facebook’s Data Use Policy, which in its current iteration warns users that Facebook “may use the information we receive about you . . . for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Some of the discussion has reflected quite a bit of misunderstanding about the applicability of federal research regulations and IRB review to various kinds of actors, about when informed consent is and isn’t required under those regulations, and about what the study itself entailed. In this post, after going over the details of the study, I explain (more or less in order):

  • How the federal regulations define “human subjects research” (HSR)
  • Why HSR conducted and funded solely by an entity like Facebook is not subject to the federal regulations
  • Why HSR conducted by academics at some institutions (like Cornell and UCSF) may be subject to IRB review, even when that research is not federally funded
  • Why involvement in the Facebook study by two academics nevertheless probably did not trigger Cornell’s and UCSF’s requirements of IRB review
  • Why an IRB—had one reviewed the study—might plausibly have approved the study with reduced (though not waived) informed consent requirements
  • And why we should think twice before holding academics to a higher standard than corporations

Read More