Academic Freedom and Responsibility

by Suzanne M. Rivera, Ph.D.

Earlier this month, the American Association of University Professors (AAUP) recommended that researchers be trusted to decide whether individual studies involving human subjects should be exempt from regulation.  The AAUP’s report, which was prepared by a subcommittee of the Association’s Committee on Academic Freedom and Tenure, proposes that minimal-risk research be exempt from the human research protection regulations and that faculty be allowed to determine when such an exemption applies to their own projects.

Specifically, the report states, “Research on autonomous adults should be exempt from IRB approval (straightforwardly exempt, with no provisos and no requirement of IRB approval of the exemption) if its methodology either (a) imposes no more than minimal risk of harm on its subjects, or (b) consists entirely in speech or writing, freely engaged in, between subject and researcher.”

These recommendations, designed to address long-standing concerns among social scientists about bureaucratic intrusions into their work, are misguided and could result in real harm to research subjects.

You Talkin’ to Me?

by Suzanne M. Rivera, Ph.D.

The principle of justice articulated in The Belmont Report requires equitable selection of human research subjects.  Equitable in this context means that the risks and benefits of the study are distributed fairly.  Fairness has two components: 1) avoiding exploitation of the vulnerable (e.g., preying upon a poor, uneducated population) and 2) avoiding the unjustified exclusion of any population (whether out of bigotry, laziness, or convenience).

Recruitment strategies invariably shape the selection of research subjects and the extent to which a pool of participants really represents a cross-section of society.  Institutional Review Boards (IRBs) are charged with evaluating whether study recruitment plans and materials used to obtain informed consent are easily understood and free of misleading information.  This is relatively straightforward when researchers, IRB members, and study subjects all speak the same language.  But when studies are done in geographical areas that include numerous cultural and language communities, it can be quite tricky.

One of the barriers that prevent people from enrolling in (or even knowing about) studies is a lack of awareness and planning by researchers to address language differences.  The human research protection regulations at 45 CFR 46.116 require that informed consent information be provided to research participants (or their representatives) in language understandable to them.  IRBs are supposed to be vigilant about this and require investigators to obtain translated Informed Consent Documents (ICDs) for use with non-English-speaking research subjects.  But researchers commonly balk at this expectation, saying it’s unreasonable.  (A disproportionate number of objections have been put to me along these lines: “And what am I supposed to do if someone shows up speaking only Swahili?!”)

Accentuate the Negative

by Suzanne M. Rivera, Ph.D.

While attending the annual Advancing Ethical Research Conference of Public Responsibility in Medicine and Research (PRIM&R) last month in San Diego, I had the opportunity to hear a talk by Dr. John Ioannidis, in which he debunked commonly accepted scientific “truths.”  Calling upon his own work, which is focused on looking critically at published studies to examine the strength of their claims (see his heavily downloaded 2005 paper “Why Most Published Research Findings Are False”), Ioannidis raised important questions for those of us who think about research ethics, and who oversee and manage the research conducted at universities and scientific institutes across the country.

Ioannidis persuasively argued that our system of publishing only studies with statistically significant positive findings has produced a bizarre kind of reality in which studies that found “negative” results are virtually never reported.  Negative results are suppressed because nobody is interested in publishing them.  Editors and reviewers have a major role in this problem; they choose not to publish studies that are not “sexy.”  This artificially inflates the proportion of observed “positive” results, and it discourages a scientist from even writing up a null result, because she knows what it takes to get published.
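
Ioannidis’s point can be made concrete with a toy simulation (a minimal sketch in Python; the share of true hypotheses, the significance threshold, and the statistical power below are illustrative assumptions for the example, not figures from his paper):

    import random

    random.seed(0)  # reproducible illustration

    STUDIES = 100_000   # hypothetical studies submitted for publication
    PRIOR_TRUE = 0.10   # assumed share of tested hypotheses that are actually true
    ALPHA = 0.05        # conventional false-positive rate of the significance test
    POWER = 0.80        # assumed chance a real effect reaches significance

    published_true = published_false = 0
    for _ in range(STUDIES):
        effect_is_real = random.random() < PRIOR_TRUE
        significant = random.random() < (POWER if effect_is_real else ALPHA)
        if significant:  # journals in this model publish only "positive" results
            if effect_is_real:
                published_true += 1
            else:
                published_false += 1

    share_false = published_false / (published_true + published_false)
    print(f"Share of published findings that are false: {share_false:.0%}")
    # Under these assumptions, roughly a third of published "positive"
    # findings are false, even though every study used a 5% alpha.

Filtering the literature on significance alone is what does the damage: a modest base rate of true hypotheses yields a published record in which a large share of “positive” findings are false.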

But isn’t there an ethical obligation to publish so-called negative results?  In human research, people give their time and undergo risks for the conduct of a study.  Their sacrifices are not meaningful if the results are never shared.  Furthermore, negative results tell us something important.  And if they are not published, some other research team somewhere else may unknowingly repeat a study, putting a new batch of subjects at risk, to investigate a question for which the answer is already known.  Finally, to the extent a study is conducted using taxpayer dollars, the data derived should be considered community property, and there are opportunity costs associated with unnecessarily repetitive work.

Film Review: How to Survive a Plague

By Suzanne M. Rivera

How to Survive a Plague is a moving chronicle of the onset of the AIDS epidemic as seen through the lens of the activists who mobilized to identify and make available the effective treatments we have today.  Beginning at the start of the epidemic, when little was known about HIV and even hospitals were refusing to treat AIDS patients out of fear of contagion, the film follows leaders of the groups ACT-UP and TAG.  Using archival footage interspersed with current-day interviews, it tells the story of how patients and concerned allies pushed the research community to find a way to treat what was then a lethal disease.

The film’s portrayal of the U.S. Government, specifically then-President George H. W. Bush and high-ranking officials in the Food and Drug Administration, is damning.  As hundreds of thousands of people became infected with HIV and the death toll rose, prejudice against marginalized groups (especially gay men and IV drug users) contributed to a lack of urgency about the need to learn how to stop the spread of the virus and how to treat the opportunistic infections that killed people with full-blown AIDS.  In contrast, footage of demonstrations, meetings, and conferences highlights the courage of the activists who risked and endured discrimination, beatings, and arrests to bring attention to the need for more research.

But How to Survive a Plague is more than a documentary about the power people have to make change when they join together to demand action.  It also is a provocative commentary about unintended consequences.  I saw the film while attending the annual Advancing Ethical Research Conference of Public Responsibility in Medicine and Research (PRIM&R).  In that context, I was especially interested in the way How to Survive a Plague highlights a difficult ethical issue in clinical research: the problem of protecting people so much from research risks that the protection itself causes harm.

Reporting Information about Clinical Trial Data: Passing the Torch from HHS to the FDA

By Leslie Francis

In 2007, motivated by concerns that pharmaceutical companies were not sharing negative findings from clinical trials, Congress established enhanced reporting requirements through the Food and Drug Administration Amendments Act, which expanded the registration and results-reporting obligations tied to ClinicalTrials.gov.

A series of articles published in January 2012 in the British Medical Journal demonstrates that data reporting remains deeply problematic, especially for industry-sponsored trials.  (The articles are very much worth reading.)

A notice posted Sept. 26 in the Federal Register indicates that the Secretary of HHS has delegated authority to oversee the reporting process to the FDA.  Whether this signals improved monitoring of clinical trial data submissions remains to be seen.  One can only hope.