Machine Learning in Medicine: Addressing Ethical Challenges

Machine learning in medicine is advancing at an incredible rate, ushering in a new era of ethical and regulatory challenges in the clinic.

In a new paper published in PLOS Medicine, Effy Vayena, Alessandro Blasimme, and I. Glenn Cohen spell out these ethical challenges and offer suggestions for how Institutional Review Boards (IRBs), medical practitioners, and developers can ethically deploy machine learning in medicine (MLm).


Drug Pricing Controls and the Power of Familiar Ideas

Eight in ten Americans think that prescription drug prices are unreasonable, according to a March 2018 Kaiser poll. That same poll found that more Americans considered passing legislation to lower drug pricing to be a top priority than passing legislation to improve infrastructure or to address the prescription painkiller epidemic, among other things.

Effectively addressing drug pricing is a complex task that will require the diligent efforts of many actors. On October 24, the Petrie-Flom Center devoted a full day of programming to this important and timely topic. What I want to make here is a simple point: the very discussion of potential solutions can itself play a role in turning creative innovations into implementable solutions.


The leaked HHS memorandum and transgender health rights

At the end of last month, the New York Times reported on a leaked internal memorandum from the Department of Health and Human Services proposing to narrowly define “sex” as “biological sex,” a move intended to exclude transgender people from a variety of civil rights protections.

The memorandum stirred concerns about the future of Section 1557 of the Affordable Care Act, which provides an anti-discrimination cause of action in health care settings and has been the basis of a number of private lawsuits by transgender patients. The memorandum signals that the Trump administration plans to reinterpret Section 1557 to stem this litigation.


Congress’s opioids package and the politics of the IMD exclusion

At the end of September, the Senate passed a final version of an expansive legislative package designed to tackle the United States opioid epidemic. The package contains a broad range of policy approaches to the crisis, including enhanced tracking of fentanyl in the U.S. mail system, improved access to Medication Assisted Treatment and addiction specialists, and lifted restrictions on telemedicine and inpatient addiction treatment. The package, which now sits on President Trump’s desk, is widely expected to be signed into law.

The legislative effort has been lauded as a rare act of bipartisanship in an otherwise-polarized Washington.

The Washington Post called the set of bills “one of the only major pieces of legislation Congress is expected to pass this year.” A Time headline declared that “Opioid Bill Shows Congress Can Still Work Together.” Praise of this across-the-aisle effort is merited: the Senate voted for the set of bills 98-1, and the House voted for it 393-8.

Critics have rightly pointed out that the package does not provide enough new spending to address the crisis. Still, with more than 72,000 people dying from drug overdoses in 2017, the time is ripe for a collaborative approach to the opioid epidemic, and any effort helps.



Artificial Intelligence for Suicide Prediction

Suicide is a global problem that causes 800,000 deaths per year. In the United States, suicide rates rose by 25 percent over the past two decades, and suicide now kills 45,000 Americans each year, more than either auto accidents or homicides.

Traditional methods of predicting suicide, such as questionnaires administered by doctors, are notoriously inaccurate. Hoping to save lives by predicting suicide more accurately, hospitals, governments, and internet companies are developing artificial intelligence (AI)-based prediction tools. This essay analyzes the under-explored risks these systems pose to safety, privacy, and autonomy.

Two parallel tracks of AI-based suicide prediction have emerged.

The first, which I call “medical suicide prediction,” uses AI to analyze patient records. Medical suicide prediction is not yet widely used, aside from one program at the Department of Veterans Affairs (VA). Because medical suicide prediction occurs within the healthcare context, it is subject to federal laws, such as HIPAA, which protects the privacy and security of patient information, and the Federal Common Rule, which protects human research subjects.

My focus here is on the second track of AI-based suicide prediction, which I call “social suicide prediction.” Essentially unregulated, it relies on behavioral data mined from consumers’ digital interactions. The companies involved, which include large internet platforms such as Facebook and Twitter, are generally not subject to HIPAA’s privacy regulations, principles of medical ethics, or rules governing research on human subjects.



Burnout and Moral Distress in Nurses: Can Staffing Numbers Increase Patient Safety?

I know Nurse X only by her failures the night a young woman with asthma died gasping for breath just steps from the emergency entrance of Somerville Hospital. The preventable nature of the woman’s death, and the discovery of that hard truth by her husband, are described thoroughly and compellingly in Sunday’s Boston Globe magazine.

This death was the result of medical error, which is estimated to be the third leading cause of death in the United States, behind heart disease and cancer.

But the blurry image of Nurse X, standing in the ER doorway and failing to see the dying woman in the shadows steps away, is for me a snapshot of burnout. I’ll carry it with me to the voting booth on Tuesday when I stare at Question 1, the Massachusetts ballot measure that could determine, and lock into place, nurse-to-patient staffing levels.

Wendy Parmet on ‘The Week in Health Law’ Podcast

If you listened to the last episode of TWIHL, you may recall that it was recorded early on October 26, 2018, just before I kicked off our conference on the intersection of health care and immigration policy.

Thereafter, it was my distinct pleasure to welcome Wendy Parmet as our keynote speaker.

Professor Parmet is the Matthews Distinguished University Professor of Law and Director of the Center for Health Policy and Law at Northeastern University, as well as Professor of Public Policy and Urban Affairs in Northeastern’s School of Public Policy and Urban Affairs. With Patricia Illingworth, she authored The Health of Newcomers: Immigration, Health Policy, and the Case for Global Solidarity, published last year by NYU Press. Wendy was most generous in letting me share her compelling talk here on TWIHL.

Commentary: Do We Really Need a New, More Powerful Opioid?

By Ron Litman

The FDA’s Anesthetic and Analgesic Drug Products Advisory Committee (AADPAC), of which I am a member, met on October 12 to discuss a controversial New Drug Application (NDA) for a powerful opioid called sufentanil, manufactured by AcelRx.

Like fentanyl, sufentanil is a short-acting synthetic opioid, but it is approximately 5 to 10 times more potent. In the midst of the current opioid crisis, why would anyone think that the availability of another powerful opioid is a good idea?



No room for complacency in patient safety in the NHS

Matt Hancock, the recently appointed Health and Social Care Secretary, made a keynote speech on patient safety in London recently. The speech spelled out the future direction of NHS (National Health Service) patient safety policy development in England and contained some very useful observations and policies that are relevant to patient safety policy developers globally, as well as in England.
