A doctor in Mtimbwani, Tanzania, helps a woman and child.

Two Reasons Why Wealthy Nations Ought to Address Medical Brain Drain

African governments spend millions of dollars every year training physicians who then leave their home countries to live and work in wealthier nations. As a result, countries like Ethiopia, Kenya, and Sierra Leone now have more of their native physicians in the United States and Europe than at home. This massive movement of physicians has likely contributed to health crises in many African nations, where citizens die each year of easily curable diseases.


Hands texting on a smartphone.

Artificial Intelligence for Suicide Prediction

Suicide is a global problem, causing 800,000 deaths per year worldwide. In the United States, suicide rates have risen by 25 percent over the past two decades, and suicide now kills 45,000 Americans each year, more than either auto accidents or homicides.

Traditional methods of predicting suicide, such as questionnaires administered by doctors, are notoriously inaccurate. Hoping to save lives by predicting suicide more accurately, hospitals, governments, and internet companies are developing artificial intelligence (AI)-based prediction tools. This essay analyzes the under-explored risks these systems pose to safety, privacy, and autonomy.

Two parallel tracks of AI-based suicide prediction have emerged.

The first, which I call “medical suicide prediction,” uses AI to analyze patient records. Medical suicide prediction is not yet widely used, aside from one program at the Department of Veterans Affairs (VA). Because medical suicide prediction occurs within the healthcare context, it is subject to federal laws such as HIPAA, which protects the privacy and security of patient information, and the federal Common Rule, which protects human research subjects.

My focus here is on the second track of AI-based suicide prediction, which I call “social suicide prediction.” Social suicide prediction uses behavioral data mined from consumers’ digital interactions, and it is essentially unregulated: the companies involved, which include large internet platforms such as Facebook and Twitter, are generally not subject to HIPAA’s privacy regulations, principles of medical ethics, or rules governing research on human subjects.


Commentary: Do We Really Need a New, More Powerful Opioid?

By Ron Litman

The FDA’s Anesthetic and Analgesic Drug Products Advisory Committee (AADPAC), of which I am a member, met on October 12 to discuss a controversial New Drug Application (NDA) for a powerful opioid called sufentanil, manufactured by AcelRx.

Like fentanyl, sufentanil is a short-acting synthetic opioid, but it is approximately 5 to 10 times more potent. In the midst of the current opioid crisis, why would anyone think that the availability of another powerful opioid is a good idea?


A doctor wearing gloves holds a sperm sample in a test tube while writing on a clipboard.

When Fertility Doctors Use Their Own Sperm, and Families Don’t Find Out for Decades

An Idaho U.S. District Court ruled this week that parents may provisionally sue the fertility doctor who, in 1980, used his own sperm to conceive their daughter, so long as their claims are not barred by the many years that have passed since the alleged misconduct, which DNA tests substantiate. The daughter, now almost 40, discovered the fraud when she tested her ancestry with a mail-order DNA kit.

The facts are scandalous—but not unique. A handful of similar cases have recently come to light.
