Healthcare workers carrying signs protest for improved COVID-19 testing and workplace safety policies outside of UCLA Medical Center in Los Angeles, Dec. 9, 2020.

Beyond 20/20: The Post-COVID Future of Health Care

By Cynthia Orofo

There are two experiences I will never forget as a nurse: the first time I had to withdraw care from a patient and the first day working on a COVID ICU.

Both were unforgiving reminders that the ICU is a demanding place to work, one that will stress you in every way. But the latter experience was unique for a few reasons. Before the end of that first shift, I had overheard several staff members on the floor speak about their fears, their thoughts of the unknown, and their versions of the “new normal.” As I realized that life would almost certainly not be the same, I developed my own vision of the “new normal” of health care.


People sitting in a conference hall.

All-Male Panels, or ‘Manels,’ Must End

By Kelly Wright and Louise P. King

In this day and age, there is no room for all-male panels, or “manels,” as they are commonly known.

Yet, a quick search of Twitter for #manels or #allmalepanel reveals it remains the norm, with picture after picture of them occurring in a wide array of scientific and medical disciplines. Some try to excuse the error with a woman moderator – a “mom-erator” doing the “housekeeping” of managing the presentations. This is just as bad, if not worse.


Illustration of multicolored profiles with an overlay of strings of ones and zeroes.

Understanding Racial Bias in Medical AI Training Data

By Adriana Krasniansky

Interest in artificial intelligence (AI) in health care has grown at an astounding pace: the global AI health care market is expected to reach $17.8 billion by 2025, and AI-powered systems are being designed to support medical activities ranging from patient diagnosis and triage to drug pricing.

Yet, as researchers across technology and medical fields agree, “AI systems are only as good as the data we put into them.” When AI systems are trained on patient datasets that are incomplete, or that underrepresent or misrepresent certain populations, they stand to develop discriminatory biases in their outcomes. In this article, we present three examples that demonstrate the potential for racial bias in medical AI based on training data.
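The mechanism described above can be made concrete with a small, entirely synthetic sketch (not taken from the article): a toy diagnostic model is trained on data dominated by one patient group, and its decision rule then performs noticeably worse on an underrepresented group whose biomarker distribution differs. All names, numbers, and distributions here are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, healthy_mean, sick_mean):
    # Synthetic patients: one biomarker value each, half healthy (label 0),
    # half sick (label 1), drawn from group-specific normal distributions.
    x = np.concatenate([rng.normal(healthy_mean, 1.0, n // 2),
                        rng.normal(sick_mean, 1.0, n - n // 2)])
    y = np.concatenate([np.zeros(n // 2), np.ones(n - n // 2)])
    return x, y

# Group A dominates the training set (950 of 1,000 patients); group B's
# biomarker values run higher overall, so A's best cutoff is wrong for B.
xa, ya = make_group(950, healthy_mean=0.0, sick_mean=2.0)
xb, yb = make_group(50, healthy_mean=1.5, sick_mean=3.5)
x_train = np.concatenate([xa, xb])
y_train = np.concatenate([ya, yb])

# "Train" the simplest possible model: pick the single threshold that
# maximizes overall training accuracy (which mostly reflects group A).
candidates = np.linspace(-2.0, 5.0, 200)
accs = [np.mean((x_train > t) == y_train) for t in candidates]
threshold = candidates[int(np.argmax(accs))]

# Evaluate the learned threshold separately on fresh data from each group.
xa2, ya2 = make_group(1000, 0.0, 2.0)
xb2, yb2 = make_group(1000, 1.5, 3.5)
acc_a = np.mean((xa2 > threshold) == ya2)
acc_b = np.mean((xb2 > threshold) == yb2)
print(f"threshold={threshold:.2f}  accuracy on A={acc_a:.2f}  accuracy on B={acc_b:.2f}")
```

The model is not told anyone's race or group membership; the disparity emerges purely because the training data underrepresents group B, which is exactly the failure mode the article's examples describe in real clinical systems.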