
Excluding Pregnant People From Clinical Trials Reduces Patient Safety and Autonomy

By Jenna Becker

The exclusion of pregnant people from clinical trials has led to inequities in health care during pregnancy. Without clinical data, pregnant patients lack the drug safety evidence available to most other patients. Further, denying access to clinical trials denies pregnant people autonomy in medical decision-making.

People still require pharmaceutical interventions during pregnancy. Until maternal health and autonomy are prioritized, pregnant people will be left to make medical decisions without real guidance.



Regulatory Gap in Health Tech: Resource Allocation Algorithms

By Jenna Becker

Hospitals use artificial intelligence and machine learning (AI/ML) not only in clinical decision-making, but also to allocate scarce resources.

These resource allocation algorithms have received less regulatory attention than clinical decision-making algorithms, but nevertheless pose similar concerns, particularly with respect to their potential for bias.

Without regulatory oversight, the risks associated with resource allocation algorithms are significant. Health systems must take particular care when implementing these solutions.



The Future of Race-Based Clinical Algorithms

By Jenna Becker

Race-based clinical algorithms are widely used. Yet many race-based adjustments lack supporting evidence and exacerbate racism in health care.

Prominent politicians have called for research into the use of race-based algorithms in clinical care as part of a larger effort to understand the public health impacts of structural racism. Physicians and researchers have called for an urgent reconsideration of the use of race in these algorithms. 

Efforts to remove race-based algorithms from practice have thus far been piecemeal. Medical associations, health systems, and policymakers must work in tandem to rapidly identify and remove racist algorithms from clinical practice.



Building Trust Through Transparency? FDA Regulation of AI/ML-Based Software

By Jenna Becker

To generate trust in artificial intelligence and machine learning (AI/ML)-based software used in health care, the U.S. Food and Drug Administration (FDA) intends to regulate this technology with an eye toward user transparency. 

But will transparency in health care AI actually build trust among users? Or will algorithm explanations go ignored? I argue that individual algorithm explanations will likely do little to build trust among health care AI users.



It’s Time to Update the ACA’s Anti-Discrimination Protections

By Jenna Becker

Assuming that the Affordable Care Act (ACA) withstands its most recent challenge in California v. Texas, the Biden administration should prioritize codifying clearer nondiscrimination standards as a future reform.

The ACA’s Section 1557, which provides anti-discrimination protections, has been fraught with challenges. The provision incorporates nondiscrimination protections from four separate civil rights statutes, and its vague language has allowed administrations to offer widely differing interpretations of health care anti-discrimination protections.

In a 2016 rule, the Obama administration interpreted Section 1557 broadly, including protections based on gender identity and sexual orientation, as well as specific language access requirements. Many of these protections were eliminated in a 2020 rule promulgated by the Trump administration.

It’s time to end these fluctuating standards. The Biden administration should work with Congress to add clearer nondiscrimination protections to the ACA.



Health Care AI in Pandemic Times

By Jenna Becker

The early days of the COVID-19 pandemic were met by the rapid rollout of artificial intelligence tools to diagnose the disease and identify patients at risk of worsening illness in health care settings.

Understandably, these tools generally were released without regulatory oversight, and some models were deployed prior to peer review. However, even after several months of ongoing use, several AI developers still have not shared their testing results for external review. 

This precedent set by the pandemic may have a lasting — and potentially harmful — impact on the oversight of health care AI.



Rise in Hospital Ransomware Attacks Requires Government Intervention

By Jenna Becker

Last week, widespread ransomware attacks against hospital systems forced several hospitals to go offline. 

Despite the growing risk of cyberattacks against hospitals, the health care industry has been left to address this issue on its own. Ransomware attacks, named for the ransom that this malicious software attempts to extract, can be very challenging to address, requiring complex cybersecurity protocols.

Unfortunately, many hospitals lack the resources and the time required to prevent this malware from spreading. The government has provided minimal resources to hospital systems looking to enhance their cybersecurity. Resource-strapped hospitals require significant government support to address the growing threat of ransomware.



Sexual Orientation and Gender Identity in Medical Records Can Reduce Disparities

By Jenna Becker

Sexual orientation and gender identity (SOGI) data is widely considered crucial to providing competent care to LGBTQ+ patients. This data can also be used to reduce health disparities among sexual and gender minority populations.

Most electronic health record (EHR) systems can document SOGI data, and many health care systems across the country have been collecting SOGI information for several years. However, SOGI documentation is not broadly required. It’s time to require SOGI data collection in EHRs nationwide.



Is Real-World Health Algorithm Review Worth the Hassle?

By Jenna Becker

The U.S. Food and Drug Administration (FDA) should not delay its plans to regulate clinical algorithms, despite challenges associated with reviewing the real-world performance of these products.

The FDA Software Pre-Certification (Pre-Cert) Pilot Program was designed to provide “streamlined and efficient” regulatory oversight of Software as a Medical Device (SaMD) — software products that the FDA can regulate as medical devices. The Pre-Cert program, in its pilot phase, is intended to inform the development of a future SaMD regulatory model.

Last month, the FDA released an update on Pre-Cert, highlighting lessons learned from pilot testing and next steps for developing the program. One key lesson learned was the difficulty in identifying and obtaining the real-world performance data needed to analyze the clinical effectiveness of SaMDs in practice. Although this challenge will be difficult to overcome in the near future, the FDA’s plans to regulate should not be slowed by insufficient postmarket data.



Insufficient Protections for Health Data Privacy: Lessons from Dinerstein v. Google

By Jenna Becker

A data privacy lawsuit against the University of Chicago Medical Center and Google was recently dismissed, demonstrating the difficulty of pursuing claims against hospitals that share patient data with tech companies.

Patient data sharing between health systems and large software companies is becoming increasingly common as these organizations chase the potential of artificial intelligence and machine learning in health care. However, many tech firms also own troves of consumer data, and these companies may be able to match “de-identified” patient records with a patient’s identity.

Scholars, privacy advocates, and lawmakers have argued that HIPAA is inadequate in the current landscape. Dinerstein v. Google is a clear reminder that both HIPAA and contract law are insufficient for handling these types of privacy violations. Patients are left seemingly defenseless against their most personal information being shared without their meaningful consent.
