
AI in Digital Health: Autonomy, Governance, and Privacy

The following post is adapted from the edited volume AI in eHealth: Human Autonomy, Data Governance and Privacy in Healthcare.

By Marcelo Corrales Compagnucci and Mark Fenwick

The emergence of digital platforms and related technologies is transforming healthcare, creating new opportunities and challenges for all stakeholders in the medical space. Many of these developments rely on data and AI algorithms to prevent, diagnose, treat, and monitor epidemic diseases, including the ongoing COVID-19 pandemic and other pathogenic outbreaks. However, these opportunities and challenges are often complex and multidimensional, and any mapping of this emerging ecosystem requires a greater degree of interdisciplinary dialogue and a more nuanced appreciation of the normative and cognitive complexity of these issues.

Read More


The International Weaponization of Health Data

By Matthew Chun

International collaboration through the sharing of health data is crucial for advancing human health. But it also comes with risks — risks that countries around the world seem increasingly unwilling to take.

On the one hand, the international sharing of health-related data sets has paved the way for important advances such as mapping the human genome, tracking global health outcomes, and fighting the rise of multidrug-resistant superbugs. On the other hand, it can pose serious risks for a nation’s citizens, including re-identification, exploitation of genetic vulnerabilities by foreign parties, and unauthorized data usage. As countries aim to strike a difficult balance between furthering research and protecting national interests, recent trends indicate a shift toward tighter controls that could chill international collaborations.

Read More


Who Owns the Data Collected by Direct-to-Consumer Health Apps?

By Sara Gerke and Chloe Reichel

Who owns the data that are collected via direct-to-consumer (DTC) health apps? Who should own those data?

We asked our respondents to answer these questions in the third installment of our In Focus Series on Direct-to-Consumer Health Apps. Learn about the respondents and their views on data privacy concerns in the first installment of this series, and read their thoughts on consumer access to DTC health app data in the second installment.

Read More


Should Users Have Access to Data Collected by Direct-to-Consumer Health Apps?

By Sara Gerke and Chloe Reichel

Should consumers have access to the data (including the raw data) that are collected via direct-to-consumer (DTC) health apps? What real-world challenges might access to these data introduce, and how might they be addressed?

In this second installment of our In Focus Series on Direct-to-Consumer Health Apps, that’s what we asked our respondents. Learn about the respondents and their views on data privacy concerns in the first installment of this series. Read on for their thoughts on whether and how consumers should gain access to the data that direct-to-consumer health apps collect.

Read More


Perspectives on Data Privacy for Direct-to-Consumer Health Apps

By Sara Gerke and Chloe Reichel

Direct-to-consumer (DTC) health apps, such as apps that manage our diet, fitness, and sleep, are becoming ubiquitous in our digital world.

These apps provide a window into some of the key issues in the world of digital health — including data privacy, data access, data ownership, bias, and the regulation of health technology.

To better understand these issues, and ways forward, we contacted key stakeholders representing a range of perspectives in the field of digital health for their brief answers to five questions about DTC health apps.

Read More


A Proposal for Localized Review to Safeguard Genetic Database Privacy

By Robert I. Field, Anthony W. Orlando, and Arnold J. Rosoff

Large genetic databases pose well-known privacy risks. Unauthorized disclosure of an individual’s data can lead to discrimination, public embarrassment, and unwanted revelation of family secrets. Data leaks are of increasing concern as technology for reidentifying anonymized genomes continues to advance.

Yet, with the exception of California and Virginia, state legislative attempts to protect data privacy, most recently in Florida, Oklahoma, and Wisconsin, have failed to garner widespread support. Political resistance is particularly stiff with respect to a private right of action. Therefore, we propose a federal regulatory approach, which we describe below.

Read More


The Promise and Pitfalls of China’s QR Codes as Health Certificates

This article is adapted from a longer paper published in the Digest section of the Harvard Journal of Law & Technology (JOLT). To access the original paper, please visit JOLT.

By April Xiaoyi Xu

At this point in the COVID-19 pandemic, China has successfully managed to contain the spread of the virus, due in large part to its technological strategy, which uses QR codes as a kind of health certificate.

These color-coded QR codes are automatically generated from cell phone data. Green indicates that an individual is healthy and can move freely, yellow signals that the user must quarantine for up to seven days, and red signals a mandatory fourteen-day quarantine. The basis for these determinations, as well as the extent of the data collected to make them, remains opaque.
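The actual decision logic is not public, but as a purely illustrative sketch, a rule-based classifier of this kind might look something like the following. Everything here — the `PhoneSignals` fields, the `assign_health_code` rules, and the priority ordering — is a hypothetical assumption for illustration, not the actual Chinese system:

```python
from dataclasses import dataclass
from enum import Enum


class HealthCode(Enum):
    """The three color-coded statuses described in the post."""
    GREEN = "free movement"
    YELLOW = "quarantine up to 7 days"
    RED = "quarantine 14 days"


@dataclass
class PhoneSignals:
    """Hypothetical inputs derived from cell phone data."""
    confirmed_case_contact: bool   # close contact with a confirmed case
    visited_outbreak_area: bool    # recent travel to a high-risk region
    self_reported_symptoms: bool   # symptoms entered by the user


def assign_health_code(signals: PhoneSignals) -> HealthCode:
    """Illustrative rule-based mapping from signals to a color code.

    The real criteria are opaque, as the post notes; these rules are
    invented solely to show the general shape of such a classifier.
    """
    if signals.confirmed_case_contact:
        return HealthCode.RED
    if signals.visited_outbreak_area or signals.self_reported_symptoms:
        return HealthCode.YELLOW
    return HealthCode.GREEN


# Example: a user who recently visited a high-risk area gets a yellow code.
print(assign_health_code(PhoneSignals(False, True, False)))  # HealthCode.YELLOW
```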

Read More


Beyond HIPAA: A Proposed Self-Policing Framework for Digital Health Products

By Vrushab Gowda

As digital health products proliferate, app developers, hardware manufacturers, and other entities that fall outside the scope of the Health Insurance Portability and Accountability Act (HIPAA) are collecting vast amounts of biometric information. This burgeoning market has spurred concerns about patient privacy and data stewardship.

To address these concerns, two policy nonprofits, the Center for Democracy & Technology (CDT) and the eHealth Initiative (eHI), earlier this month jointly published a document detailing self-regulatory guidelines for industry. The following piece traces the development of the “Proposed Consumer Privacy Framework for Health Data,” provides an overview of its provisions, and offers critical analysis.

Read More


Deep Phenotyping Could Help Solve the Mental Health Care Crisis

By Justin T. Baker

The United States faces a growing mental health crisis and offers insufficient means for individuals to access care.

Digital technologies — the phone in your pocket, the camera-enabled display on your desk, the “smart” watch on your wrist, and the smart speakers in your home — might offer a path forward.

Deploying technology ethically, while understanding the risks of moving too fast (or too slow), could radically extend our limited toolkit for providing access to high-quality care to the many individuals affected by mental health issues, for whom the current system is either out of reach or failing to meet their needs.

Read More


Incidental Findings in Deep Phenotyping Research: Legal and Ethical Considerations

By Amanda Kim, M.D., J.D.; Michael Hsu, M.D.; Amanda Koire, M.D., Ph.D.; and Matthew L. Baum, M.D., Ph.D., D.Phil.

What obligations do researchers have to disclose potentially life-altering incidental findings (IFs) as they happen in real time?

Deep phenotyping research in psychiatry integrates an individual’s real-time digital footprint (e.g., texts, GPS, wearable data) with their biomedical data (e.g., genetic, imaging, other biomarkers) to discover clinically relevant patterns, usually with the aid of machine learning. Findings that are incidental to the study’s objectives, but that may be of great importance to participants, will inevitably arise in deep phenotyping research.
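As a rough, hypothetical illustration of what integrating a digital footprint with biomedical data might look like computationally, consider the minimal sketch below. The feature names, synthetic values, and the choice of logistic regression are assumptions made for illustration, not the methods of any particular study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-participant features fusing a digital footprint
# (texting, GPS, wearable signals) with biomedical data (a biomarker).
# Columns: texts_per_day, hours_away_from_home, resting_heart_rate, biomarker_level
X = np.array([
    [42.0, 6.5, 61.0, 0.8],
    [ 3.0, 0.5, 88.0, 2.1],
    [25.0, 4.0, 70.0, 1.0],
    [ 5.0, 1.0, 90.0, 2.4],
])
y = np.array([0, 1, 0, 1])  # 1 = relapse during study window (synthetic labels)

# A simple classifier standing in for the machine-learning step the
# excerpt describes; real studies would use richer temporal models.
model = LogisticRegression().fit(X, y)

# Estimated relapse risk for a new, hypothetical participant.
print(model.predict_proba([[10.0, 2.0, 80.0, 1.8]])[0, 1])
```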

The legal and ethical questions these IFs introduce are fraught. Consider the three hypothetical cases below, each involving an individual who enrolls in a deep phenotyping research study designed to identify factors affecting the risk of substance use relapse or overdose:

A 51-year-old woman with alcohol use disorder (AUD) is six months into sobriety. She is intrigued to learn that the study algorithm will track her proximity to some of her known triggers for alcohol relapse (e.g., bars, liquor stores), and asks to be warned with a text message when nearby so she can take an alternative route. Should the researchers share that data?

A 26-year-old man with AUD is two years into sobriety. Three weeks into the study, he relapses. He begins arriving at work inebriated and loses his job. After the study is over, he realizes the researchers may have been able to tell from his alcohol use surveys, disorganized text messages, GPS tracking, and sensor data that he had been inebriated at work, and wishes someone had reached out to him before he lost his job. Should they have?

A 35-year-old man with severe opioid use disorder experiences a near-fatal overdose and is discharged from the hospital. Two weeks later, his smartphone GPS is in the same location as his last overdose, and his wearable detects that his respiratory rate has plummeted. Should researchers call EMS?

Read More
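To make concrete the kind of automated trigger at issue in the third case above, here is a minimal hypothetical sketch. The 50-meter radius, the 8-breaths-per-minute threshold, and the `haversine_m` helper are all invented for illustration; no study cited here discloses its actual alert logic:

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def should_alert(current_fix, last_overdose_fix, respiratory_rate_bpm):
    """Hypothetical trigger: near the prior overdose site AND breathing
    has slowed below a danger threshold (both cutoffs are illustrative)."""
    near_site = haversine_m(*current_fix, *last_overdose_fix) < 50  # meters
    depressed_breathing = respiratory_rate_bpm < 8  # breaths per minute
    return near_site and depressed_breathing


# Example: same location as the last overdose, respiratory rate of 6/min.
print(should_alert((42.3601, -71.0589), (42.3601, -71.0589), 6))  # True
```

Whether firing such a trigger obligates researchers to act is precisely the ethical question the post raises; the code only shows how mechanically simple the detection itself could be.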