Post-Dobbs, the fear is visceral. What was once personal, private, and, one hoped, protected within the presumptively confidential space of the doctor-patient relationship, feels exposed. In response to all this fear, the Internet exploded: delete your period tracker; use encrypted apps; don't take a pregnancy test. The Biden administration, too, chimed in just days after the Supreme Court's decision, issuing guidance seeking to reassure both doctors and patients that the federal HIPAA Privacy Rule was robust and that reproductive health information would remain private. Given the history of women being prosecuted for their reproductive choices and the enormous holes in HIPAA that have long allowed prosecutors to rely on health care information as the basis for criminal charges, these assurances rang hollow (as detailed at length in our forthcoming article, HIPAA v. Dobbs). From a health care policy perspective, what is different now is not what might happen; all of this has been happening for decades. The only difference today is the sheer number of people affected and paying attention.
By Katie Gu
In the post-Dobbs fight to safeguard reproductive healthcare, a new spotlight has been placed on two existing federal laws: the Health Insurance Portability and Accountability Act (HIPAA) and the Emergency Medical Treatment and Active Labor Act (EMTALA).
Guidance documents issued over the summer by federal agencies emphasize how these laws can be used to protect reproductive health privacy and access.
By Sara Gerke and Chloe Reichel
Direct-to-consumer (DTC) health apps, such as apps that manage our diet, fitness, and sleep, are becoming ubiquitous in our digital world.
These apps provide a window into some of the key issues in the world of digital health — including data privacy, data access, data ownership, bias, and the regulation of health technology.
To better understand these issues, and ways forward, we contacted key stakeholders representing a range of perspectives in the field of digital health for their brief answers to five questions about DTC health apps.
By Jayson Marwaha and Tiffany Tuedor
On April 5, 2021, a quiet but potentially transformative shift in patient access to data occurred.
Health systems are now required to provide patients with timely access to their own medical records, upon request. (Shockingly, this hasn’t been a requirement in the past.)
By Wendy Netter Epstein and Charlotte Tschider
This May, Google announced a new partnership with national hospital chain HCA Healthcare to consolidate HCA’s digital health data from electronic medical records and medical devices and store it in Google Cloud.
This move is just the latest in a growing trend: in the first half of this year alone, at least 38 partnerships between providers and big tech have been announced. Health systems are hoping to leverage the know-how of tech titans to unlock the potential of their treasure troves of data.
Health systems have faltered in achieving this on their own, facing, on the one hand, technical and practical challenges, and, on the other, political and ethical concerns.
By Robert I. Field, Anthony W. Orlando, and Arnold J. Rosoff
Large genetic databases pose well-known privacy risks. Unauthorized disclosure of an individual’s data can lead to discrimination, public embarrassment, and unwanted revelation of family secrets. Data leaks are of increasing concern as technology for reidentifying anonymous genomes continues to advance.
Yet, with the exception of California and Virginia, state legislative attempts to protect data privacy, most recently in Florida, Oklahoma, and Wisconsin, have failed to garner widespread support. Political resistance is particularly stiff with respect to a private right of action. Therefore, we propose a federal regulatory approach, which we describe below.
By Vrushab Gowda
As digital health products proliferate, app developers, hardware manufacturers, and other entities that fall outside Health Insurance Portability and Accountability Act (HIPAA) regulation are collecting vast amounts of biometric information. This burgeoning market has spurred patient privacy and data stewardship concerns.
To this end, two policy nonprofits – the Center for Democracy and Technology (CDT) and the eHealth Initiative (eHI) – earlier this month jointly published a document detailing self-regulatory guidelines for industry. The following piece traces the development of the “Proposed Consumer Privacy Framework for Health Data,” provides an overview of its provisions, and offers critical analysis.
By Catherine A. Brownstein and Joseph Gonzalez-Heydrich
Given the potential sensitivities associated with describing (i.e., phenotyping) patients with potentially stigmatizing psychiatric diagnoses, it is important to acknowledge and respect the wishes of the various parties involved.
The phenotypic description and depiction of a patient in the literature, although deidentified, may still have a great impact on a family.
By way of example, a novel genetic variant was identified as a likely explanation for the clinical presentation of a patient in a large cohort of individuals with neurodevelopmental and/or psychiatric phenotypes, a finding of great medical interest. The research team elected to further study this candidate and collected samples for functional evaluation of the gene variant and preparation of a case report.
Because the patient had a complicated phenotype, several physicians from various specialties were involved in the patient’s care. The paper draft was circulated amongst the collaborating clinicians and researchers and ultimately shared with the patient’s family by one of their involved caregivers. This is typically not a requirement of such studies, as the informed consent process includes the subjects’ understanding and consent for dissemination of deidentified results in the scientific literature. But as a general practice, families are informed about manuscripts in process, and in this case the family had requested to be kept abreast of ongoing developments.
By Carmel Shachar
As digital phenotyping technology is developed and deployed, clinical teams will need to carefully consider when it is appropriate to leverage artificial intelligence or machine learning, versus when a more human touch is needed.
Digital phenotyping seeks to utilize the rivers of data we generate to better diagnose and treat medical conditions, especially mental health ones, such as bipolar disorder and schizophrenia. The amount of data potentially available, however, is at once both digital phenotyping’s greatest strength and a significant challenge.
For example, the average smartphone user spends 2.25 hours a day using the 60-90 apps installed on their phone. Setting aside all other data streams, such as medical scans, how should clinicians sort through the data generated by smartphone use to arrive at something meaningful? When dealing with this quantity of data generated by each patient or research subject, how does the care team ensure that they do not miss important predictors of health?
By Leslie Francis
In a caustic opinion issued on January 14, the Fifth Circuit vacated penalties assessed by the U.S. Department of Health and Human Services (HHS) against the University of Texas M.D. Anderson Cancer Center for HIPAA security breaches.
As has happened at many other health care entities, some M.D. Anderson employees were not careful with their laptops and thumb drives (and the data therein). A laptop containing the unencrypted protected health information of nearly 30,000 patients was stolen. Unencrypted thumb drives with information on almost 6,000 more patients were lost. M.D. Anderson disclosed the security breaches to HHS, which assessed civil monetary penalties for violation of HIPAA's encryption and disclosure rules. M.D. Anderson then filed a petition for review, which resulted in the Fifth Circuit holding that the agency action was arbitrary and capricious for failure to consider an important aspect of the problem.
Commentators have already pointed out that this decision will reverberate throughout the HIPAA enforcement world. As it does, I hope it is met with scorn, for it trades on the informal logical fallacy of the false dilemma in two noteworthy ways.