
The Fourth Amendment and the Post-Roe Future of Privacy

By Katie Gu

An April 2021 data privacy bill sponsored by Senator Ron Wyden (D-OR) has taken on new urgency in the post-Roe Digital Age.

The bipartisan bill, the Fourth Amendment Is Not For Sale Act, would close the current legal loophole through which the FBI, the Department of Homeland Security, the Department of Defense, Customs and Border Protection, Immigration and Customs Enforcement, and the Internal Revenue Service have repeatedly purchased Americans’ personal and consumer information from data brokers.

In the wake of the recent Dobbs v. Jackson Women’s Health Organization decision, this bill may play an important role in protecting reproductive health data against government overreach and new forms of surveillance technology.

Read More


A Critical Analysis of the Eurocentric Response to COVID-19: Western Ideas of Health

By Hayley Evans

The international response to COVID-19 has paid insufficient attention to the realities in the Global South, making the response Eurocentric in several ways.

This series of blog posts looks at three aspects of the COVID-19 response that underscore this Eurocentrism. The first post in this series scrutinized the technification of the international response to COVID-19. This second post looks at how the international pandemic response reflects primarily Western ideas of health, which in turn exacerbates negative health outcomes in the Global South.

This series draws on primary research conducted remotely with diverse actors on the ground in Colombia, Nigeria, and the United Kingdom, as well as secondary research gathered through periodicals, webinars, an online course in contact tracing, and membership in the Ecological Rights Working Group of the Global Pandemic Network. I have written about previous findings from this work here.

Read More


A Critical Analysis of the Eurocentric Response to COVID-19: Data Colonialism

By Hayley Evans

The international response to COVID-19 has paid insufficient attention to realities in the Global South, making the response Eurocentric in several ways.

This series of blog posts looks at three aspects of the COVID-19 response that underscore this Eurocentrism. The first post in this series will scrutinize the digital aspect of the international response to COVID-19. In creating and promoting technological solutions that are impractical and ineffective in the Global South, this digital focus has afforded asymmetric protection to those located in the Global North.

This series draws on primary research conducted remotely with diverse actors on the ground in Colombia, Nigeria, and the United Kingdom, as well as secondary research gathered through periodicals, webinars, an online course in contact tracing, and membership in the Ecological Rights Working Group of the Global Pandemic Network. I have written about previous findings from this work here.

Read More


Exploring Elder Care Robotics: Voice Assistants and Home Hubs

This article is part of a four-part series that examines how robots are being developed for aging care and investigates their ethical implications. In our first article, we explored emotional companion robots, which soothe and comfort patients experiencing loneliness, depression, or diseases such as Alzheimer’s. Today, we look at voice assistants and home hubs: robots designed to coordinate and simplify daily tasks around the house.

What are Voice Assistants and Home Hubs?

Unlike the other robots in this series, voice assistants and home hubs are probably already familiar to you. These robots, which include Amazon Echo, Google Home, Apple Siri, Samsung Ballie, and Nest, respond to human commands (voice, motion, or direct input) to complete tasks like preheating the oven, playing a podcast, or refilling a prescription. Several devices also incorporate artificial intelligence (AI) to learn household patterns and anticipate needs. However, unlike social robots (covered later in this series), voice assistants do not proactively engage with users unless programmed or commanded.

Read More


The Right Lesson from the Google-Ascension Patient Privacy Story

By I. Glenn Cohen

As has been well reported in the media, there is a controversy brewing over nonprofit hospital chain Ascension sharing millions of patient records with Google for their project codenamed “Nightingale” (very Batman, if you ask me!). Most of the discussion so far (and the answers have not yet become pellucid) concerns whether the hospital and Google complied with HIPAA.


This is important, don’t get me wrong, but it is crucial that the conversation not ignore a more important question:

Read More


Reviewing Health Announcements at Google, Facebook, and Apple

By Adriana Krasniansky

Over the past several days, technology players Google, Apple, and Facebook have each reported health-related business news. In this blog post, we examine their announcements and identify emerging ethical questions in the digital health space.

On Nov. 1, Google announced plans to acquire smartwatch maker Fitbit for $2.1 billion, with the deal expected to close in 2020, subject to regulatory approval. The purchase is expected to jumpstart the production of Google’s own health wearables; the company has already invested at least $40 million in wearable research, absorbing watchmaker Fossil’s R&D technology in January 2019.

Read More


Is Data Sharing Caring Enough About Patient Privacy? Part I: The Background

By Timo Minssen (CeBIL, UCPH), Sara Gerke & Carmel Shachar

A recent US lawsuit highlights crucial challenges at the interface of data utility, patient privacy & data misuse

The huge prospects of artificial intelligence and machine learning (ML), as well as the increasing trend toward public-private partnerships in biomedical innovation, underscore the importance of effective governance and regulation of data sharing in the health and life sciences. Cutting-edge biomedical research depends on high-quality data to ensure safe and effective health products. It is often argued that greater access to individual patient data collections stored in hospitals’ medical records systems may considerably advance medical science and improve patient care. However, as public and private actors attempt to gain access to such high-quality data to train their advanced algorithms, a number of sensitive ethical and legal aspects also need to be carefully considered. Besides giving rise to safety, antitrust, trade secrets, and intellectual property issues, such practices have raised serious concerns about patient privacy, confidentiality, and the commitments made to patients via appropriate informed consent processes.

Read More


Data-driven Medicine Needs a New Profession: Health Information Counseling

By Barbara Prainsack, Alena Buyx, and Amelia Fiske

Have you ever clicked ‘I agree’ to share information about yourself on a health app on your smartphone? Wondered whether the results of a new therapy reported on a patient community website were accurate? Considered altering a medical device to better meet your own needs, but had doubts about how the changes might affect its function?

While these kinds of decisions are increasingly routine, there is no clear path for getting information about health-related devices, advice on what data to collect, guidance on how to evaluate medical information found online, or answers to the concerns one might have about sharing data on patient platforms.

It’s not only patients who are facing these questions in the age of big data in medicine. Clinicians are also increasingly confronted with diverse forms of molecular, genetic, lifestyle, and digital data, and often the quality, meaning, and actionability of these data are unclear.

The difficulties of interpreting unstructured data, such as symptom logs recorded on personal devices, add another layer of complexity for clinicians trying to decide which course of action would best meet their duty of beneficence and enable the best possible care for patients.

Read More

Elderly Care in the Age of Machine and Automation

By Aobo Dong

Would you be willing to accept a professional caregiving robot as a replacement for a human companion when your loved ones are far away? During last week’s HLS Health Law Workshop, Professor Belinda Bennett provided a great overview of the imminent age of machine and automation and the legal and ethical challenges the new era entails, especially in health care law and bioethics. After discussing three areas of potential health law complications, Professor Bennett argued that the field of health law is undergoing a transition from the “bio” to the “digital” or “auto,” and that instead of playing a catch-up game with rapidly evolving technologies, more focus should be placed on learning from past and existing laws and regulations in order to meet new demands from the “second machine age.” However, I wish to propose a closely related but alternative paradigm: using the issues raised by new technologies as a vehicle for improving existing laws and reshaping the social norms that once made those laws inadequate or flawed. I will elaborate on my point through Professor Bennett’s own example of elderly care.

Although Professor Bennett advocates a revisionist approach to thinking about health law and technology, her paradigm is still about laws serving the needs and addressing the concerns of the tech industry as it intersects with health care. I wonder whether it would be productive to view the issue from the opposite direction: how could new technologies and the challenges they raise inform us about existing laws, revealing blind spots or providing opportunities to improve unjust, unfair, or discriminatory laws? Viewed this way, we could not only strengthen connections between past laws and future technologies, but also be guided by a clearer sense of how future legal reforms and regulations could redress past neglect and meet new challenges.

Read More

Sharing Data for 21st Century Cures – Two Steps Forward…

By Mary A. Majumder, Christi J. Guerrini, Juli M. Bollinger, Robert Cook-Deegan, and Amy L. McGuire

The 21st Century Cures Act was passed with support from both sides of the aisle (imagine that!) and signed into law by then-President Obama late last year. This ambitious legislation drives action in areas as diverse as drug and device regulation and response to the opioid epidemic. It also tackles the issue of how to make data more broadly available for research use and clinical purposes. In our recently published GIM article, “Sharing data under the 21st Century Cures Act,” we examine the Act’s potential to facilitate data-sharing, in line with a recent position statement of the American College of Medical Genetics and Genomics. We highlight a number of provisions of the Act that either explicitly advance data-sharing or promote policy developments that have the potential to advance it. For example, Section 2014 of the Act authorizes the Director of the National Institutes of Health to require award recipients to share data, and Section 4006 requires the Secretary of Health and Human Services to promote policies ensuring that patients have access to their electronic health information and are supported in sharing this information with others.

Just as relevant, the Act takes steps to reduce some major barriers to data sharing. An important feature of the Act, which has not been extensively publicized, is its incorporation of provisions from legislation originally proposed by Senators Elizabeth Warren and Mike Enzi to protect the identifiable, sensitive information of research subjects. Senator Warren, in particular, has been a vocal advocate of data sharing. Arguably, one of the biggest barriers to sharing is public concern about privacy. The relevant provisions address this concern chiefly via Certificates of Confidentiality. Among other things, the Act makes issuance of Certificates automatic for federally funded research in which identifiable, sensitive information is collected and prohibits disclosure of identifiable, sensitive information by covered researchers, with only a few exceptions, such as disclosure for purposes of other research. These protections became effective June 11, 2017. While NIH has signaled its awareness of the Act, it has not yet updated its Certificates of Confidentiality webpage.

Read More