Telehealth Policy Brought to the Fore in the COVID-19 Pandemic

By Vrushab Gowda

The COVID-19 pandemic has highlighted the value of telehealth both as a tool of necessity (e.g., minimizing infection risk, conserving thinly stretched healthcare resources, reducing cost) and as a tool of innovation.

Telehealth services have surged in recent months; in April alone, they constituted over 40 percent of primary care visits nationwide and over 73 percent of those in Boston. “Increasing Access to Care: Telehealth during COVID-19,” a recent publication in the Journal of Law and the Biosciences, dissects the issues that have accompanied the growth of telehealth and identifies further areas of potential reform.

Patient-Directed Uses vs. The Platform

By Adrian Gropper, MD

This post originally appeared on The Health Care Blog.

This piece is part of the series “The Health Data Goldilocks Dilemma: Sharing? Privacy? Both?” which explores whether it’s possible to advance interoperability while maintaining privacy. Check out other pieces in the series here.

It’s 2023. Alice, a patient at Ascension Seton Medical Center Austin, decides to get a second opinion at Mayo Clinic. She’s heard great things about Mayo’s collaboration with Google, which everyone calls “The Platform.” Alice is worried, and she’s hoping that Mayo’s version of Dr. Google says something more than Ascension’s version of Dr. Google. Is her Ascension doctor also using The Platform?

Alice makes an appointment in the breast cancer practice using the Mayo patient portal. Mayo asks permission to access her health records, and Alice is offered two choices: one relies on HIPAA and does not require her consent; the other is under her control.

A Delicate Balance: Proposed Regulations May Upset the Tension between Accessibility and Privacy of Health Information

This piece was part of a symposium featuring commentary from participants in the Center for Health Policy and Law’s annual conference, Promises and Perils of Emerging Health Innovations, held on April 11-12, 2019 at Northeastern University School of Law. The symposium was originally posted through the Northeastern University Law Review Online Forum.

Promises and Perils of Emerging Health Innovations Blog Symposium

Throughout the two-day conference, speakers and attendees discussed how innovations, including artificial intelligence, robotics, mobile technology, gene therapies, pharmaceuticals, big data analytics, tele- and virtual health care delivery, and new models of delivery, such as accountable care organizations (ACOs), retail clinics, and medical-legal partnerships (MLPs), have entered and changed the healthcare market. More dramatic innovations and market disruptions are likely in the years to come. These new technologies and market disruptions offer immense promise to advance health care quality and efficiency, and improve provider and patient engagement. Success will depend, however, on careful consideration of potential perils and well-planned interventions to ensure new methods ultimately further, rather than diminish, the health of patients, especially those who are the most vulnerable.

In his piece for the Promises and Perils of Emerging Health Innovations blog symposium, Oliver Kim emphasizes the important role trust plays in the provider-patient relationship. Kim unpacks the challenges that come with introducing and incorporating new health technology, and cautions that bringing third parties into that relationship risks eroding trust.

Diving Deeper into Amazon Alexa’s HIPAA Compliance

By Adriana Krasniansky

Earlier this year, consumer technology company Amazon made waves in health care when it announced that its Alexa Skills Kit, a suite of tools for building voice programs, would be HIPAA compliant. Using the Alexa Skills Kit, companies could build voice experiences for Amazon Echo devices that communicate personal health information with patients. 

Amazon initially limited access to its HIPAA-updated voice platform to six health care companies, ranging from pharmacy benefit managers (PBMs) to hospitals. However, Amazon plans to expand access and has identified health care as a top focus area. Given Thursday’s announcement of new Alexa-enabled wearables (earbuds, glasses, a biometric ring)—likely indicators of upcoming personal health applications—let’s dive deeper into Alexa’s HIPAA compliance and its implications for the health care industry.

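To make concrete what “building a voice experience” with the Alexa Skills Kit involves, here is a minimal sketch of a skill request handler written with the ask-sdk-core Python SDK. The intent name (RefillStatusIntent), the spoken response, and the Lambda deployment are illustrative assumptions rather than details of Amazon’s health care program; a real skill exchanging personal health information would also need to operate within Amazon’s HIPAA-eligible environment under a business associate agreement.

```python
# Minimal sketch of an Alexa skill handler (assumed names; not Amazon's
# actual health skills). Requires the ask-sdk-core package.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_core.utils import is_intent_name
from ask_sdk_model import Response


class RefillStatusHandler(AbstractRequestHandler):
    """Handles a hypothetical 'RefillStatusIntent' spoken request."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("RefillStatusIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # In a real HIPAA-eligible skill, protected health information would
        # be fetched here from the provider's backend over an authenticated,
        # encrypted channel; this sketch just returns a canned response.
        speech = "Your prescription refill is ready for pickup."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(RefillStatusHandler())

# Entry point when the skill is deployed as an AWS Lambda function.
handler = sb.lambda_handler()
```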

Do You Know the Terms and Conditions of Your Health Apps? HIPAA, Privacy and the Growth of Digital Health

As more health care is provided virtually through apps and web-based services, we need to take a closer look at whether users are fully aware of what they are consenting to with respect to their health information.

How health apps obtain consent needs to be re-evaluated. At the same time, digital health offers an important opportunity to strengthen privacy practices on digital platforms, and we ought to seize it.

DNA Donors Must Demand Stronger Privacy Protection

By Mason Marks and Tiffany Li

An earlier version of this article was published in STAT.

The National Institutes of Health wants your DNA, and the DNA of one million other Americans, for an ambitious project called All of Us. Its goal — to “uncover paths toward delivering precision medicine” — is a good one. But until it can safeguard participants’ sensitive genetic information, you should decline the invitation to join unless you fully understand and accept the risks.

DNA databases like All of Us could yield valuable medical breakthroughs, such as identifying new disease risk factors and potential drug targets. But these benefits could come at a high price: increased risk to individuals’ genetic data privacy, which current U.S. laws do not adequately protect.

HIPAA and the Physician-Patient Privilege: Can Doctors Defending Against a Medical Malpractice Suit Carry Out Ex Parte Interviews with the Plaintiff’s Treating Physicians?

By Alex Stein

Whether a litigant’s right to conduct informal ex parte interviews with fact witnesses extends to the plaintiff’s treating physicians, given the confidentiality provisions of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), is a question of considerable practical importance. This question has recently received an affirmative answer from the Kentucky Supreme Court in Caldwell v. Chauvin, — S.W.3d —, 2015 WL 3653447 (Ky. 2015), after “percolating through state courts, federal district courts, and academic circles for a decade.” Id. at *5.

Ebola and Privacy

By Michele Goodwin

As the nation braces for the possibility of more Ebola cases, civil liberties, including patient privacy, should be considered. As news media feature headline-grabbing stories about quarantines, let’s think about the laws governing privacy in healthcare. Despite federal laws enacted to protect patient privacy, the Ebola scare throws into sharp relief both the vulnerability of individuals and the limits of the regulations intended to help them.

In 1996, Congress enacted the Health Insurance Portability and Accountability Act (HIPAA) to protect patient privacy. Specifically, HIPAA’s Privacy Rule requires that healthcare providers and their business associates restrict access to patients’ health care information. For many years, the law has been regarded as the strongest federal statement regarding patient privacy. But it may be tested in the wake of the Ebola scare, with patients’ names, photographs, and even family information entering the public sphere.

Ebola hysteria raises questions not only about how to contain the disease, but also about the extent to which Americans value their healthcare privacy. What liberties are Americans willing to sacrifice to calm their fears? How should we balance the concern for public welfare with legal and ethical privacy principles? For example, will Americans tolerate profiling travelers based on their race or national origin as a precautionary measure? What type of reporting norms should govern Ebola cases? Should reporting the existence of an Ebola case also include disclosing the name of the patient? I don’t think so, but the jury appears to be out for many.

Apple’s mHealth Rules Fear to Tread Where Our Privacy Laws Fall Short

By Nicolas Terry

On September 9, Apple is hosting its ‘Wish We Could Say More’ event. In the interim we will be deluged with usually uninformed speculation about the new iPhone, an iWatch wearable, and who knows what else. What we do know, because Apple announced it back in June, is that iOS 8, Apple’s mobile operating system, will include an app called ‘Health’ (backed by a ‘HealthKit’ API) that will aggregate health and fitness data from the iPhone’s own internal sensors, third-party wearables, and EMRs.

What has been less than clear is how the privacy of this data is to be protected. There is some low-hanging legal fruit. For example, when Apple partners with the Mayo Clinic or EMR manufacturers to make EMR data available from covered entities, they are squarely within the HIPAA Privacy and Security Rules, triggering the requirements for Business Associate Agreements, etc.

But what of the health data collected by the Apple health data aggregator or other apps that lies outside of protected HIPAA space? Fitness and health data picked up by apps and stored on the phone or on an app developer’s analytic cloud fails the HIPAA applicability test, yet may be as sensitive as anything stored on a hospital server (as I have argued elsewhere). HIPAA may not apply, but this is not a completely unregulated area. The FTC is more aggressively policing the health data space and is paying particular attention to deviations from stated privacy policies by app developers. The FTC also enforces a narrow and oft-forgotten part of HIPAA that applies a breach notification rule to non-covered entity PHR vendors, some of whom no doubt will be selling their wares on the app store.