Operating room: a doctor or surgeon working with an advanced robotic surgery machine.

Protecting Consumer Privacy in DTC Tissue Testing

By Adithi Iyer

In my last piece, I discussed the hypothetical successor of 23andMe — a tissue-based direct-to-consumer testing service I’ve called yourtissueandyou — and the promise and perils it might bring for consumer health information and privacy. Now, as promised, a closer look at the “who” and “how” of protecting the consumer at the heart of direct-to-consumer precision medicine. While several potential consumer interests are at stake with these services, top of mind is data privacy — especially when the data is medically relevant and incredibly difficult to truly anonymize.

As we’ve established, the data collected by a tissue-based service will be vaster and more varied than anything we’ve seen before, magnifying existing problems with traditional data privacy. Consumer protections for this type of information are, in a word, complicated. There is no single “authority” for data privacy in the United States; instead, protection is spread among individual state data privacy statutes and regulatory backstops (with overlapping sections of some federal statutes in the background). In the context of health, let alone highly sophisticated cell signaling and microenvironment data, the web gets even more tangled.
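To see why rich tissue-level data is so hard to truly anonymize, consider a toy linkage attack. The Swift sketch below is purely illustrative: the record types, the quasi-identifiers (age, partial ZIP code, a rare marker profile), and the data are all invented, but the mechanics mirror how a nominally “de-identified” release can be joined against outside information to single a person out.

```swift
// Hypothetical illustration: re-identifying "anonymized" tissue-test records
// by joining them to a public dataset on shared quasi-identifiers.

struct DeidentifiedRecord {
    let age: Int
    let zip3: String          // first three digits of ZIP code
    let markerProfile: String // e.g., a rare cell-signaling signature
}

struct PublicRecord {
    let name: String
    let age: Int
    let zip3: String
    let markerProfile: String // leaked or inferable from another source
}

// "Anonymized" research release (names stripped).
let research = [
    DeidentifiedRecord(age: 42, zip3: "021", markerProfile: "IL6-high/TP53-variant")
]

// Publicly linkable information about known individuals (invented).
let publicData = [
    PublicRecord(name: "Jane Doe", age: 42, zip3: "021", markerProfile: "IL6-high/TP53-variant"),
    PublicRecord(name: "John Roe", age: 57, zip3: "100", markerProfile: "IL6-low")
]

// A naive join on quasi-identifiers is often enough to re-identify.
for record in research {
    let matches = publicData.filter {
        $0.age == record.age && $0.zip3 == record.zip3 && $0.markerProfile == record.markerProfile
    }
    if matches.count == 1 {
        print("Record re-identified as \(matches[0].name)")
    }
}
```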

Read More

AI-generated image of robot doctor with surgical mask on.

Who’s Liable for Bad Medical Advice in the Age of ChatGPT?

By Matthew Chun

By now, everyone’s heard of ChatGPT — an artificial intelligence (AI) system by OpenAI that has captivated the world with its ability to process and generate humanlike text in various domains. In the field of medicine, ChatGPT has already been reported to ace the U.S. medical licensing exam, diagnose illnesses, and even outshine human doctors on measures of perceived empathy, raising many questions about how AI will reshape health care as we know it.

But what happens when AI gets things wrong? What are the risks of using generative AI systems like ChatGPT in medical practice, and who is ultimately held responsible for patient harm? This blog post will examine the liability risks for health care providers and AI providers alike as ChatGPT and similar AI models are increasingly used for medical applications.
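One recurring suggestion in the liability debate is documentation: if a clinician consults a generative model, a verbatim record of the prompt, the model’s output, and the human sign-off makes it possible to reconstruct later who said what. The Swift sketch below is hypothetical; queryModel is a stand-in for whatever AI service might actually be used, not a real API.

```swift
import Foundation

// Hypothetical audit-trail wrapper for AI-assisted clinical suggestions.
// `queryModel` is a placeholder, not a real service call.

struct AIAdviceRecord: Codable {
    let timestamp: Date
    let prompt: String
    let modelOutput: String
    let reviewedBy: String       // the clinician who accepted or rejected it
    let acceptedByClinician: Bool
}

func queryModel(_ prompt: String) -> String {
    // Placeholder response; a real system would call an external model here.
    return "Model-generated suggestion for: \(prompt)"
}

func adviceWithAuditTrail(prompt: String, clinician: String,
                          review: (String) -> Bool) -> AIAdviceRecord {
    let output = queryModel(prompt)
    // Persisting this record (e.g., to the patient chart) is what creates
    // a reviewable trail if harm later occurs.
    return AIAdviceRecord(timestamp: Date(),
                          prompt: prompt,
                          modelOutput: output,
                          reviewedBy: clinician,
                          acceptedByClinician: review(output))
}

// Example: nothing reaches a patient without explicit human sign-off.
let record = adviceWithAuditTrail(prompt: "Differential for persistent cough",
                                  clinician: "Dr. Example") { suggestion in
    print("Review before use: \(suggestion)")
    return false
}
print(record.acceptedByClinician)
```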

Read More

Handmade wooden chess board; two pawns face off.

FTC’s Proposed Non-Compete Rule: A Step in the Right Direction for Health Care and Biotechnology

By Aparajita Lath

In January, the Federal Trade Commission (FTC) released a new draft rule that would categorically ban non-compete clauses — contractual terms between an employer and an employee that prevent subsequent employment at a competing firm — across the country.

Banning non-compete clauses may help to promote competition and innovation in biotechnology and health care.

Read More

File folders containing medical records.

How Dobbs Threatens Health Privacy

By Wendy A. Bach and Nicolas Terry

Post-Dobbs, the fear is visceral. What was once personal, private, and, one hoped, protected within the presumptively confidential space of the doctor-patient relationship feels exposed. In response to all this fear, the Internet exploded: delete your period tracker; use encrypted apps; don’t take a pregnancy test. The Biden administration, too, chimed in just days after the Supreme Court’s decision, issuing guidance seeking to reassure both doctors and patients that the federal HIPAA Privacy Rule was robust and that reproductive health information would remain private. Given the history of women being prosecuted for their reproductive choices, and the enormous holes in HIPAA that have long allowed prosecutors to rely on health care information as the basis for criminal charges, these assurances rang hollow (as detailed at length in our forthcoming article, HIPAA v. Dobbs). From a health care policy perspective, what is different now is not what might happen. All of this has been happening for decades. The only difference today is the sheer number of people affected and paying attention.

Read More

Group of athletic adult men and women performing sit-up exercises to strengthen their core abdominal muscles during fitness training.

Exercise Equipment Advertisements and Consumer Distrust

By Jack Becker

Are you ready to learn about “the most innovative piece of exercise equipment ever”? To take advantage of “the momentum of gravity to target your entire midsection”? Doesn’t everybody want to “lose those love handles nobody loves”? To finally “have the flat washboard abs and the sexy v-shape [they’ve] always wanted”? Within “just weeks, not months,” anybody can “firm and flatten their stomach.” And “best of all, it’s fun and easy and takes just three minutes a day.”

Despite its endorsement from an expert fitness celebrity and customer testimonials, you might be skeptical of the Ab Circle Pro’s claims. After all, can you really cut out five minutes from the iconic 8-Minute Abs routine?

Massive and misleading promises are an unfortunate reality for many exercise equipment advertisements. Illegitimate advertising claims can harm consumers and impact overall consumer trust, which creates an uphill battle for honest companies. The Federal Trade Commission (FTC) already regulates exercise equipment, but supplementing its efforts with more consumer education and industry self-regulation could be a winning combination to restore trust in the fitness industry.

Read More

Game of whack-a-mole.

Stop Playing Health Care Antitrust Whack-A-Mole

By Jaime S. King

The time has come to meaningfully address the most significant driver of health care costs in the United States — the consolidation of provider market power. 

Over the last 30 years, our health care markets have consolidated to the point that nearly 95% of metropolitan areas have highly concentrated hospital markets and nearly 80% have highly concentrated specialist physician markets. Market research has consistently found that increased consolidation leads to higher health care prices (sometimes as much as 40% more). Provider consolidation has also been associated with reductions in quality of care and in wages for nurses.

In consolidated provider markets, insurance companies often must choose between paying dominant providers supracompetitive rates or exiting the market. Unfortunately, insurers have little incentive to push back against provider rate demands, because they can pass those rate increases directly on to employers and individuals in the form of higher premiums. In turn, employers take premium increases out of employee wages, contributing to the growing gap between health care price growth and employee wage growth. As a result, rising health care premiums mean that every year consumers pay more but receive less.
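The pass-through chain is easiest to see with made-up numbers (the figures below are illustrative assumptions, not data from the post): a dominant provider wins a rate increase, the insurer folds it into the premium, and the employer offsets the premium growth against planned wage increases.

```swift
// Illustrative arithmetic only; all figures are hypothetical.
let baselinePremium = 20_000.0      // annual employer-sponsored family premium
let providerRateIncrease = 0.10     // dominant providers win a 10% rate hike
let shareOfPremiumPaidToProviders = 0.85

// Insurer passes the higher provider rates through to the premium.
let premiumIncrease = baselinePremium * shareOfPremiumPaidToProviders * providerRateIncrease
let newPremium = baselinePremium + premiumIncrease

// Employer offsets the premium growth against the employee's planned raise.
let plannedRaise = 2_500.0
let effectiveRaise = plannedRaise - premiumIncrease

print("Premium rises by \(premiumIncrease) to \(newPremium)")  // +1,700 -> 21,700
print("Effective raise shrinks to \(effectiveRaise)")          // 2,500 - 1,700 = 800
```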

Read More

What’s Next If the FDA Holds the Line on Social Media?

By Kate Greenwood
[Cross-posted at Health Reform Watch]

Earlier this week, the Food and Drug Administration announced that it was reopening the comment periods for the two draft guidances on the use of social media to promote prescription drugs and medical devices that it released in June:  Internet/Social Media Platforms with Character Space Limitations: Presenting Risk and Benefit Information for Prescription Drugs and Medical Devices and Internet/Social Media Platforms: Correcting Independent Third-Party Misinformation About Prescription Drugs and Medical Devices. Both guidances have drawn criticism from industry and observers, with the FDA being charged with, in the words of Pharmaguy at the Pharma Marketing Blog, “not being technically savvy enough to understand the nuances of social media and search engine advertising.”

In the draft guidance on social media platforms with character space limitations, such as Twitter and sponsored links on Google and Yahoo, the FDA states that “if a firm chooses to make a product benefit claim, the firm should also incorporate risk information within the same character-space-limited communication.” The draft guidance would allow companies to limit the risks that are presented within a character-and-space-limited communication to those that are the most serious, as long as the communication also includes a direct hyperlink to a destination (for example, a landing page) that is devoted exclusively to a complete discussion of the product’s risks. The FDA emphasizes in the draft guidance that “[i]f an accurate and balanced presentation of both risks and benefits is not possible within the constraints of the platform, then the firm should reconsider using that platform for the intended promotional message (other than for permitted reminder promotion).”  In the first round of comments, PhRMA commented that the amount of information that companies are required to include in a single communication “would make the use of Twitter and comparable platforms impossible in all but the rarest cases.” With regard to sponsored links, PhRMA also noted that the guidance assumes that advertisers have more control than they in fact do over “the appearance – and order of appearance – of information on such platforms.”
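The core requirement is mechanical enough to sketch in code. The Swift snippet below is an illustrative reading of the draft guidance, not a compliance tool; the 140-character limit, the product name, and the example text are assumptions for the sketch. The point is simply that a benefit claim must travel with the most serious risk information and a direct link to a complete risk discussion, all within the platform’s limit.

```swift
// Illustrative check of the draft guidance's logic; the character limit
// and all strings are hypothetical assumptions for this sketch.

struct PromotionalPost {
    let text: String
    let makesBenefitClaim: Bool
    let includesMostSeriousRisks: Bool
    let linkToFullRiskInfo: String?   // landing page devoted solely to risks
}

func satisfiesDraftGuidance(_ post: PromotionalPost, characterLimit: Int = 140) -> Bool {
    guard post.text.count <= characterLimit else { return false }
    // No benefit claim: nothing further is triggered.
    guard post.makesBenefitClaim else { return true }
    // A benefit claim must carry the most serious risks and a direct link
    // to a complete risk discussion within the same communication.
    return post.includesMostSeriousRisks && post.linkToFullRiskInfo != nil
}

let tweet = PromotionalPost(
    text: "DrugX relieves symptoms. Serious risk: liver injury. Full risk info: example.com/drugx-risks",
    makesBenefitClaim: true,
    includesMostSeriousRisks: true,
    linkToFullRiskInfo: "https://example.com/drugx-risks")

print(satisfiesDraftGuidance(tweet)) // true: claim, risk, and link all fit the limit
```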

It will be interesting to see whether and how the FDA responds to these comments, as well as to any additional comments filed during the reopened comment period, which ends on October 29th. If the agency holds the line (as I think it should) and continues to require that companies provide at least some balance between risks and benefits in all advertising and labeling, regardless of platform, companies will no doubt (continue to) look for alternatives.

Read More

Apple’s mHealth Rules Fear to Tread Where Our Privacy Laws Fall Short

By Nicolas Terry

On September 9, Apple is hosting its ‘Wish We Could Say More’ event. In the interim we will be deluged with mostly uninformed speculation about the new iPhone, an iWatch wearable, and who knows what else. What we do know, because Apple announced it back in June, is that iOS 8, Apple’s mobile operating system, will include an app called ‘Health’ (backed by a ‘HealthKit’ API) that will aggregate health and fitness data from the iPhone’s own internal sensors, third-party wearables, and EMRs.
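For readers unfamiliar with HealthKit, the rough shape of the API is that an app asks the user for permission to read specific data types and then queries a shared health store. The Swift sketch below is a minimal illustration using the modern Swift names for the HealthKit calls; it assumes the app runs on a device where health data is available and has the HealthKit entitlement.

```swift
import HealthKit

// Minimal sketch of reading aggregated fitness data through HealthKit.
let healthStore = HKHealthStore()

if HKHealthStore.isHealthDataAvailable(),
   let stepType = HKObjectType.quantityType(forIdentifier: .stepCount) {

    // The user explicitly grants (or denies) read access per data type.
    healthStore.requestAuthorization(toShare: nil, read: [stepType]) { granted, error in
        guard granted, error == nil else { return }

        // Query recent step-count samples written by any source
        // (the phone's own sensors, third-party wearables, etc.).
        let query = HKSampleQuery(sampleType: stepType, predicate: nil,
                                  limit: 10, sortDescriptors: nil) { _, samples, _ in
            for sample in samples as? [HKQuantitySample] ?? [] {
                let steps = sample.quantity.doubleValue(for: .count())
                print("\(sample.sourceRevision.source.name): \(steps) steps")
            }
        }
        healthStore.execute(query)
    }
}
```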

What has been less than clear is how the privacy of this data is to be protected. There is some low-hanging legal fruit. For example, when Apple partners with the Mayo Clinic or EMR manufacturers to make EMR data available from covered entities, they are squarely within the HIPAA Privacy and Security Rules, triggering the requirements for Business Associate Agreements, etc.

But what of the health data collected by Apple’s health data aggregator or other apps that lies outside of protected HIPAA space? Fitness and health data picked up by apps and stored on the phone or in an app developer’s analytic cloud fails the HIPAA applicability test, yet may be as sensitive as anything stored on a hospital server (as I have argued elsewhere). HIPAA may not apply, but this is not a completely unregulated area. The FTC is policing the health data space more aggressively and is paying particular attention to deviations from stated privacy policies by app developers. The FTC also enforces a narrow and oft-forgotten part of HIPAA that applies a breach notification rule to non-covered-entity PHR vendors, some of whom no doubt will be selling their wares on the App Store.

Read More