
On Siri and Recognitive Violence

By Joshua A. Halstead

As a disabled person who relies on speech recognition software to complete a range of daily writing tasks (from emails to book chapters), I am no stranger to the universe of voice assistants, having cut my teeth on Dragon Dictate in the ’90s.

Though I’m used to the software knowing my voice, that it now knows my location is uncanny. This discovery occurred on a morning stroll when Siri spelled “Urth Caffé” correctly rather than scribing “earth café,” as I had expected. That was when I realized that my assistant had turned into a stalker.
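Apple does not publish Siri’s internal pipeline, but its public Speech framework exposes the basic mechanism: a recognition request can be seeded with contextual strings so the decoder favors nearby proper nouns over phonetically identical common words. The Swift sketch below only illustrates that biasing step; the helper function and the place names are hypothetical, not Apple’s actual implementation.

```swift
import Speech

// Hypothetical helper: bias speech recognition toward place names near the
// user's current location. (Authorization via SFSpeechRecognizer.requestAuthorization
// is omitted for brevity; Siri's own pipeline is not public.)
func makeBiasedRequest(for audioURL: URL, nearbyPlaceNames: [String]) -> SFSpeechURLRecognitionRequest {
    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    // contextualStrings nudges the recognizer toward out-of-vocabulary terms,
    // e.g. "Urth Caffé" instead of the phonetically identical "earth café".
    request.contextualStrings = nearbyPlaceNames
    return request
}

// Example usage: the place names would come from some location-based lookup;
// the values here are purely illustrative.
let request = makeBiasedRequest(
    for: URL(fileURLWithPath: "/tmp/dictation.wav"),
    nearbyPlaceNames: ["Urth Caffé", "Abbot Kinney Blvd"]
)
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
_ = recognizer?.recognitionTask(with: request) { result, _ in
    if let result = result {
        print(result.bestTranscription.formattedString)
    }
}
```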

In this short article, I argue that Apple’s decision to integrate user location data into Siri’s speech recognition system created a product that contributes to gentrification and could further marginalize disabled people.



Reviewing Health Announcements at Google, Facebook, and Apple

By Adriana Krasniansky

Over the past several days, technology players Google, Apple, and Facebook have each reported health-related business news. In this blog post, we examine their announcements and identify emerging ethical questions in the digital health space.

On Nov. 1, Google announced plans to acquire smartwatch maker Fitbit for $2.1 billion in 2020, subject to regulatory approval. The purchase is expected to jumpstart the production of Google’s own health wearables; the company has already invested at least $40 million in wearable research, absorbing watchmaker Fossil’s R&D technology in January 2019.


Getting Granular with Apple’s mHealth Guidelines

By Nicolas Terry

In a post last week I compared Apple’s new mHealth App Store rules with our classic regulatory models. I noted that the ‘Health’ data aggregation app and other apps using the ‘HealthKit’ API to collect, store, or process health data would seldom be subject to the HIPAA Privacy and Security Rules. There will be exceptions, such as apps linked to EMR data held by covered entities. Equally, the FTC will patrol the space looking for violations of privacy policies, and most EMR and PHR apps will be subject to federal breach notification regulations.

Apple has now publicly released its App Store review guidelines for HealthKit, and they make for an interesting read. First, it is disappointing that Apple has taken its cue from our dysfunctional health privacy laws and concentrated its regulation on data use rather than collection. A prohibition on collecting user data other than for the primary purpose of the app would have been welcome. Second, apps using the framework cannot store user data in iCloud (which does not offer a BAA), raising the question of where it will be acceptable for such data to be stored. Amazon Web Services? Third, while last week’s leaks are confirmed and there is a strong prohibition on using HealthKit data for advertising or other data-mining purposes, the official text has a squirrelly coda: “other than improving health, medical, and fitness management, or for the purpose of medical research.” This needs to be clarified, as does the choice architecture.
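For context, here is a minimal Swift sketch of the kind of HealthKit access these guidelines regulate, assuming step count as an example data type. Once the query returns, it is largely the review guidelines and the FTC, rather than HIPAA, that constrain where the developer sends the data.

```swift
import HealthKit

// Minimal sketch of an app using the HealthKit API to read health data.
// Availability and entitlement checks are omitted for brevity.
let healthStore = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!

healthStore.requestAuthorization(toShare: nil, read: [stepType]) { granted, _ in
    guard granted else { return }
    let query = HKSampleQuery(sampleType: stepType,
                              predicate: nil,
                              limit: HKObjectQueryNoLimit,
                              sortDescriptors: nil) { _, samples, _ in
        // At this point the app holds user health data; the review guidelines
        // (not iCloud, and in most cases not HIPAA) govern where it may go.
        let steps = (samples as? [HKQuantitySample]) ?? []
        print("Read \(steps.count) step-count samples")
    }
    healthStore.execute(query)
}
```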

Apple’s mHealth Rules Fear to Tread Where Our Privacy Laws Fall Short

By Nicolas Terry

On September 9, Apple is hosting its ‘Wish We Could Say More’ event. In the interim, we will be deluged with largely uninformed speculation about the new iPhone, an iWatch wearable, and who knows what else. What we do know, because Apple announced it back in June, is that iOS 8, Apple’s mobile operating system, will include an app called ‘Health’ (backed by a ‘HealthKit’ API) that will aggregate health and fitness data from the iPhone’s own internal sensors, third-party wearables, and EMRs.
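A minimal sketch of that aggregation path, assuming a hypothetical third-party wearable app writing a heart-rate sample into the shared HealthKit store (where the Health app can surface it alongside data from the phone’s sensors and EMRs); the sample value is illustrative only.

```swift
import HealthKit

// Sketch of the aggregation path: a hypothetical wearable companion app
// saving a heart-rate sample into the shared Health store.
let healthStore = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

let bpm = HKUnit.count().unitDivided(by: .minute())
let sample = HKQuantitySample(type: heartRateType,
                              quantity: HKQuantity(unit: bpm, doubleValue: 72),
                              start: Date(),
                              end: Date())

healthStore.requestAuthorization(toShare: [heartRateType], read: nil) { granted, _ in
    guard granted else { return }
    healthStore.save(sample) { success, error in
        print(success ? "Sample saved to HealthKit" : "Save failed: \(String(describing: error))")
    }
}
```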

What has been less than clear is how the privacy of this data is to be protected. There is some low-hanging legal fruit. For example, when Apple partners with the Mayo Clinic or EMR manufacturers to make EMR data available from covered entities, they are squarely within the HIPAA Privacy and Security Rules, triggering the requirements for Business Associate Agreements, etc.

But what of the health data being collected by the Apple health data aggregator or other apps that lies outside protected HIPAA space? Fitness and health data picked up by apps and stored on the phone or on an app developer’s analytics cloud fails the HIPAA applicability test, yet may be as sensitive as anything stored on a hospital server (as I have argued elsewhere). HIPAA may not apply, but this is not a completely unregulated area. The FTC is more aggressively policing the health data space and is paying particular attention to deviations from stated privacy policies by app developers. The FTC also enforces a narrow and oft-forgotten part of HIPAA that applies a breach notification rule to non-covered-entity PHR vendors, some of whom no doubt will be selling their wares on the App Store.