Privacy is never easy to think about. This week it became harder. Two pieces framed my week. First, Eben Moglen’s essay in The Guardian (based on his Columbia talks from late last year) took my breath away; glorious writing and stunning breadth combined to deliver a desperately sad (but not entirely hopeless) message about government and corporate overreaching in data collection and processing.
A wry speech posted by software developer Maciej Ceglowski also helped frame my thoughts. He wrote, “The Internet somehow contrives to remember too much and too little at the same time, and it maps poorly on our concepts of how memory should work.” There’s the problem in a nutshell. Ceglowski alludes to the divide between how human (offline) memory operates (it’s “fuzzy” and “memories tend to fade with time, and we remember only the more salient events”) and the online default of remembering everything. Government and Google and, for that matter, Big Data Brokers tell us that online rules now apply across the board and ‘that’s just peachy’ because we’ll have better national security, better searches, or more relevant advertising. But that’s backwards.
Offline rules need to apply to online data. Not everything needs to be collected and some memories should degrade or disappear. That’s very hard for our silicon-based masters to comprehend. But, remarkably, just a few weeks ago that was the conclusion of the European Court of Justice. Case C‑131/12 held that unlawful data processing “may result not only from the fact that such data are inaccurate but, in particular, also from the fact that they are inadequate, irrelevant or excessive in relation to the purposes of the processing, that they are not kept up to date, or that they are kept for longer than is necessary unless they are required to be kept for historical, statistical or scientific purposes.”
How would the denizens of online rules comply? This week Google posted a form to deal with erasure requests from EU citizens. Of course, it’s in beta (“Please note that this form is an initial effort”). Maybe making it work will require hiring some people who understand offline sensibilities. Unfortunately, vested interests and, as Eric Posner pointed out in Slate, the First Amendment likely will conspire to deny such an offline, nuanced, balanced rule in the land of the free.
As momentous thoughts were put to paper and the EU revolution began to operationalize, what did the week bring U.S. data protection and (you thought I’d forgotten?) health privacy? Alas, not very much. The FTC published its report on the data broker industry (or, more accurately, on nine data brokers). Clearly the brokers subscribe to the online rule—they collect everything. And much of it, as I have argued previously, is health-related. The good news is that the agency took health data threats very seriously; a couple of years ago the attitude would have been—’well, HIPAA takes care of that.’ Not any more. The report specifically recommends, “Congress should … consider imposing important protections for sensitive information, such as certain health information, by requiring that consumer-facing sources obtain consumers’ affirmative express consent before collecting and sharing such information with data brokers.” However, although the FTC’s heart is in the right place, most of the report’s other recommendations are regulation-light: transparency and access, which in this company are knives at a gunfight.
Although our offline memories seem fuzzy, the more important reality is that we remember based on context and emotions, our own context and emotions. Data brokers and the other powerful entities that have fallen under the gaze of Moglen, Ceglowski, and the European Court not only expropriate all our memories but then shape them to serve the contexts of others.