The politics of data control

It's time for a public conversation about the uses and limits of translucency. Is it really necessary to retain my social security number, or my search history, in order to provide a service? If not, what does it cost the provider of a service -- and cost the user, for that matter -- to achieve the benefit of translucency? Is this kind of opt-out a right that users of services should expect to enjoy for free, or is it a new kind of value-added service that providers can sell?
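The core translucency idea can be made concrete with a small sketch. This is an illustrative toy, not any particular provider's implementation: instead of retaining a social security number, a service keeps only a salted hash, which still lets it confirm a later match without ever holding the plaintext.

```python
import hashlib
import os

def protect(ssn: str, salt: bytes) -> str:
    """Retain only a salted hash; the plaintext SSN is never stored."""
    return hashlib.sha256(salt + ssn.encode()).hexdigest()

def verify(candidate: str, salt: bytes, stored: str) -> bool:
    """The service can still confirm a match without holding the SSN."""
    return protect(candidate, salt) == stored

# The salt stays with the service; by itself it reveals nothing.
salt = os.urandom(16)
record = protect("078-05-1120", salt)      # the famous decommissioned sample SSN
assert verify("078-05-1120", salt, record)
assert not verify("000-00-0000", salt, record)
```

The cost side of the question is visible even here: the provider can verify, but it can no longer browse, mine, or resell what it doesn't have.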
Several thoughtful emails in response to this column deserve mention. One veteran technologist told me that, five years ago, he was working on a translucent technology whose tag line was "host-proof hosting." The pitch was: "There's going to be a breach at an ASP, and when that happens everyone will suddenly know that they need this." But that's a tough sale to make, especially when cryptographic techniques are required.

The crypto is a beast of a problem to solve, but at least it can be solved. Reflecting back on the experience, he's left with one outstanding issue for which he sees no solution. How does a customer service organization support users when it can't see the data?

In all such cases, it comes down to the same protocol suggested in this week's column: you attach a one-time permission to the protected data. Can the permittee misuse that permission? Sure. It's only a question of whether, on the whole, the benefit of translucency outweighs the costs. It might or might not, I don't know and I doubt anyone does, but what worries me is that we're not seriously trying to find out.
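What a one-time permission might look like in code, in the simplest possible terms: the names (`Vault`, `grant`, `release`) are my invention for illustration, not a real system. The user mints a token scoped to one record; the permittee can redeem it exactly once, and reuse fails.

```python
import secrets

class Vault:
    """Toy sketch: protected records plus single-use release tokens."""
    def __init__(self):
        self._records = {}   # record_id -> protected data
        self._tokens = {}    # token -> record_id, consumed on use

    def store(self, record_id: str, data: bytes) -> None:
        self._records[record_id] = data

    def grant(self, record_id: str) -> str:
        """The user attaches a one-time permission to a specific record."""
        token = secrets.token_hex(16)
        self._tokens[token] = record_id
        return token

    def release(self, token: str) -> bytes:
        """Redeem the token; pop() makes a second redemption impossible."""
        record_id = self._tokens.pop(token)   # KeyError on reuse
        return self._records[record_id]

vault = Vault()
vault.store("acct-42", b"protected payload")
token = vault.grant("acct-42")
assert vault.release(token) == b"protected payload"
# Calling vault.release(token) again raises KeyError: the permission is spent.
```

Of course the permittee can still copy the data once released -- which is exactly the misuse question, and why the trade-off has to be argued rather than engineered away.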

For Tim Sloane, it's fundamentally an imbalance of power:

I am now a consultant in the payments space and your example resonated with me at two levels:

As a consumer this is indeed exactly the type of service I would like to have. It provides me privacy for the personal data (the key) that I send to the (direct) service provider and allows me to acknowledge that I want that key to be used to release my personal data by the secondary service that stores that data (the vault).

The problem is that a large percentage of the human population will not trust what they can't directly control, and will not share control unless mandated to do so -- and maybe not even then.

Consider the problems currently making headlines in the payment arena. Personal card numbers and PINs are being released into the wild at an unprecedented rate. Payment associations (Amex, Discover, MasterCard, etc.) all have regulations designed to prevent this data from being kept by retailers, and yet retailers continue to store the information locally. It is estimated that it will take at least six years before even the major retailers can be proven to be in compliance with these regulations. Retailers mistakenly believe that by keeping this data they are better positioned to refute a customer's claim that a transaction never occurred -- a frequent type of fraud. This is the issue of control.

There is also a reluctance to share control. The credit agencies you mention are all in business to serve financial institutions, not consumers. Most efforts to provide consumers even rudimentary control over the data that has been collected about them have been rebuffed. In fact, these credit agencies have already rejected the idea that a consumer should be able to confirm whether their personal credit rating may be released. The only exception credit agencies have made is when the consumer indicates they believe they are the victim of identity theft -- that is, after the data has been spilled.

So although I agree there is a need, I see the following inverse rule in play: the more critical the data that one party wants to make translucent, the more likely it is that all other participants will demand direct access. Only a major power base can break this inverse rule, but the power today lies primarily with organizations that hold all the cards.

I thought about this while editing last Friday's podcast with Phil Windley, part of which explored the failure of technologists to explain and evangelize ideas -- like translucency and selective disclosure -- in ways that resonate with the public and meaningfully influence the political process. If we want to invent technologies that enable, empower, and liberate, part of the challenge is to promote ideas and raise expectations about what's possible.
