October 6, 2022

Collective Transparency – “You have zero privacy anyway — get over it.”

In 1999, Sun Microsystems CEO Scott McNealy famously stated, “You have zero privacy anyway — get over it.” He was referring to concerns around consumer data privacy, something which is as relevant today following the Facebook/Cambridge Analytica scandal as it was during the early Dot Com years.

McNealy’s comments raise two questions. Firstly, is he right in what he’s saying? Secondly, if he is, what is there to be done about it?

Involuntary Transparency

The answer to this first question is almost certainly: who knows? We live in an era of extensive and ever-growing data collection, so much so that it’s hard not to call us all somewhat digitally illiterate. But ignorance is a weak excuse in a discussion of privacy; it circumvents the discussion by deferring to the “what you don’t know can’t hurt you” approach. Ignorance does, however, support a stronger argument — what I’d bluntly call the “you’re not special” argument.

In his wonderful book New Dark Age, James Bridle discusses big data and algorithms which have been optimised beyond our wildest fantasies. He invokes science fiction writer Iain Banks in a discussion of Infinite Fun Space, the place where hyper-sophisticated artificial intelligences go to play, building worlds literally inconceivable to the human mind, while we mere humans exist in a fully optimised — and highly boring — world.

The “you’re not special” argument is a meshing of ignorance and hyper-intelligence: who cares if you have no privacy, when the only things that can make sense of your data are computers with better things to be doing? While I don’t like this argument, it does highlight that many of our privacy anxieties are really social anxieties, anxieties that emerge from exposure to other people. These anxieties aren’t going away, but through sensible social media management and a form of collective transparency (we’ll get to this), some of these concerns can be ameliorated.

But privacy isn’t just about feelings and desires we’d rather the world didn’t know. Privacy can have extremely serious consequences for one’s finances and identity. Privacy in this regard is one of the much-applauded benefits of Bitcoin; while the transaction is necessarily transparent, the person behind the transaction remains pseudonymous. Ensuring privacy in these areas is vital, and where a great deal can be gained from violating it, malicious actors will spare no expense. If McNealy is right about privacy, then, it’s not good enough to suggest we just get over it.

There’s another version of privacy which we shouldn’t just get over: behavioural privacy. Behavioural economics contains an idea called nudging — small changes in how a choice is framed that can have a significant and predictable impact on the outcome. I’ve previously written about how big data and nudging can combine into hypernudges — ultra-targeted, personalised nudges that feed off a huge assortment of data to influence a person’s choices. Classic examples of hypernudges are personalised ads, such as adverts that show a product you were recently looking at on a different site. For some, these ads are creepy; others would rather see an ad for something they’re interested in than for something they’re not. Either way, hypernudges rely on a waiving of your data privacy.

Post-Privacy

Regardless of whether McNealy is right to say we have no privacy, privacy is more complicated than he suggests. Privacy can mean several different things, and can be violated for many different reasons. But assuming McNealy is right, what can be done about it?

Well, how do we forgo our privacy? Often, we think about our personal data as a currency which we transact with for various services. I’ve argued previously that this isn’t a good way to think, because a currency is worth the same amount no matter who owns it; data is not. An interesting alternative idea has been discussed by legal data scholar Karen Yeung — the collective right to privacy. Here, privacy is not just an individual idea but a social one, one that remains even if an individual “sells” their personal data.

But big data undermines this right. With enough data, systems can often make accurate predictions about individuals who have never provided them with any data at all. Regardless of the purpose of this privacy transgression, it is a transgression, and it raises serious moral — if not legal — concerns. But equally, the big data genie is out of the bottle, and so we must start considering how to live in a post-privacy world.

If our collective right to privacy is dead, and our control over our individual right to privacy is increasingly waning, what alternative is there? The answer may be what data economist Ernst Hafen calls the “right to a copy.” Under this proposal, we would have the right, as citizens, to request a copy of any data that anyone collects about us. This, of course, doesn’t solve the privacy issue, but it does return some empowerment and control to us.

For Hafen, this is important: with these copies, he proposes the creation of a central repository of data. This repository, he suggests, would have two advantages: it would be democratically managed, and it would de-monopolise data, forcing big tech to compete on quality of service rather than through data-hoarding, rent-seeking tactics. This central repository is one version of what I call collective transparency.

Towards Collective Transparency

To an extent, the right to a copy already exists, with social media services such as Facebook and Twitter allowing users to download copies of all their data. Achieving collective transparency, then, is less about returning control of data and more about building new data infrastructures. This is where the problems with collective transparency emerge: people still want privacy; such large amounts of data could be used for very unwelcome purposes; and how do we ensure democratic control of the repository, free of foul play?

These issues are valid, and until (or unless) collective transparency becomes an active project, tangible technical answers to these problems are likely to be hard to come by. One early solution may build on blockchain technology, leveraging a pseudonymous approach like that of Bitcoin to retain transparency without exposing people. Cryptocurrencies have also been able to incorporate voting features, which may resolve the question of democratic control; for such an important resource, a public and transparent governance structure seems necessary. This would push us to the very limits of dataified democracy.
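
To make the pseudonymity idea concrete, here is a minimal sketch (in Python, with hypothetical names and data, not a description of any existing system) of how a repository might publish records under salted, hashed identifiers rather than names: records from the same person can still be linked together for analysis, but cannot be traced back to that person without the secret salt.

```python
import hashlib
import secrets

def pseudonymise(user_id: str, salt: bytes) -> str:
    """Derive a stable pseudonym from a raw identifier.

    The salt is held by the repository's governing body; anyone can read
    the published records (transparency), but without the salt the
    pseudonym cannot be reversed into an identity (privacy).
    """
    return hashlib.sha256(salt + user_id.encode("utf-8")).hexdigest()

# A repository-wide secret salt. Who holds custody of it is itself a
# governance question of the kind discussed above.
salt = secrets.token_bytes(32)

# Two records from the same (hypothetical) person share a pseudonym,
# so aggregate analysis remains possible without exposing identity.
record_a = {"owner": pseudonymise("alice@example.org", salt), "steps": 9_321}
record_b = {"owner": pseudonymise("alice@example.org", salt), "steps": 10_458}
assert record_a["owner"] == record_b["owner"]
```

This is only loosely analogous to Bitcoin, where addresses are derived from cryptographic keys rather than salted hashes, but the underlying idea is the same: the ledger is public, the people behind it are not.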

Collective transparency may also not be the solution. Maybe something else will come along. Maybe we will discover the economic limits of big data, limits which guarantee our privacy simply because it is not worthwhile to violate it. But if we accept McNealy’s proposition, and accept that we have no privacy — what Jean Baudrillard called “involuntary transparency” — collective transparency seems like an intriguing option for replacing what we have lost.
