This post is part of the Friday Commentary. In this series every Friday experts will shine a light on the digital industry. Where are we heading, what is going on and how should we approach this as decision makers? Barry Adams kicks off this second series of commentaries.
I have a confession to make: I’m a Trekkie. Admittedly I’m not the kind that dresses up and goes to conventions, though that’s more the result of a lack of opportunity rather than a lack of desire. I think I’d make a fetching Klingon, even if I say so myself.
I’m not an uncritical Star Trek fan, though. I enjoy most series, and actively love some of them, but there’s one I don’t have a lot of patience for. That’s Star Trek: Voyager.
When I first learned of its premise – a Federation ship stranded on the other side of the galaxy with a combined crew of Starfleet and Maquis – I thought it was very promising indeed. This could take the grit of Deep Space Nine to the next level, becoming a ‘Heart of Darkness’ type voyage through the worst aspects of galactic civilisations and human morality, finally coming out into the light of Federation space bruised and scarred, but also wiser and more aware of the darkness that lies at the heart of each of us.
Instead we got a polished, formulaic, safe series that held very few surprises and plainly avoided most of the interesting challenges it could have faced head-on. Voyager wasn’t all bad, though. There were some very good story arcs, and a few outright excellent episodes.
The Voyager Conspiracy
One of these is entitled ‘The Voyager Conspiracy’, a season 6 episode in which Seven of Nine rejigs some of her external cybernetic components to absorb more data from different sources in the ship’s database.
As the episode progresses, Seven of Nine becomes increasingly paranoid. The vast amounts of data she now processes lead her to see all kinds of conspiracies and hidden motives where, in fact, there is nothing going on.
The enormous amount of data Seven of Nine has access to enables her innate pattern recognition abilities to see patterns that don’t actually exist. This is, in fact, a very human trait. Our brains are, at their core, pattern recognition machines. That’s what all that parallel processing power in our brains is used for: seeing patterns and matching them to behaviours and expected outcomes.
It’s a great Star Trek episode and, for being aired in 1999, eerily prophetic.
Data, Data Everywhere
We are now reaching a point in our society where vast amounts of data are being made available to a small number of entities. This data comprises our personal information, our purchasing behaviour, our interests, the websites we look at, the people we talk to, the businesses we engage with, the places we visit, and so on. Vast troves of our data are being collected every second of every day.
And this data is parsed by algorithmic pattern recognition systems to create profiles of us. We are algorithmically sorted, catalogued, and processed into digital personas. And these personas are expected to behave in certain ways, buy certain products, visit certain places.
This data is then used by advertisers to sell us stuff they think we’ll like. More often than not, they’re right and we buy more stuff we like (but often don’t actually need), which proves the value of these digital personas, so that investments are made to create even more powerful data-collection systems and ever more sophisticated pattern-recognition algorithms.
The consumerist aspect of all this, while a very worthy topic that needs to be explored in-depth, is not what I want to talk about today. Instead I want to talk about another user of that data: government.
Because government also has access to all of this data. This is not some off-the-wall conspiracy theory: we now know government agencies can have full unrestricted access to all our data at the wave of a (secret) court order. Some even have direct backdoor access to these data-collection systems.
And, despite recent revelations, government access to this data is increasing. Every opportunity is seized to place us under more surveillance, to give agencies more access, and to look at every aspect of our lives, all the time, everywhere we go.
The data that is available to governments is, indeed, almost universal. Soon there will be nothing governments don’t know about their citizens. We will be living in a digital panopticon.
There are still people out there who think this is a good thing. Who think that full surveillance is necessary to make us all safer. Who think they have nothing to hide.
But they are wrong. Because in a world of total surveillance, everyone is guilty.
What the Voyager episode so eloquently demonstrated is that with sufficient data, you can see patterns everywhere. Patterns making connections that simply don’t exist, leading to conclusions that are entirely false.
A poignant example of this is Bible codes, where patterns are found in scripture that supposedly foretell all kinds of things. They aren’t true, of course, because these patterns don’t actually exist. They’re based on selective reading and preconceived notions about which patterns should be found. With sufficient data you can find almost any imaginable pattern, if you tweak your algorithm just so.
Selection bias is evident in all such endeavours; if you select your data with a specific objective in mind, you will find that objective in the data even if it isn’t real. Your pattern recognition algorithms will not be neutral, but skewed towards your bias. Your data analysis becomes a self-fulfilling prophecy: “I want to find secret codes in the bible, so I will select the data that matches my expectations and I will be certain to find secret codes.”
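You don’t have to take this on faith; it is easy to demonstrate. The sketch below is a toy illustration (the text, the search scheme, and the word list are all invented for this example): it generates a completely random string of letters and then hunts for “hidden” words as equidistant letter sequences, the same trick used in Bible code claims. Given enough text and enough skip distances to try, the words reliably turn up, even though nothing was ever encoded.

```python
import random
import string

random.seed(42)

# A "scripture" of purely random letters -- no message was ever put in it.
text = "".join(random.choices(string.ascii_lowercase, k=50_000))

def find_skip_code(text, word, max_skip=1000):
    """Look for `word` as an equidistant letter sequence: letters spaced
    at a fixed skip distance, as in 'Bible code' claims. Returns the
    (start, skip) of the first match, or None."""
    n, m = len(text), len(word)
    for skip in range(1, max_skip + 1):
        for start in range(n - (m - 1) * skip):
            if all(text[start + i * skip] == word[i] for i in range(m)):
                return start, skip
    return None

# Short, ominous words are essentially guaranteed to appear somewhere.
for word in ["war", "doom"]:
    print(word, "->", find_skip_code(text, word))
```

The point is not that the algorithm is wrong; each match is genuinely there in the letters. The point is that in a large enough haystack, *some* needle-shaped pattern always exists, and a searcher who is free to tweak the skip distance and the word list will always find what they set out to find.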
Translate this to government surveillance. The sheer volume of data means that humans won’t be processing it. Instead, algorithms will analyse the data. Specifically, pattern-recognition algorithms designed to look for specific things that could mark someone as a potential offender.
These algorithms will be written in such a way that they’ll only look for those specific patterns that match their expectations of what an offender does and doesn’t do. As a result, these algorithms will have their own form of selection bias: ignore the data that doesn’t fit the pattern, and focus on the data that does.
The problem, of course, is that offenders rarely behave in clearly obvious, predictable ways. So the pattern needs to be made quite broad, with a lot of different data points and variables that can each raise a flag. But that also introduces more noise into the system: if a pattern is sufficiently broad, it applies to everyone.
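The arithmetic behind this is the classic base-rate problem, and a quick toy calculation makes it vivid. All the numbers below are invented for illustration, not drawn from any real surveillance programme: suppose one person in 10,000 is a genuine offender, and the broad pattern catches 99% of real offenders while also wrongly flagging just 1% of innocent people.

```python
# Toy base-rate calculation -- every figure here is invented for illustration.
population = 10_000_000          # people under surveillance
offender_rate = 1 / 10_000       # 1 in 10,000 is a genuine offender
true_positive_rate = 0.99        # the pattern flags 99% of real offenders
false_positive_rate = 0.01       # ...and also 1% of innocent people

offenders = population * offender_rate
innocents = population - offenders

flagged_offenders = offenders * true_positive_rate   # 990 real offenders
flagged_innocents = innocents * false_positive_rate  # 99,990 innocents

total_flagged = flagged_offenders + flagged_innocents
precision = flagged_offenders / total_flagged

print(f"People flagged:          {total_flagged:,.0f}")
print(f"Of whom innocent:        {flagged_innocents:,.0f}")
print(f"Chance a flag is real:   {precision:.1%}")
```

Even with a pattern that sounds impressively accurate, roughly a hundred innocent people are flagged for every real offender, and only about 1% of flags point at anyone guilty. The rarer the genuine offenders, the worse this ratio gets.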
Everyone is Guilty
And that means everyone is guilty. Everyone will be seen as a potential offender. Everyone will have said something, or bought something, or visited somewhere, or looked at something, which raises a red flag and makes them guilty in the eyes of the omnipresent surveillance state.
And when everyone is guilty, no one is.
Enormous effort will be wasted on chasing after innocent people. Some innocents might even be locked up, convicted in secret courts based on misinterpreted data processed by biased pattern recognition systems. And in the end, instead of enhancing our security, this ubiquitous surveillance will make us less safe. Because it’ll be so much easier for the real offenders to hide among the rest of us, to become part of the noise rather than the signal, making it genuinely impossible for government agencies to track and apprehend them.
When you see conspiracies everywhere, how can you know which are fake and which are real?
Our governments, in their insatiable lust for more surveillance and more data, are starting to see conspiracies everywhere. No one is innocent. Everyone is a potential offender.
I don’t know about you, but that’s not a world I want to live in.