James Bridle on how complex technology darkens our life and culture, and the urgent need to shed digital fatalism
In 2016, when Niantic’s Pokemon GO created a frenzy across the world, fans of the augmented reality (AR) game were in for a surprise in Russia. While playing the game — which basically means tracking down hidden Pokemon in real time and in real locations using AR technology on their smartphones — near the Kremlin, many users found functionality glitches on their devices, The Moscow Times reported. They found their GPS function compromised.
For starters, Pokemon GO uses the Global Positioning System (GPS) to direct users to various locations where the funny comic characters appear. Near the Kremlin, many users found a mismatch between where the Pokemon appeared and the location marked on their devices. Technically, such a thing should not happen, because GPS signals cannot be tampered with. Or so they, like many of us, had thought until then. And they were wrong.
Cyber security experts say what the gamers experienced near the Kremlin was a process called GPS spoofing, strong evidence that Russian agencies were tampering with GPS by faking the signals. Anyone trying to find a way to the Kremlin using GPS would be virtually ‘relocated’ to Vnukovo Airport, about 32 km from the city centre. Many experts think this was done for defence purposes: to redirect incoming weapons that target the Kremlin using GPS. Instances such as GPS spoofing, where an advanced technology people believe is foolproof can be doctored and faked, reveal the “blind spots, structural dangers and engineered weaknesses” of computation in contemporary life, warns James Bridle in New Dark Age: Technology and the End of the Future, a brilliant, unparalleled work on the perils of modern technologies and how they obfuscate social realities.
A complex web
Bridle believes technology has made human life extremely complex today by creating layers and layers of processes and systems in which humans are condemned to cohabit with machine intelligence in ways they cannot comprehend.
As a result, we don’t necessarily realise where we need technology’s assistance and where we don’t. Even that ability is controlled by the systems and processes of the technologies we use. “Our social lives are mediated through connectivity and algorithmic revision,” writes Bridle. He explains how the entire world “becomes a code/space” as smartphones become powerful personal computers and computation disappears into every device around us, from fridges to cars to fitness bands.
What happens then? This “ubiquity underscores our failure to understand” how computation impacts the “very ways in which we think”. Bridle gives the example of Wikipedia, a beacon among open internet projects. Currently, Wikipedia relies on an army of software agents – bots – to enforce and maintain correct formatting, build connections between articles, and moderate conflicts and incidences of vandalism. At the last survey, bots accounted for 17 of the top 20 most prolific editors and collectively made about 16 per cent of all edits to Wikipedia. That’s a “concrete and measurable contribution to knowledge production by code itself,” notes Bridle.
What exactly is the danger here? Clearly, algorithms, which carry within them all the ugly biases and prejudices of their creators, are gradually interfering in our cultural spaces by contributing faster and, in many cases, better than we do.
At first glance, there may seem to be no problem, and we are free to think such technologies (bots, here) merely augment our lives. Bridle disagrees: “Computation does not merely augment, frame, and shape culture; by operating beneath our everyday, casual awareness of it, it actually becomes culture.” In a way, software gibberish replaces healthy sociocultural discourses. This can have ramifications in spheres such as public policy, art, journalism, healthcare, sports and welfare distribution.
Why do such things happen? Largely because of a purely functional understanding of technology. Bridle explains, enchantingly, its dangerous fallout, which he calls “computational thinking”: the belief that any problem can be solved by the application of computation. “Whatever the practical or social problem we face, there is an app for it,” Bridle mocks. This is a kind of “solutionism”, the conviction that technology can find a fix for every problem. As Evgeny Morozov explains in his witty, insightful 2013 work To Save Everything, Click Here: The Folly of Technological Solutionism, this approach is inherently faulty because it underestimates and masks the fact that our imperfections make us human.
Bridle agrees. He says computational thinking leads its apostles (businessmen, policymakers and such) to believe it is impossible to think about or articulate the world in terms that are not computable. Soon, the thought reverses in an obscene fashion: they come to believe that, to be solved, all problems must be computable. Whatever is not computable, not digitally mappable, measurable or code-able, runs the risk of going unsolved, or even falling off the radar of governance, business and culture.
Bridle warns that computational thinking is predominant in the world today, driving the worst trends in our societies and interactions, and must be opposed by a “real systemic literacy”. Technology cannot be left to the whims and fancies of those who keep it complex; it should be democratised. Systemic literacy is thinking that can deal with a world that is not computable, Bridle explains, while admitting that the world is “irrevocably shaped and informed by computation”.
But that’s not an easy job in a world where data companies control pretty much everything an individual does and force their users to ignore their fallibilities. Users become what this reviewer would call digital fatalists: extremely submissive before their digital service providers, they accept their propaganda, conclude that everything that happens is inevitable (in a way predetermined by a Super Code), and reprogramme their lives to get them in sync with digital realities.
This is not some soft-coded paranoia. This is a reality we face every day. When governments ask us to have digitally traceable (and controllable) unique identities, make such computable citizenship or identity documents mandatory for availing services that by no measure require such strict screening, and we succumb to such demands without a whimper, or even praise such efforts without really understanding the complexity of these systems or their hidden abilities to be manipulated, we become submissive subjects of computational thinking.
Bridle asks us to stand up and say our existence is not understandable through computation alone. We are more than the data we generate. Technologies need to be audited (Morozov has argued for algorithmic auditors) and updated to reflect human values such as justice, ethics and inclusiveness. Equally important is knowing that systems are fallible; in a world where even GPS can be faked and choreographed, overreliance on technology can be dangerous.
Bridle’s work is a great handbook for those who want to probe further into these questions. He speaks with the calmness of a prophet and the alertness and passion of an evangelist.