If you think your body ends at the edge of your skin, think again. Right now, your body extends further out into the world than you realise. You are, already, more than you.
Moreover, other minds extend outwards, trying to escape from their bodies. Those minds, like ghosts in a machine, burrow into your mind, affecting your actions, desires, and beliefs; moving you, manipulating you, controlling you, predicting what you’ll do next.
Automation is not what it seems. And it’s here now.
Agriculture is being revolutionised. Across the 20th century, the proportion of people working in agriculture in the West declined from around 50% to 2%. Now robots traverse fields zapping weeds and detecting whether strawberries are ripe before picking them, while drones scan the fields for soil fertility, transmitting data back to farmers’ computers and up to the cloud. Robots build cars, Flippy the burger-flipping robot makes fast-food workers increasingly obsolete, and Amazon has filed a patent for a blimp that deploys delivery drones.
In 2016, Dallas police used a robot armed with explosives to kill a gunman. It was likely the first time a state had used a robot to kill one of its own citizens.
Many commentators and economists have blamed stagnant wages and the decline of the labour-share (that’s the fraction of income going to workers) on the rise of automation.
Human beings are going out of fashion.
Author and technologist Wes Kussmaul has complained that the human body is badly built: ‘protein is not an ideal material. It is stable only in a narrow temperature and pressure range, is very sensitive to radiation, and rules out many construction techniques and components … Only in the eyes of human chauvinists would it have an advantage’.
The robotic landscape, the rise of automation, and our wider understanding of the universe in the modern era – the death of God – have ushered in what some call the post-human period: an era in which our importance in the world is diminished, our capacity to act, to understand, to control depreciated, and our centrality in the universe devalued. We are, on the one hand, just another species, a blip, a cosmic speck; on the other hand, we’ve become more powerful than ever, extending our capabilities outwards, breaching our fragile limits, augmenting the reach of our minds. We are witnessing man’s attempt to escape from himself.
A period when maybe, as Michel Foucault predicted, ‘man would be erased, like a face drawn in sand at the edge of the sea’.
How can we understand this new robotic era? How might we theorise the politics and philosophy of automation and its consequences? How can we all make ourselves bodies without organs?
Our bodies have an urge to extend outwards, to escape from the limits of our skin. Most simply, that’s what technology does.
Marshall McLuhan described the electric lightbulb as ‘pure information’. The medium extends the perception of the human eye. Brain surgery and night sports are impossible without it.
The first machines – lathes and sharp tools – extended the capacity of our limbs to affect the world. To sharpen, quicken, strengthen our abilities. To adapt the stone into the image of our idea.
The industrial revolution extended the self into an automated machine. It was a revolution of extension – extending limbs, hands, feet, eyes into metal. To add pressure, scale, or to make smaller, more precise, to repeat, repeat, repeat quicker than a finger could.
The information revolution, the digital revolution, does the same for mental life. It extends our cognition.
In the same way machines are better than humans at certain physical tasks, AI will become better than humans at cognitive tasks. Newsweek wrote, ‘AI and automation will replace most human workers because they don’t have to be perfect – just better than you’.
But what’s often forgotten is that AI and automation are never neutral. They’re always doing things for someone. They’re always an extension of someone.
And it’s that for and of someone that I want to focus on.
As Russell Belk argued in an influential 1988 article, we’ve always had an extended-self.
The isolated self, cut off from the world, is an illusion. We get, have, and possess things, ideas, places, even people, because we are dependent organisms. We are part of a larger world that we try to integrate, in various ways, into our sense of self.
We view things as part of the self when we are able to understand them, predict how they’re going to act, decide whether we like or dislike them, and learn to control them in the same way we control our bodies.
As Belk pointed out, many thinkers, from philosophers like Locke and Hegel to modern psychologists and sociologists, have commented on our desire to assimilate the external material world into our inner ideational experience: to understand and comprehend it, to have confidence in our ability to interact with the world, to trust that we can, in some way, control it rather than be controlled by it.
In short, we think of the exterior landscape as part of our sense of self.
We also think of properties or characteristics like our age, job, name, favourite food, clothes, where we live as a part of ourselves.
When we create something, we often think of it as an extension of ourselves, using the pronoun my.
My work of art, my invention, my song.
When we’re cycling or driving we think of the machine as an extension of ourselves.
The existentialist Jean-Paul Sartre saw three ways an external object can become part of the self: by controlling or appropriating it for personal use, by creating it, and by knowing it. We also conquer or master things – mountains, difficult instruments, new skills – as a test of our capabilities, of our mastery over the world.
When we master, comprehend, and control, we integrate the world into our own psychologies.
We might master, comprehend, and so have control over, a transport network, for example, when we move to a new city. It becomes a part of our psychological landscape, literally part of our schema of the world.
The things we extend ourselves into can also take on our traits, characteristics, and properties, extending our personality out into the world.
A farmer’s land, say, holds the energy they have invested in it. A drawing takes on our emotional state. A dog takes on our rhythms, our commands, even our personality.
Machines have never just been mechanical, geometric, predictable structures; machines are an extension of us and can take on our creativity. Machines can be novel, random, dynamic, too.
As Spinoza said, ‘We do not even know what a body is capable of…’
What makes our new extended world different from the previous ones? What makes the robotics revolution different from the industrial revolution?
As all the tech giants now know, the aim of the algorithmic game is prediction.
Those robots that pick strawberries need to be able to predict when they’re ripe. The simple automated vacuum predicts where furniture is.
This is how the information society comes together; where big tech, social media, and robots collide.
Because they’re all ultimately about the same thing: data-collection for better prediction.
Social media platforms seek out, vacuum up, and hoard as much data as possible so they can predict which advertisers to sell your attention to, and at which times.
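The mechanics are mundane. Here’s a minimal sketch of that loop in Python, with invented feature names and simulated data – real platforms use thousands of signals and far larger models, so treat this as an illustration of the logic, not the practice:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical behavioural signals, one row per user-moment:
# [hour_of_day, minutes_scrolled, posts_liked, is_weekend] (all scaled 0-1)
X = rng.random((1000, 4))

# Simulated labels for the sketch: did the user click the ad at that moment?
y = (0.7 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.1, 1000) > 0.6).astype(int)

model = LogisticRegression().fit(X, y)

# Score this user, right now: the higher the probability of a click,
# the more that slice of attention is worth to an advertiser.
user_now = np.array([[0.9, 0.4, 0.8, 1.0]])
print(model.predict_proba(user_now)[0, 1])  # estimated P(click)
```

Everything else – the scanning, the hoarding – exists to widen that feature matrix and sharpen that probability.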
And it’s why we see big tech expanding into areas that, at first, don’t seem logical. Google into self-driving cars, Facebook into VR, Amazon into movies.
The more data they have, the better their bottom line. And we’ll see more of this in the future.
Smart speakers that collect every minute detail: chairs scraping, the frequency of your coughing, when you’re typing or washing up, when your dog’s barking.
All so they can know when to advertise cough sweets, when to play a certain song, when you’ve gotten up for a break and when you’re busy working.
Tesla cars – their cameras and sensors picking up data wherever their drivers go – will know a neighbourhood better than its residents.
A company called ZeroEyes uses ‘Artificial Intelligence to actively monitor camera feeds to detect weapons and people who could be potential threats’.
Athena – backed by Peter Thiel – develops security software that can recognise ‘suspicious’ behaviours.
The so-called ‘internet of things’ turns every device in your home into an information collector: a washing machine that knows what clothes you wear and when, a dishwasher that detects what you’ve eaten.
We will be surrounded by devices acting as corporate private detectives, scanning, tailing, interpreting our every move.
As theorist Mark Andrejevic notes, the ultimate goal is to understand the world as broadly and as specifically as possible.
He calls it pre-emption. These companies want to pre-empt what we’ll do, what we’ll need, and what we’ll most desire.
He cites Foucault’s treatment of the panopticon – the prison designed by Jeremy Bentham in which prisoners cannot tell whether the guards are watching them or not. Foucault says its ‘major effect’ is to induce in the inmate ‘a state of conscious and permanent visibility that assures the automatic functioning of power’.
We live in a world where, increasingly, automation knows what you want better than you do. The strawberry-picking machine knows better and quicker than the farmer when fruit is ripe; a dating app knows better and quicker who you’re likely to go on a date with; and Amazon knows better and quicker than you what you’re likely to buy.
Professor of marketing Praveen Kopalle writes, ‘think of the feelings you get when you see that an Amazon package has arrived at your door – it’s delightful and exciting, even though you know what it is. I bet those feelings amplify when you don’t know what’s in the box… We like getting things in the mail, even if we didn’t ask for them’.
Amazon has filed a real patent for exactly this – shipping products before you’ve ordered them. It’s called ‘anticipatory shipping’.
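What might ‘anticipatory’ prediction look like at its crudest? A hedged sketch: simple co-occurrence counting over purchase histories. The products and baskets are invented, and Amazon’s actual patent and models are of course far more sophisticated:

```python
from collections import Counter
from itertools import combinations

# Invented purchase histories
baskets = [
    {"coffee", "filters", "mug"},
    {"coffee", "filters"},
    {"coffee", "mug", "biscuits"},
    {"tea", "mug"},
]

# Count how often each ordered pair of products is bought together.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1
        pair_counts[(b, a)] += 1

def anticipate(item: str) -> str:
    """Guess what a buyer of `item` is most likely to want next."""
    candidates = {b: c for (a, b), c in pair_counts.items() if a == item}
    return max(candidates, key=candidates.get)

print(anticipate("coffee"))  # start moving it towards them before they ask?
```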
Such vast treasure troves of data, harvested at once minutely and broadly, all the time, and used to predict what you and the world will do next, make the digital landscape incomprehensible. There’s so much data, so much information, that it escapes any single person’s comprehension; it becomes impossible to understand in a single, universal way.
This led the then Wired editor-in-chief Chris Anderson to predict the ‘End of Theory’. He writes: ‘Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity’.
So many data points are collected that any hope of understanding that data disappears. Whatever the algorithm says goes.
He points to Google’s PageRank: it ranks pages using data about links, but no one can say why one page is better than another, only that it’s statistically likely to be.
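A toy version makes Anderson’s point concrete. PageRank assigns scores purely from the structure of links; nowhere does it model why a page is good. The four-page web below is invented; the damping factor is the standard one:

```python
links = {            # page -> pages it links to (a tiny hypothetical web)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
n = len(pages)
d = 0.85                           # standard damping factor
rank = {p: 1 / n for p in pages}   # start with uniform scores

for _ in range(50):                # power iteration until scores settle
    rank = {
        p: (1 - d) / n
           + d * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # the 'best' pages, no theory attached
```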
If you’re most likely to give a $5 tip to your Uber driver, that’s what it suggests. There’s no need to understand why.
One Google project let an AI loose on YouTube to see what it would learn. It taught itself to detect cats, but the engineers admitted they had no idea why that happened.
HunchLab crunches data to predict crime. It has found that assaults happen less often on windy days and that cars are stolen more often near schools. It doesn’t matter why; security firms can simply adapt in response.
Other projects take billion-pixel photographs of cities like Shanghai and the Vatican. You can zoom right in, and out, and in again – someone even found a naked man: https://www.indy100.com/viral/one-of-the-world-s-biggest-ever-photographs-has-a-hilarious-secret-7292226
Posthumanism, in its cynical form, is a dizzying vertigo.
Kierkegaard wrote: ‘Anxiety may be compared with dizziness. He whose eye happens to look down into the yawning abyss becomes dizzy. But what is the reason for this? It is just as much in his own eyes as in the abyss… Hence, anxiety is the dizziness of freedom’.
There’s so much data that it’s impossible to frame it in any single, universal, comprehensible way.
This is close to what Jane Bennett calls vibrant matter: interpretations and framings become so diverse that it’s impossible to settle on a single account of anything. A city blackout can be blamed on deregulation, wildfires, the weather, the price of gas, an ill employee, a historical event, a sociological problem. Meaning fractures and explodes.
Yet despite this, we must retain some sense of responsibility.
Framing robotics and automation through the idea of an extended-self brings something often missed back into focus: there’s always someone doing the automating.
The question when we look at a robot should always be not what it’s doing, but who it’s doing it for.
Take the illustrative example of the dangers of autonomous weapons. An open letter with signatories including Stephen Hawking and Noam Chomsky warns: ‘It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity’.
This is a cogent example because, while the dangers are obvious with weapons, the same dangers apply to every type of automation. The weapon makes vivid what is elsewhere hidden.
A website that decides who you should vote for? Who is funding it? A police robot that keeps you safe? What happens when tyrannical laws are enforced? A script that gets the news for you? What happens when prioritising anger increases engagement?
These are not problems of the future: they are here. We are nudged by algorithms in ways we’re not always aware of. Opaque algorithms are as dangerous as laws you’re not allowed to see.
The real lesson of Orwell’s 1984 is about much more than brute oppression. The power that holds the information doesn’t crudely blackmail; it manipulates you without your knowing. As the protagonist Winston tells us, ‘If you want to keep a secret, you must also hide it from yourself’.
Writing in the Harvard Business Review, Mareike Möhlmann notes that, ‘due to recent advances in AI and machine learning, algorithmic nudging is much more powerful than its non-algorithmic counterpart. With so much data about workers’ behavioral patterns at their fingertips, companies can now develop personalized strategies for changing individuals’ decisions and behaviors at large scale’.
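What does a ‘personalized strategy for changing behaviors at large scale’ look like in code? At its crudest, something like a multi-armed bandit: try different nudge wordings, measure which one moves people, converge on the winner. A hedged sketch – the wordings, response rates, and epsilon-greedy strategy are all illustrative, not any company’s actual system:

```python
import random

nudges = ["9 riders booked this today", "Only 2 seats left!", "Drivers nearby now"]
true_rate = {nudges[0]: 0.10, nudges[1]: 0.22, nudges[2]: 0.15}  # hidden from the learner

counts = {n: 1 for n in nudges}   # start at 1 so success rates are defined
wins = {n: 0 for n in nudges}
epsilon = 0.1                     # fraction of the time we explore at random

for _ in range(5000):
    if random.random() < epsilon:
        choice = random.choice(nudges)                           # explore
    else:
        choice = max(nudges, key=lambda n: wins[n] / counts[n])  # exploit
    counts[choice] += 1
    wins[choice] += random.random() < true_rate[choice]          # simulated response

print(max(nudges, key=lambda n: wins[n] / counts[n]))
# The system converges on whichever phrasing moves people most.
# Note what is never modelled: why it works, or whether it should.
```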
The purpose of all these systems is to get as close to your desires as possible.
The goal is to know you better than you. To pre-empt what you want, what you need, what you desire. Some might say fine – what’s wrong with AI predicting what I need? What’s wrong with automated convenience?
The problem is that algorithms don’t pre-empt what you want and need; they pre-empt what you’re most likely to respond to, in whatever way aligns with what advertisers, corporations, and the powerful want. Automation is not coded for need but for response. We have triggers, addictions, and weaknesses that can be capitalised on but that aren’t what’s really good for us.
We are coded to respond passionately to threats. But do we need to be shown threatening Facebook posts more often? Obviously not. We’re built to want thrill, sugar, sex, to respond to scandal, but we don’t want a world that uses all of that to sell us things we don’t need.
Pre-emption is a melting pot of trickery and base desire driven by profit and control.
Andrejevic again: ‘It is no coincidence that the enthusiasm for so called “nudge” approaches, which act indirectly on people through intervening in the choice “environment,” has coincided with the rise of online marketing and advertising’.
We are shaped by the Gorgon – the mythical creature whose stare turned onlookers to stone. Take Gorgon Stare, a military wide-area surveillance system, flown on drones, that captures video of entire cities. As if naming military equipment after ancient monsters sends the message that we’re definitely the good guys.
Author Arthur Holland Michel, who has spent his career studying drones, wrote that ‘nothing kept me up at night the way Gorgon Stare did’.
The aircraft fly at 25,000 feet and capture the city below with an array of high-resolution cameras.
The most powerful flying eye on the planet has been flown above sports stadiums and American cities; it has solved murders with no witnesses, followed terrorists, and spied on Baltimore, and it is probably in the air over the United States right now. In short, you have no idea whether one is capturing your movements.
In this video, an operator shows you a murder captured by the drone and how they followed the getaway car. I urge you to watch. The link’s below.
Michel writes that the Gorgon Stare is ‘a way of seeing everybody all the time. Fundamental to liberal democracy is the ability to have sacrosanct private spaces. That is where the life of civil society exists. It is where our own personal lives exist, where we are able to pursue our dreams and passions. And it is often where we hold power to account. When you uncover those spaces, you fundamentally put all of those things at risk’.
The question, of course, is what happens when we’re all turned to stone by the Gorgon Stare, our movements and data captured so efficiently and totally that we’re petrified into immobility.
What happens when something I’ll call ‘ethical escape’ becomes impossible? When you cannot do what you believe you should do, because your future desires, ideas, and movements are being manipulated, nudged, or pre-empted by someone else? When the ethical act cannot escape the domination of the totalising stare of the status quo?
Tools that affect someone’s behaviour without them knowing are the instruments of tyrants, monopolists, manipulators, and puppet-masters.
The philosophers Gilles Deleuze and Félix Guattari had a concept that seems apt: the body without organs.
It’s based on the fact that we aren’t only made up of our bodies. We are extended-selves, comprised of assemblages of objects, ideas, events, and other people; it’s the idea that all organisms are more than simply themselves. The bee and the pollen and the flower and the hive make an assemblage. Jeff Bezos, Amazon robotics, and the mail system make an assemblage.
The assemblage is a structured structure: an order is imposed upon it.
But the body without organs is a call for an assemblage to overcome the structure, to overcome its limits, to break out of that which structures it, to free itself from the hierarchy that external powers impose on it.
It is the idea of a metaphysical and ethical potential: what something could do if it were freed from its limitations.
They write: ‘Is it really so sad and dangerous to be fed up with seeing with your eyes, breathing with your lungs, swallowing with your mouth, talking with your tongue, thinking with your brain, having an anus and larynx, head and legs? Why not walk on your head, sing with your sinuses, see through your skin, breathe with your belly?’
The body without organs, they declare, is the essence of freedom:
‘The body without organs howls: “They’ve made me an organism! They’ve wrongfully folded me! They’ve stolen my body!”’
It is ‘the simple Thing, the Entity, the full Body, the stationary Voyage, Anorexia, cutaneous Vision, Yoga, Krishna, Love, Experimentation. Where psychoanalysis says, “Stop, find your self again,” we should say instead, “Let’s go further still, we haven’t found our body without organs yet, we haven’t sufficiently dismantled our self.”’
I think we live in an age of bodies without organs. Automation and robotics extend our powers outwards into the world, but the phenomenon also runs in reverse: as other people extend their powers out into the world, they extend them towards you – robotic tentacles wrapping themselves around your desires, trying to shape your beliefs, your actions, your identity.
You must make yourself a body without organs. It’s why we need to teach all kids to code, to be engineers, to have YouTube channels and podcasts – anything that extends the self. These are all ways of making bodies without organs.
Deleuze and Guattari write enigmatically: ‘This is how it should be done: Lodge yourself on a stratum, experiment with the opportunities it offers, find an advantageous place on it, find potential movements of deterritorialization, possible lines of flight, experience them, produce flow conjunctions here and there, try out continuums of intensities segment by segment, have a small plot of new land at all times. It is through a meticulous relation with the strata that one succeeds in freeing lines of flight, causing conjugated flows to pass and escape and bringing forth continuous intensities for a body without organs’.
There are age-old debates about whether technology will free us from the burden of labour. Marx wrote that communism’s appeal would be the freedom ‘to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, philosophize after dinner, just as I have a mind, without ever becoming hunter, fisherman, herdsman or critic’.
In 1930, the economist John Maynard Keynes predicted a fifteen-hour work week by 2030. Others today suggest a universal basic income. These are important questions, but they neglect an important point: there is no such thing as pure automation. Automation is always someone’s automation. Automation is always control. If you’re not controlling it, or contributing to it, you’re being controlled by it. Universal basic income is no good if, in return, you must surrender your control over the economy, over culture and ethics, to people like Jeff Bezos.
Andrejevic writes: ‘There is an element of surrender in the appeal of automation: a willingness to concede that the complexity of social life under current technological conditions is beyond the reach of human comprehension and thus irrevocably alienated. Why not leave the administration of public life to the companies that simultaneously provide us with the endless stream of digital content that helps fill the void left by public life? This is a disturbing perversion of the hope that the widespread access to information made possible by the Internet would enhance democracy by creating a universally informed citizenry’.
As we saw from the extended-self, we all want to have some comprehension, some understanding, some mastery and control of our environment, of the things that affect us. Even the hermit desires control over the land, the kettle, the seed he sows.
Even the hermit desires control.
Tinder CEO Sean Rad has said: ‘So imagine you open Tinder one day and, you know, the Tinder assistant says, “You know, Sean, there’s a beautiful girl, someone that you’re going to find very attractive, down the street. You have a lot of things in common, and your common friend is Justin, and you’re both free Thursday night, and there’s this great concert that you both want to go to. Can I set up a date? And here is a little bit more info about her”’.
Is there not something uncomfortable about this vision – the idea that something so profound can be outsourced and off-shored? Once we factor in Tinder’s two fundamental goals – keeping you engaged and increasing profits – we have to ask Sean Rad how much those goals influence the partners it suggests, the gigs, bars, and experiences it recommends, the neighbourhoods and patterns of speech it emphasises.
When the stuffy interiors of 19th-century houses were being transformed, the textile designer William Morris recommended a golden rule: ‘have nothing in your houses that you do not know to be useful, or believe to be beautiful’.
As robotics run by big tech algorithms creep slowly into our homes, cars, workplaces, schools, hospitals, parliaments, skies, shops, media platforms, well, everywhere, this might be a good rule to keep in mind.
There’s a simple principle I return to often. It’s not novel; it’s called democracy. It means rule by the people. I understand it like this: people should have a say in the things that affect their lives.
We should not outsource the directing of desire to Silicon Valley, surrender our human impulse to choose to the biases of an opaque AI, abdicate control over our data in return for the cheap thrills of consumer advertising, or sign away our right to see the things that are seeing us.
Seen this way, questions about transparency and privacy online keep coming up because they’re really about who gets to extend their reach, and in what ways.
The politics of robotics, AI, data, pre-emption, framelessness – this strange new post-human landscape – should be defined by four things: transparency, democracy, privacy, and control.