During my time in China in 2018, face recognition technology was outwardly celebrated by the media and quietly concerning to the general population.
In the canteen of the massive new science centre I was working at, TV screens alternated between images of Chinese military might, unflattering footage of Donald Trump and montages of the rollout and benefits of surveillance technology on the streets of cities across the country.
A widely shared article described how an “economic” criminal had been caught using facial recognition technology amid a crowd of 60,000 at a concert in Nanchang. Jaywalkers in Shenzhen were publicly shamed by having their faces displayed on prominent screens near the sites of their transgressions. Toilet-roll rationing and locating lost children were other applications of this pervasive automation.
Now, there is a lot to be discussed about surveillance technology, its ramifications in a tightly controlled country like China, and its eager adoption in the formerly liberal West, but that’s not where I want to go in this article. Let’s assume that we’ll exist in a surveillance society in the not-too-distant future. The cameras are already here – it’s the processing software that hasn’t been applied yet. My own MSc thesis explored the dangers of tech solutionism and drew parallels with China’s Social Credit system. A recent OSINT investigation of mine revealed Irish police interest in facial recognition software going back to the early 2000s:
No, let’s push on, even past the murky areas of “emotion tracking” and into a slightly speculative idea of Posture Surveillance.
I always imagine the same scene when I think of surveillance technology. From a third-person perspective, I’m watching myself on a screen, walking across a busy open square. It’s daytime, the weather is dull (but not raining) and there’s enough people wandering around to suggest lunchtime in a large city. The camera zooms in and follows me through the throng.
A large rectangle surrounds my walking form and a smaller square tracks my face. My name appears on the screen, alongside a social security number. Everything is green, until the large rectangle changes to yellow and occasionally flashes red.
Jumping back into my first-person, I notice that I’ve slipped into my usual slouch-walk again. It’s a result of a mix between having a slightly curved spine from childhood, and a general desire to disappear into the crowd of a busy city. My shoulders are tensed and I have a low pain in my back. I remember that I need to be mindful of my posture, or I’ll be a bowed old man. I straighten up, push my chest forward and my shoulders back. It’s still such an unusual stance to me that I have to briefly assume the character of a middle-aged man facing up to unruly teens outside his house – saying “What’s going on here, then?”, in my head. It feels better.
Back in the control room, my rectangle turns back to green. The invisible observer moves on to another human, and the AI places a low-level flag on my identity for 24 hours. I’m already being tracked on another camera in another location.
Is the AI concerned about my posture (perhaps to avoid future medical costs burdening the subsidised health care system), or is it looking for signs of furtiveness and potential anti-establishment tendencies? Is it something more commercial and capitalistic? Does the AI have a library of (silly) walks deemed to show how malleable a person is to targeted advertising? Will the advertisements shown to me change after this?
Is it just wary of difference and marking me for further observation and pattern analysis?
Whatever it wants, as soon as we become aware of this ubiquitous observation, we will change our ways. Intentionally or subconsciously.
Like the aforementioned man puffing himself up to face the youths, like the lone citizen walking down a dark alley and trying to strike the fine balance between submission and assertiveness, we will try to game the algorithms blindly. We will adjust our demeanours the way websites adjust their keywords and content, in the hope of ranking higher in the search engines. Slowly working out the system, until it changes again.
Imagine walking boldly and confidently up to the bank before applying for a loan. Trying to remember your posture status in work photographs. Climbing the social ladder through striding and faltering. Deleting uncertainty from your social media imagery. Becoming occasionally confused as to how you should be, when out in the open.
Imagine darkened basements where you can freestyle dance and be yourself, beyond the gaze of the machine.
Well, here I am at my white-Western-male dystopia again.
Black Mirror dystopian imaginings are fine, but we need to be looking at emerging tech like this right at the source. The current capitalistic and political structures will inevitably veer towards solutionism and thoughtless application. We need a new application layer across all entities responsible for our futures – design, politics, commerce, academia, etc. Something that puts the human first, offers alternatives, and finds ethical ways to apply technology positively.