
The Weight of the Watching


It’s weird how being watched has been so normalised. The cameras in the street, the apps on your phone, all feeding invisible systems that claim to “learn” who you are. After a while, it makes you feel like you have to behave a bit more carefully, even when nobody’s there.


Erving Goffman once wrote about the way we 'perform' for others, the face we put on when we know we’re being seen. I often think about that in the age of AI. The difference now is that the audience isn’t human anymore. The watcher is a piece of code, or a lens on a drone, and the performance never ends. It’s exhausting, trying to stay presentable to something that never sleeps. Just a constant self-curation.


I notice it in small ways. My friend rewriting a tweet because she’s nervous a bot will flag it as “aggressive.” My cousin hiding certain songs from their Spotify because they don’t want to be profiled as “sad.” The way we all pause before typing something into a search bar, in case it looks suspicious. We are all super aware of 'the algorithms' now. It’s like we’ve learned to tidy up our digital selves before anyone asks.


In my own research on automated decision-making, I saw how this plays out on a bigger scale. Systems that decide who gets a benefit, who gets stopped by police, or who gets a loan are built on data that turns people into patterns, dehumanising them. Even when these systems work the way they’re supposed to, they chip away at the simple right to go unnoticed. They make us think about how we appear, all the time.


The Bridges v South Wales Police case stays with me for that reason. The court had to decide whether facial recognition technology breached privacy rights when it scanned the faces of ordinary people walking through public spaces. The police argued that it wasn’t intrusive because the images were deleted almost instantly when there was no match. But the judgment, thankfully, picked up on something important: that the very act of being individually analysed, even for a moment, changes things. Traditional CCTV might capture a crowd, but facial recognition isolates a person. It doesn’t just watch; it identifies. It removes the possibility of being just another face in the crowd. That feels like a kind of loss we haven’t yet learned how to measure.



COVID blurred those lines even further. During the pandemic, we were told that surveillance was a public duty, that sharing our location, our health status, our proximity to others was a reasonable price for safety.

I certainly recognised this as a key moment. Tracking apps, restaurant sign-ins and QR codes all normalised a kind of visibility that once would have felt intrusive. Now, it’s ordinary. What began as an emergency measure quietly altered our sense of what’s “reasonable” to expect in public life. We live with that expectation now, almost unconsciously.


But there is a paradox, as always. The same technologies that make us hyper-visible can also protect us. A woman or vulnerable person walking home at night under street cameras may feel seen in a way that’s finally safe. The potential infringement becomes its own kind of protection, giving her back the choice to be out and about.


So, does it make you feel safe? Or are you exhausted? Either way, there is no opt-out.


Emma Kitcher, Privacy Nerd and Founder of CleanAI
