Let’s talk about the gift horse.
You know the one. Gleaming. Beautiful. Suspiciously generous. The kind of gift that arrives with a bow on it and a card that says this is for you, we love you, you deserve this. The kind of gift where, if you were paying attention… really paying attention… you’d notice the faint sound of whispering coming from inside.
Meta launched the Ray-Ban smart glasses in 2023. By 2025, they’d sold seven million pairs. Seven million people looked at a camera disguised as a fashion accessory and thought: yes. That’s for me. I’ll have that.
And honestly? You can see why. They look good. They work. They let you take photos, play music, make calls, interact with AI, all hands-free, all seamlessly, all without the social catastrophe of strapping a computer to your face like it’s 2013 and you’ve just discovered Google Glass. These are cool. These are the thing you actually want.
Right there on the marketing: Designed for privacy. Controlled by you.
Seven million people read that and thought: great, sorted, where do I pay?
The horse is in the city now.
A joint investigation published in February 2026 by two Swedish newspapers revealed that footage captured by Meta Ray-Bans was being sent to a subcontracting facility in Nairobi, Kenya. There, workers… low-paid, under-resourced, doing their jobs… were manually reviewing and labelling the footage to train Meta’s AI systems.
What did they see?
Bank card numbers. Private messages. People getting undressed. People using the bathroom. People having sex.
One worker described watching a man set his glasses down on a bedside table before leaving the room. You can finish that sentence yourself.
“I don’t think they know,” one annotator told the reporters. “Because if they knew, they wouldn’t be recording.”
There it is. The most devastating line in the whole story, and it came from a contractor in Nairobi, not a regulator, not a tech journalist, not a lawyer. A person being paid very little to watch very intimate things, who still managed to locate the moral clarity that a billion-dollar corporation apparently mislaid somewhere between the product roadmap and the launch party.
They wouldn’t be recording if they knew.
The implication hanging in the air there, of course, is that Meta knew they didn’t know. And built the product around it.
Here’s the mechanism, because this is where it gets genuinely beautiful in the way that really horrible things sometimes are.
You cannot use the glasses’ AI features without your footage being shared with Meta. There is no version of using the product as designed where your data stays yours. The privacy settings are real. They just don’t cover the bit where you’re actually using it.
It’s not a loophole. It’s not an oversight. It’s architecture. The convenience and the surveillance are not separate features with one accidentally bleeding into the other. They are the same feature. The convenience is the surveillance. That’s what you bought. That’s what was always inside the horse.
Meta’s official response, when it came, was a small masterpiece of corporate non-denial. Something to the effect of: when users share content with Meta AI, we sometimes use contractors, and we take steps to protect privacy.
Which is, technically, true. In the same way that it’s technically true to say a Venus flytrap takes steps to welcome visiting insects.
A class action lawsuit landed in San Francisco on 4th March 2026. The language in the filing is worth reading slowly: Meta had, it alleged, “transformed the product from a personal device into a surveillance conduit,” exposing users to “unreasonable risks of dignitary harm, emotional distress, stalking, extortion, identity theft, and reputational injury.”
Dignitary harm. I want to sit with that phrase for a moment. Someone in legal had to invent a category called dignitary harm because the existing vocabulary wasn’t quite covering what happens when a Kenyan contractor watches you have sex because you forgot you were wearing a camera on your face.
The UK’s Information Commissioner’s Office (ICO) wrote to Meta. The EU asked questions. MEPs tabled inquiries. The machinery of accountability began, very slowly, to turn.
None of it will move as fast as the problem already has.
There’s a detail I keep returning to, because it’s the part that the mainstream coverage keeps burying in paragraph nine.
The glasses look like sunglasses. Not “look a bit like” sunglasses. They are Ray-Bans. The camera is invisible unless you know to look for it. The small indicator light that shows they’re recording? Modifiable with cheap parts you can buy online. Disabled. Gone. No light. No warning. No way to know.
Meta reportedly has plans for a feature called “Name Tag”… facial recognition, in real time, identifying strangers as you look at them. The internal memo discussing the rollout timing noted, apparently without any sense of irony whatsoever, that civil society groups who would normally push back are “currently occupied with other priorities.”
They looked at the chaos in the world right now… the political fires, the overwhelmed institutions, the exhausted activists… and they saw a window. They planned the calendar around the opposition’s fatigue.
That’s not a company that made a privacy mistake. That’s a company that checked whether anyone was watching before it acted. Which is, now that I think about it, exactly what they built the glasses to do.
I write reports like this because I was done being managed. Done being guided, quietly and persistently, toward conclusions that served someone else’s interests while being told it was all for my benefit. Done with the Trojan horse dressed as a gift.
The Ray-Ban story is that dynamic in its purest form yet. Surveillance as fashion. Extraction as convenience. A product so beautifully designed, so genuinely useful, so elegantly presented, that seven million people walked it through their front door and set it on the bedside table.
The workers in Nairobi are not the villains here. They’re the only honest participants in the whole arrangement. They’re the ones who actually know what they’re looking at.
The question… and I mean this genuinely, not rhetorically… is whether the rest of us are willing to know.
Or whether the horse is just too pretty.
The lawsuit is in its early stages. The ICO has written its letter. The EU is asking its questions.
Seven million cameras are still out there.
Controlled by you.
Until Next Time
