Nobody at TikTok lost sleep when the blackout challenge killed a ten-year-old. I mean that literally, not as provocation… nobody’s sleep is structured around that kind of outcome. What followed was not the silence of grief, exactly. More the silence of a platform recalibrating. A content moderation team drafting a statement. A PR department choosing its words carefully. The silence, in other words, of an industry deciding how little it needs to say before the news cycle moves on.
It always moves on.
And then the next challenge starts trending.
Let’s not pretend this is new. Let’s not do the hand-wringing thing where we act as though the internet invented human stupidity, because it didn’t. Teenagers have always done reckless things for attention. Peer pressure has always been a more powerful force than common sense. The desire to belong, to be seen, to matter for fifteen seconds… that’s not a generational failure. That’s just being human and young and terrified of invisibility.
What is new — what is genuinely, structurally, obscenely new — is that there are now trillion-dollar companies whose entire business model depends on accelerating that impulse.
Think about what that means for a moment. Sit with it.
The blackout challenge, for those unfamiliar, involves deliberately cutting off oxygen to your own brain until you pass out. Children have died doing it. Not hypothetically. Not in a “well, technically” sense. Dead. Brain-damaged. Rushed to emergency rooms and not coming back out the way they went in. The Benadryl challenge involves overdosing on antihistamines to induce hallucinations, because someone, somewhere, thought that sounded like content. The milk crate challenge involves stacking unstable plastic boxes into a pyramid and daring people to climb them. The Orbeez challenge involves shooting water-bead guns at strangers in public. The fire challenge. The chroming challenge: inhaling aerosol chemicals until the nervous system starts misfiring.
Every single one of these spread because the algorithm rewarded it.
Not despite the danger. Because of it.
Here’s how it works, and I want to be precise about this because the platforms have spent an enormous amount of money making sure you’re not.
Engagement is the metric. Not wellbeing. Not safety. Not the long-term psychological health of the fourteen-year-old who just watched someone choke themselves unconscious and is now wondering if it would get her followers. Engagement. Clicks, views, shares, comments — the outrage comment counts the same as the admiring one. The “this is horrifying” share counts the same as the “try this” share. The algorithm does not read. It just counts.
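If you want to see how little “counting” involves, a sketch helps. What follows is a toy scorer, with invented field names and invented weights, not anything pulled from a real platform’s ranking code. Its only purpose is to make the point concrete: nothing in the arithmetic knows what it is amplifying.

```python
# A toy engagement scorer. Field names and weights are invented for
# illustration; this is not any platform's actual ranking code.

def engagement_score(video: dict) -> float:
    """Add up interactions. Nothing here reads the video or the
    comments: a horrified share and an admiring share are the
    same number."""
    return (
        0.1 * video["views"]
        + 1.0 * video["likes"]
        + 2.0 * video["comments"]  # outrage or praise, indistinguishable
        + 3.0 * video["shares"]    # "this is horrifying" or "try this", the same
    )

# A dangerous stunt and a cooking video with identical interaction
# counts score identically.
print(engagement_score({"views": 50_000, "likes": 4_000,
                        "comments": 900, "shares": 600}))
```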
A risky stunt gets views fast. Fast views tell the algorithm this is content worth amplifying. The algorithm amplifies it to more users. More users means more attempts, more variations, more one-upmanship — because the original got attention, so maybe if I do it higher, faster, more extremely, I’ll get more. The algorithm rewards the escalation. And so it goes, round and round, until someone’s in hospital and the platform issues a statement about how the safety of its community is its highest priority.
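To see why that loop escalates rather than settles, here is a toy simulation. Every number in it is an assumption I am making up for illustration; the only thing it demonstrates is the direction of the feedback: fast engagement buys distribution, distribution buys imitators, imitators raise the stakes.

```python
import random

# A toy feedback loop built on invented numbers, not a model of any
# real recommender. Each round: riskier content earns engagement
# faster, fast engagement is read as "amplify this", and a slice of
# the new audience one-ups the original.

def simulate_trend(rounds: int = 6, seed: int = 1) -> None:
    rng = random.Random(seed)
    audience = 1_000  # people the trend has reached
    stakes = 1.0      # proxy for how extreme the current videos are
    for r in range(1, rounds + 1):
        # Core assumption: extremity converts viewers into engagement faster.
        engagement = audience * 0.05 * stakes * rng.uniform(0.8, 1.2)
        # The system treats fast engagement as a reason to distribute wider.
        audience += int(engagement * 20)
        # Imitators escalate: higher, faster, more extreme.
        stakes *= 1.15
        print(f"round {r}: audience={audience:,}  stakes={stakes:.2f}x")

simulate_trend()
```

Run it and the audience climbs every round, and so do the stakes. Nothing in the loop ever pushes back, because nothing in the loop is measuring harm.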
They have been issuing that statement, with slight variations, for over a decade.
What makes this particularly elegant, in the way that a perfectly functioning evil is sometimes elegant, is the liability structure. The platforms don’t create the challenges. They just host them. They don’t film the children. They just distribute the films. They don’t dare anyone to swallow anything, climb anything, choke on anything. They just build the system that makes doing so feel worth it.
When a child dies, the platform expresses condolences and removes the content. Sometimes. Eventually. After enough news coverage to make inaction untenable. They’ll announce new safeguards. A new age-verification gesture. A new reporting tool that routes to a moderation queue staffed at a fraction of the capacity needed to actually process it. And then, quietly, the next cycle begins.
The content moderation teams are not the problem. Many of them are doing genuinely harrowing work for genuinely poor pay, staring at things no person should have to stare at. The problem is architectural. The problem is that the architecture was deliberately designed to maximise the spread of engaging content, and dangerous content is engaging, and no one who made those architectural decisions has faced any meaningful consequence for making them.
Mark Zuckerberg sat before the US Senate and looked at the parents of dead children and said he was sorry. He then returned to running a company worth over a trillion dollars. The apology was real, I’m sure, in whatever way apologies are real when they change nothing.
The teenagers doing these challenges are not stupid. That’s something worth saying plainly, because the narrative often slides toward a kind of contempt for the participants that conveniently redirects attention away from the system. They’re not stupid. They’re responding rationally to the incentive structure they’ve been handed.
You want to be seen. Here is a machine that sees you… but only when you do something extreme enough to register. You want to belong. Here is a machine that manufactures belonging out of shared participation in a moment, however brief, however hollow. You want to matter. Here is a machine that will make you matter for exactly as long as the views keep climbing, and not one second longer.
The machine is not their friend. But it has done an extraordinary job of feeling like one.
Parents will be told to talk to their children. Schools will be told to run awareness programmes. Charities will produce leaflets. Journalists will write pieces — this piece, perhaps — cataloguing the risks and urging vigilance. And all of that is… fine. None of it is wrong. If a trend involves choking, inhaling chemicals, swallowing random substances, or staging a public stunt for a camera, it is not a bit of fun gone too far. It is a high-risk act being performed by someone who has been trained, systematically and deliberately, to believe that the risk is worth the reward.
But let’s not confuse awareness with accountability. Knowing the risks exist is not the same as dismantling the system that manufactures them. Teaching a child to recognise dangerous content does not change the fact that the platform hosting it is financially incentivised to keep it visible long enough to trend.
We are educating children about a hazard that adults created, maintain, and profit from.
There will not be a resolution at the end of this piece. I’m not going to tell you that change is coming, because I don’t believe it is — not at the scale required, not from the direction required. The platforms will not voluntarily restructure their engagement models. Governments have shown themselves to be either unwilling or too technically illiterate to regulate effectively. The market will not correct this, because the market is the mechanism producing it.
What there will be, instead, is more challenges. More trends. More content moderation statements expressing deep concern. More vigils. More parents in front of cameras with photographs of children who were alive before a trend found them.
And somewhere, in a data centre humming with the quiet industry of a system that has never once felt anything, the algorithm will keep counting.