Manufactured Chaos and the Industrialisation of Abuse
On the engineered cycle of urgency, distraction, and harm
Photo: © pixelsbyemm
Alt text: Black-and-white photo of an escalator under a glass arched roof, steel beams curving overhead — a study in structure, direction, and light.
First, Happy New Year! I wish you a joyful, restful and kind 2026!
For the past eight years, I’ve spent a significant part of my life responding to online spectacle and harm, translating emergencies into evidence, and harm into policy, platform, and public conversation.
But I don’t want to stay trapped inside the cycle of reaction. I’m recommitting to that in 2026.
I want to name the patterns. I want to name the systems. Because if we can see the architecture of harm clearly, we can design and advocate for transformative systems, and build healthier societies that protect the most vulnerable by default.
This week’s reporting on Grok is a reminder that what looks like a scandal is often the visible edge of a system that has been engineered for harm.
Years ago, I was interviewed by WIRED for my expertise on the rise of deepfake “nudifying” abuse. And in the final chapter of my book I wrote about becoming trauma-informed, not trauma-led — that chapter was the beginning of my shift away from reactivity and toward naming systems.
I’m not here to amplify spectacle… I’m here to name systems.
“Undressing” tools and CSAM-generation aren’t fringe outcomes. They’re predictable, because the internet already treats women and girls as content to exploit.
We don’t need more outrage, PR statements, or emergency funding; we need clear, grounded, sustainable, and collective action.
The whack-a-mole game the platform economy (Musk and others) has created exploits a structural weakness: it overwhelms policymakers’ ability to govern, fragments civil society’s ability to work collectively and strategically, and shames donors into reactive, harm-reduction funding cycles — all without changing the systems creating these conditions.
There’s money being made from manufactured chaos. Attention is being diverted. I’m interested in what else is happening in tech right now that benefits from our energy being split and weakened.
UK elections are in May. This season is about resisting fascism, and that means practising resistance to urgency, perfectionism, scarcity, and entitlement.
Algorithms are designed to reward outrage, attention capture, and the merchants who trade in both. Be strategic and trauma-informed — not trauma-led.
Over the next few weeks, I’ll be writing less about individual scandals and more about the systems underneath them — the incentives, governance gaps, design decisions, and cultural conditions that keep producing the same harms. Reaction keeps us busy. Systems thinking helps us become effective.
I’ll be exploring questions like: Who benefits when our attention is split? What does safety look like as infrastructure? How do we resource coordination rather than fragmentation? And what would it take to make dignity a default setting in digital life? This is the work of 21/20 — and the direction of my writing here.
Systems are built. So are alternatives.
If you’re thinking about how to have conversations about digital safety at home, I previously shared a post for parents and carers here:
—
About 21/20
21/20 is a cultural studio shaping how society understands and negotiates power, technology, care, and speech — using art, humour, policy, and public conversation — and shaping digital ecosystems that protect dignity, expand participation, and shift power.


