The Narrative Factory

In November 2025, the TBIJ published a story documenting how a man in Sri Lanka made $300k by producing anti-immigration content for UK Facebook users. He wasn’t British, and he seemingly wasn’t ideological. He was simply fluent in the economics of attention. AI tools, Facebook pages, emotional triggers — stitched together into a profitable operation.

It’s an example that feels small on the surface, but it points at something much bigger: narrative has become an industrial product.

The stories we think in
The “self” feels solid, but I’ve come to think of it more like an internal narrator — stitching perception, memory and prediction into something coherent enough to live inside. Consciousness often feels like reading that story as it writes itself.

Once you see it that way, narrative stops being entertainment.
It becomes the architecture we use to make sense of the world.

Culture as shared code
Genes are the hardcoded layer.
Memes — the Dawkins kind, not the JPEG kind — are the soft weights floating between minds.

Culture becomes a distributed prediction system running on millions of brains at once. Not centralised, or coordinated, but emergent — shaped by repeated behaviour and emotional reinforcement.

Cultural change is simply what happens when a large enough number of brains update their internal stories. Sometimes gradually. Sometimes abruptly when the old stories stop matching reality.

We feel like we’re actively participating and choosing our beliefs. I suspect that most of the time, we’re inheriting them.

Where the factory externalises itself
If culture is the software running across millions of minds, platforms are the hardware. They don’t just transmit stories — they optimise them for scale. A narrative factory running nonstop, optimised for engagement and ad revenue rather than truth or insight.

It’s no surprise that people learn to exploit this factory.
What’s surprising is that more people aren’t doing it.

It reinforces a belief I’ve had for years: the advertising stacks built by the likes of Google and Meta might be one of the most socially net-negative inventions that humans have created. Not because advertising itself is inherently harmful, but because these ecosystems sit on top of total opacity, monopoly leverage and incentives that reward emotional manipulation at scale.

They didn’t invent the narrative factory. They industrialised it, creating an assembly line of emotion.

Consciousness isn’t immune to its environment
Working closely with AI has made me reflect more on how humans think and how easy that process is to nudge. Large language models generate narrative in the same basic rhythm our brains do: prediction layered on prediction.

When the information we consume is shaped by machinery tuned for provocation, it rewires the inner narrator. New instincts. New fears. New certainties.

Shift the internal story, and behaviour follows.
Shift enough behaviour, and culture follows.

Technology without shiny object syndrome
This is why I’m careful with how I think about technology. I still believe in technology for good — I wouldn’t work in this industry otherwise — but the platforms shaping our informational diets are increasingly misaligned with anything beneficial.

Grouping “big tech” together is often unhelpful. Hyperscale cloud and enterprise SaaS share almost nothing with the behavioural extraction of ad-funded platforms. One is infrastructure, with legitimate concerns around sustainability and monopolisation. The other is built on psychological leverage.

Not everything built by big tech is the same flavour of harmful, but the parts that monetise attention are doing measurable, systemic damage.

Why this matters now
Part of the reason I’m thinking about all this is personal. Social media has changed radically in the last decade. I stopped using Instagram and Facebook in 2020 because of the overwhelming volume of sponsored content, and the time-wasting techniques woven in to keep me engaged at any cost. I played around with TikTok shortly after the Musical.ly merger, and removed it from my devices as soon as possible. I still use Reddit and X, which isn’t exactly a badge of honour, but even there the drift toward provocation and emotional baiting is obvious.

At the same time, political conversations with friends and family feel more intense, more emotional, more narrative-driven — and less grounded in shared reality.

We’re living inside a narrative factory, and the production line doesn’t stop.

The open question
The problem isn’t that bad actors exist, or that AI can generate infinite content, or that platforms incentivise emotion.

The scary part is how seamlessly all of this plugs into the way human consciousness already works.

If our minds run on stories, and our culture is shaped by shared stories, then the systems that control story production hold enormous influence over the future.

So the real question becomes:
How do we stay conscious in a world constantly trying to rewrite the story inside our own heads?

I don’t think anyone has the answer yet.
However, noticing the factory at work is the first act of resistance.
Refusing to let it write your story is the second.
