
After AI replaces technical labour, we will have to confront what we accidentally erased

Opinion

The real danger of AI isn’t job loss. It’s what disappears when judgment, dissent and cultural texture are treated as inefficiencies to be engineered away.


“Even if I did speak Irish, I’d always be an outsider here, wouldn’t I? I may learn the password, but the language of the tribe will always elude me …”

In Irish playwright Brian Friel’s Translations, language is not lost through violence, but through efficiency. Irish place names are replaced with English ones, not because they are wrong, but because they are inconvenient.

The new system works better on paper: communication improves, and administration becomes simpler. And yet something essential disappears: not just vocabulary, but memory, identity, local logic, and cultural texture. “Faster and cheaper” is the language of standardisation and, by definition, it removes friction. But in Translations that friction is where meaning lives. Until it was quietly surrendered.

When efficiency becomes the organising principle

Last week, a documentary maker told me he no longer spends thousands on voiceover artists or additional production staff. AI can now generate a clean, competent alternative in minutes: no diaries to juggle, no studio time, no negotiation. Faster. Cheaper. Done.

That same logic ran quietly through this week’s announcements at CES. Take Havas’ new AVA platform, framed as a “human-led AI ecosystem” designed to amplify creativity and judgement. The ambition is clear: centralise access to a handful of models, accelerate the journey from brief to output, and standardise how intelligence is applied across the organisation.


None of that is irrational. But it rests on a quiet assumption: judgment can be encoded, centralised, and scaled without loss. What’s being replaced isn’t just technical skill. It’s people with experience, instincts, dissent, personality, and the ability to slow things down when the process would rather keep moving.

AI tools are designed to be functional, compliant and smooth. Human contributors often aren’t. They interrupt, question, hesitate, push back, or insist that something doesn’t quite feel right. Is this inefficiency? Or where taste and judgement live?

When judgment gets codified

Amid all this focus on AI as evidence of progress, let’s pause to consider how casually our industry has decided what AI should be used for. It’s painfully obvious that large parts of the media and advertising industries have begun using these tools to draft opinion pieces, write award entries, and generate social content, not to sharpen thinking but to replace it.

The result is not necessarily efficiency. It’s more like convergence. WPP’s new Agent Hub makes this logic explicit. Decades of creative, strategic and behavioural expertise are being codified into “Super Agents”, designed to emulate expert thinking and make it available instantly, at scale.

That sounds like progress until you ask: what happens when judgment becomes something you “access” rather than something you argue over, test, or push against?

Why everything starts to sound the same

Convergence means that everyone starts to sound the same: verbose, declarative, self-assured, and oddly empty. Not because the technology is bad, but because it’s being used to remove the most valuable part of the process.

I may be unusually sensitive to this because it’s how I make a living—thinking about things that matter and trying to make others see them differently through language. I’ve tried outsourcing that work to machines, but it doesn’t scale. It merely smooths the edges until nothing interesting remains. If anything, I’ve found that using AI well actually slows me down. It’s useful when I use it to test an argument, challenge a lazy assumption, or notice where my own reasoning doesn’t hold up.

Imagine what a laughing stock I would be as a CEO who promised to use AI to slow down processes or increase costs. That would fly in the face of the unspoken yet unshakeable belief in a perfect system waiting to be discovered, if only we could strip away all the messiness. Yet the humans on the ground know that messiness, slowness, and friction are often what separate good work from great work.

Efficiency creates fragility

But zoom out, and the risk becomes clearer. As we remove friction, we remove human judgment: more decisions are made by fewer people. Our companies become strangely brittle: very good at producing something, but increasingly bad at knowing whether that something is worth producing at all.

There is no competitive edge in everyone developing AI tools based on engines provided by the same handful of companies, any more than there is an advantage in everyone using the same email system from Google or Microsoft. What separates organisations isn’t the tools they adopt, but the cultures they build around them.

Some design their systems to maximise responsiveness and output. Others impose limits, protect thinking time, and deliberately slow decisions down when the stakes are high. Those choices don’t appear in product demonstrations or press releases. But they determine whether AI becomes a force for better judgment or a machine for scaling mediocrity.

What we’re optimising away

After AI replaces technical labour, the real question isn’t how much faster or cheaper our industry can become. It’s whether we properly account for what kind of project we’ve embarked on.

In Translations, the replacement of Irish with English is presented as an administrative improvement: clearer maps, standardised schools, smoother governance.

The system ‘works’. But it also quietly redefines history, identity and aspiration; not always through force, but through the steady, monotonous insistence that efficiency is neutral.

That is the risk facing media and advertising now. As we reshape what we are capable of producing, we change what we are capable of understanding. This is not something this industry can afford to get wrong.


Omar Oakes was the founding editor of The Media Leader and continues to write a column as a freelance journalist and communications consultant for advertising and media companies. He has reported on advertising and media for 10 years.
