
This article was written by a human

Opinion

With so many publications stung by AI articles, what is the future for freelance journalism when there’s no code of practice and moderation is weak?


There have been a few cases this summer of respectable publications having to remove articles submitted by supposed freelance journalists but actually written by AI.

Both Wired and Business Insider have recently been stung.

In the case of Wired, a feature entitled “They fell in love playing Minecraft” was taken down after editors decided it had been fabricated by AI.

Business Insider took down two first-person essays after verification checks showed they did not “meet Business Insider standards”.

These cases raise a number of important questions.

How prevalent are such AI freelance frauds and have many more got through the system undetected?

Could AI be used increasingly by “bad actors” as a route to planting false, and even dangerous, information in the established media?


Growing problem

Among the issues raised: what is a freelance journalist in the age of AI? Where should lines be drawn to police what is acceptable AI research help and unacceptable use of the technology to produce phoney articles?

On the prevalence of the phenomenon, let’s ask ChatGPT.

Within a couple of minutes, it produced lots of evidence that there is indeed a lot of it about, although more evident in the US than the UK — and usually, but not always, at the fringes of journalism rather than mainstream publications.

The Chicago Sun-Times fired a freelance contributor after a summer book supplement recommended books that did not exist and the author admitted using AI.

Sports Illustrated faced embarrassment when it had to pull a number of web articles after it was found they were credited to fake authors and came with stock or AI-generated headshots.

In the UK, a Reach publication may have been caught last year by an AI freelancer when it published an article setting out five property laws you might have been unwittingly breaking.

One was the "Street Naming and Numbering Regulation 1999", under which you would supposedly be liable to a fine if you did not display the number of your house clearly enough.

After a reader complained to the Independent Press Standards Organisation, it emerged that such a law does not exist.

In its apology, the Hull Daily Mail admitted the information had been sourced online without adequate verification. It was not clear whether the article had been produced by AI or a more commonplace faker.

There have also been “zombie” websites such as National Wales 2.0 or Bournemouth Observer, which used AI-generated material including articles under false bylines such as Harry Jazz.

Then there were the many pieces allegedly produced by AI or fabricated by the supposed freelancer Margaux Blanchard.

Blanchard seems to have caught out Index on Censorship (UK). It had to take down a Blanchard “dispatch” from Guatemala after concluding that “it appears to have been written by AI”.


Efficiency vs integrity

The above examples, and many more, were delivered in less than two minutes by ChatGPT, which asked politely whether more details were required.

Isn’t this just the much, much more efficient modern-day equivalent of going to the newspaper library and asking for paper files?

You spot what appears to be an outlying anomaly and you can find out, almost instantly, whether you have discovered a trend or not.

Is there anything problematic about any of this for freelance journalists, editors or publishers?

One issue is that ChatGPT is almost too helpful. On top of offering what you have asked for, it suggests organising or grouping it and an approach to producing something that might quickly resemble an article.

Oh, the temptation on a sunny day to press the button, head off to the pub and then tidy up the final product when you come back.

There is another, more stylistic, reason to worry about AI. With too much information and too many examples, the resulting articles could read like a list.

However, before long a modest code of practice will likely have to emerge to cover the use of AI by freelancers and out-of-office staffers.

It might require an article to carry a disclaimer that it was written with the help of AI, if the assistance went beyond basic preliminary research.

In future, freelancers may have to sign declarations setting out the acceptable limits of AI use, much as Financial Times journalists have had to (or at least used to have to) list any shares they own in companies they write about, to avoid conflicts of interest.

For editors, the safest route is to deal only with recognisable human beings, preferably ones with whom they have an established relationship and a track record of integrity.

Editors must be on their guard when even apparently proficient articles come from unknown sources.

We can be sure that, by now, Blanchard has retired that byline and that she, or he, is just as busy pushing out multiple articles under new ones.


Humans are still better

One of the few cards the mainstream media still hold is a reputation for accuracy and integrity — a reputation that should be defended at all costs.

Strictly speaking, the mainstream media is no longer the mainstream media, at least numerically, because more people now get their information from the likes of TikTok, Facebook and Google.

It is thus even more important that the original mainstream media holds firm against AI-generated freelance "journalism", not least because TikTok is one of the multibillion-dollar companies planning to sack hundreds of content moderators, including in the UK.

We can be certain of one thing: the main motivation is money. We can be less certain of another: that AI content moderation will be as good as that performed by humans — at least for now.

This article was aided by AI-generated research. But that was it. The rest was produced by human hands.


Raymond Snoddy is a media consultant, national newspaper columnist and former presenter of NewsWatch on BBC News. He writes for The Media Leader on Wednesdays.
