Earlier this month, journalist Benj Edwards was terminated from his position as a senior AI reporter at Ars Technica for committing two cardinal sins of journalism: using AI to fabricate quotes and attributing them to a source who never said them.
This outcome should come as no surprise. As noted by Stanford AI expert Dr. Fei-Fei Li: “The danger of AI is not that it will become too autonomous, but that we will become too dependent.”
Edwards learned this the hard way.
Trust in the media is already fragile; only 28% of Americans say they trust it. In a field where credibility is currency, journalists can’t afford to use AI without taking the necessary precautions.
Where the Reporter Went Wrong
Newsrooms are under pressure. They must move faster with fewer resources and tighter deadlines, making AI usage inevitable.
Edwards leaned on AI for this very reason.
Sick with COVID and facing a looming deadline, he used an AI tool to quickly extract relevant verbatim source material for an article he was writing (ironically, about an AI agent that ran a hit piece).
What he ended up with was “a paraphrased version of the source’s words rather than his actual words.”
Because he was rushing to finish, Edwards failed to verify the quotes against the original source material before including them in his final draft.
Edwards’ mistake was treating AI outputs as gospel. That mistake cost him his job.
This is a dangerous mistake because misinformation can run rampant in AI systems.
We know from experience; we discovered ChatGPT thought one of our clients was dead when they were very much still in business.
With 77% of journalists now using AI tools, they must define where AI can add value and where human oversight is needed, so they don’t suffer the same fate.
Here are three rules for using AI in journalism:
- AI can summarize. It cannot be a source.
- Treat AI outputs as a draft, not as fact.
- Quotes must always come from original material.
Use AI to Summarize, Not as a Source
Between researching their next story, conducting interviews and checking endless pitches (likely from us), reporters have little time to comb through massive amounts of information for a story.
This is where AI shines.
AI is great for understanding and summarizing large amounts of information. Think: identifying interview themes, organizing research notes or recapping transcripts. Since the goal is to quickly understand, detail is less important.
But AI falls short in guaranteeing accuracy. As one of my colleagues often says, “AI is like a toddler that wants to please you.” So it may create quotes, attributions or details it thinks will help make your point. Or it may misinterpret information published elsewhere. These can all impact accuracy in journalism.
AI can help reporters grasp a story faster, but it should never be the source of the story itself.
Treat AI as a Starting Point
AI’s accuracy is inconsistent.
At a PR summit in San Francisco, journalist Jane King, who reports from the New York Stock Exchange, shared that when she used ChatGPT to find local business stories, it kept returning days-old information or content from the wrong city.
Because of this, reporters should treat AI’s outputs as a starting point, not the end result.
If a journalist needs an idea for a story, AI can brainstorm possible topics or refine existing ideas.
Quotes Must Come From the Source
Whatever AI produces, whether it’s quotes or a summary of an event, must be verified and checked against the original source.
We can’t take AI’s word for it. Benj Edwards is a prime example of why.
If journalists use AI anywhere near quotes at all, it should be limited to AI-powered grammar tools that catch typos or other mechanical errors. The quote itself must come from the source who said it.
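For newsrooms that want to make this check systematic, the verification step can even be partly automated. The sketch below is a minimal, hypothetical example (the function names and sample text are illustrative, not from any real tool): it flags any quote that does not appear verbatim in the original transcript, so a human knows exactly which lines to re-check against the recording.

```python
# Hypothetical sketch: before publishing, confirm each quote appears
# verbatim in the original transcript. All names here are illustrative.

def normalize(text: str) -> str:
    """Collapse whitespace and straighten curly quotes so comparisons are fair."""
    return " ".join(text.replace("\u201c", '"').replace("\u201d", '"').split())

def verify_quotes(quotes: list[str], transcript: str) -> list[str]:
    """Return the quotes that do NOT appear verbatim in the transcript."""
    clean_transcript = normalize(transcript)
    return [q for q in quotes if normalize(q) not in clean_transcript]

transcript = "We believe the product speaks for itself, and we stand by it."
quotes = [
    "We believe the product speaks for itself",  # verbatim: passes
    "Our product is the best on the market",     # AI paraphrase: flagged
]

# Any flagged quote must be checked against the recording by a human.
unverified = verify_quotes(quotes, transcript)
```

A script like this can only catch exact-wording drift; it cannot judge context or attribution. That part remains the reporter’s job.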
Trust is the currency that makes journalism work. Readers assume that quotes are precise, reporting reflects what sources actually said and facts used to describe events are accurate and verified.
AI can help reporters work faster by analyzing more information and uncovering new insights.
But the responsibility for accuracy will always belong to the person behind the article.