While the discussion about generative AI in the industry continues unabated, the initial dust is settling somewhat, says Cecilia Campbell. In its wake, a few key aspects have emerged that she thinks should guide how the news industry relates to and works with this new technology
Have a sense of constructive urgency
While AI tech is changing rapidly, the knee-jerk reaction to it in parts of the publishing industry remains stubbornly the same. I recently received an e-mail from an opinion editor at one of the United Kingdom’s national newspapers asking whether I wanted to write an op-ed “on how AI is going to change journalism and put us all out of a job!”
I declined. We need to replace this sense of panic with intentional and constructive urgency. A few weeks ago, the renowned Future Today Institute released its annual Tech Trends Report, a mammoth piece of work looking at more than 700 trends across 14 different aspects of the world, from climate and energy to bioengineering and space. News and information is one section.
In early April, the institute ran a webinar on its findings within our sector. AI was a big focus, and senior expert advisor Sam Guzik made the comment that the biggest mistake the publishing industry can make at this juncture is “to assume the future will look exactly like the past.” He encouraged everyone to track developments in AI and foster discussions internally.
He also pointed out that partnerships and collaborations can be a helpful support in getting started, mentioning industry initiatives such as AP’s Local News AI and Journalism AI at the London School of Economics. (The Tech Trends Report has a list of “ones to watch” in the news and information section, which includes these initiatives and other industry players, including, I’m happy to say, my employer, United Robots.)
Understand what generative AI can and can’t do
We’ve heard a lot about how AI built on large language models (LLMs) tends to hallucinate, to make up “facts” from thin air. From a journalism point of view, this feature of the tech becomes particularly troublesome when it starts to reference non-existent sources, and even articles that were never actually written, as documented by The Guardian recently.
At our workshop during INMA’s Media Subscription Summit in Stockholm, Elin Stueland from Stavanger Aftenblad in Norway was asked if they had ever tried putting the soccer data (used for their automated reporting with rules-based AI) into ChatGPT. She said they’d tested it, but that the persistent problem was that it kept adding events that had never happened. And yet, clearly this type of AI can provide valuable efficiencies if used right.
At United Robots, our core tech is built on rules-based AI, where the rules try to emulate the decision-making abilities of a human expert. We use this type of AI because the texts we generate must adhere to well-defined language patterns and editorial styles as well as include the facts that are in the data (and only those!).
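To make the distinction concrete, here is a minimal, purely illustrative sketch of the rules-based approach described above — not United Robots’ actual system. The rules pick phrasing the way an editor might, and every fact in the output comes straight from the input data, nothing invented:

```python
# Illustrative sketch of rules-based text generation from structured
# data. All names and fields are hypothetical examples.

def match_report(data: dict) -> str:
    """Generate a short soccer recap from structured match data."""
    home, away = data["home"], data["away"]
    hg, ag = data["home_goals"], data["away_goals"]

    # Rule: choose phrasing based on the result, emulating an
    # editor's stylistic decision-making.
    if hg > ag:
        lead = f"{home} beat {away} {hg}-{ag}"
    elif ag > hg:
        lead = f"{away} won {ag}-{hg} away against {home}"
    else:
        lead = f"{home} and {away} drew {hg}-{ag}"

    # Rule: mention attendance only if it is actually in the data.
    if "attendance" in data:
        lead += f" in front of {data['attendance']:,} fans"
    return lead + "."

print(match_report({"home": "Viking", "away": "Bryne",
                    "home_goals": 2, "away_goals": 1,
                    "attendance": 8000}))
```

Because the generator can only recombine what the data contains, it cannot hallucinate events the way an LLM can — the trade-off is that every language pattern must be written and maintained by hand.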
We are now leveraging large language models to generate variations on text segments, like headlines, in our robots. A next step will be to test deploying LLMs to speed up the actual code generation process for our natural language generation (NLG) algorithms.
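One way to combine the two safely is to let an LLM propose variants but let rules decide what gets published. The sketch below is a hedged illustration of that pattern, not a description of United Robots’ actual pipeline; `llm_variants` is a hypothetical stand-in for a real model call:

```python
# Illustrative guardrail pattern: LLM-suggested headline variants are
# accepted only if they pass a rules-based fact check against the data.

def llm_variants(base_headline: str) -> list[str]:
    # Hypothetical placeholder for an LLM call that rephrases the
    # headline; hard-coded here so the example is self-contained.
    return [
        "Viking edge Bryne 2-1 in tight derby",
        "Viking crush Bryne 5-0",    # hallucinated scoreline
        "Viking beat Sandnes 2-1",   # hallucinated opponent
    ]

def passes_fact_check(headline: str, facts: dict) -> bool:
    """Rule: a candidate must contain both real team names and the
    real score exactly as they appear in the data."""
    required = [facts["home"], facts["away"],
                f"{facts['home_goals']}-{facts['away_goals']}"]
    return all(token in headline for token in required)

facts = {"home": "Viking", "away": "Bryne",
         "home_goals": 2, "away_goals": 1}
approved = [h for h in llm_variants("Viking beat Bryne 2-1")
            if passes_fact_check(h, facts)]
print(approved)  # only the variant consistent with the data survives
```

The LLM supplies linguistic variety; the deterministic check supplies the editorial guarantee that no invented facts slip through.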
The challenge for the news industry more widely will be to work LLMs into robust, reliable, and useful processes. It will be crucial to keep a razor-sharp focus on the use we’re trying to extract from the tech and not get side-tracked by its inherent capabilities. Humans in the loop will be key. Which leads me to my final point.
Differentiate your journalistic offering on what only people can bring
Good journalism is about people — those who produce it and those who consume it. AI does not understand the concept or processes of journalism, which is where the real value of the news industry lies.
Professor Charlie Beckett heads up the Journalism AI project at the London School of Economics and he put it perfectly in a new Practical AI in Local Media Report: “AI is going to change how you think about your journalism. If the routine stuff becomes automatable… the onus is very much on what you can add. Can you add empathy, entertainment, insight, expertise, judgement, the human touch, creativity? All those things are going to be at a premium.”
AI will affect the behaviour and expectations of publishers, journalists, and — not least — readers. Publishers who work proactively and make the right choices in how they use the tech, not least in the context of trust and transparency, will be at a clear advantage.
(The writer is chief marketing officer at United Robots in Malmö, Sweden. She can be reached at [email protected] or @CianMian.)
(Courtesy: The International News Media Organisation or INMA)