The economic fallout from the COVID-19 pandemic has caused an unprecedented crisis in journalism that could decimate media organizations around the world.
The future of journalism, and its survival, could lie in artificial intelligence (AI). AI refers "to intelligent machines that learn from experience and perform tasks like humans," according to Francesco Marconi, a professor of journalism at Columbia University in New York, who has just published a book on the subject: Newsmakers: Artificial Intelligence and the Future of Journalism.
Marconi was head of the media lab at the Wall Street Journal and the Associated Press, one of the largest news organizations in the world. His thesis is clear and incontrovertible: the journalism world is not keeping pace with the evolution of new technologies. Newsrooms therefore need to take advantage of what AI can offer and come up with a new business model.
For Marconi, journalists and media owners are missing out and AI needs to be at the heart of journalism’s business model in the future. As a professor of journalism at the Université du Québec à Montréal, I have been closely following the evolution of this profession since 1990, and I am mostly in agreement with him.
In Canada, The Canadian Press news agency is one of the rare media outlets to use AI in its newsrooms: it has developed an AI-based system to speed up translations. The Agence France-Presse (AFP) news agency also uses AI to detect doctored photos.
AI does not replace journalists
Artificial intelligence is not there to replace journalists or eliminate jobs. Marconi believes that only eight to 12 per cent of reporters' current tasks will be taken over by machines, freeing editors and journalists to focus on value-added content: long-form journalism, feature interviews, analysis, data-driven journalism and investigative journalism.
At the moment, AI robots perform basic tasks like writing two to six paragraphs on sports scores and quarterly earnings reports at the Associated Press, election results in Switzerland and Olympic results at the Washington Post. The outcomes are convincing, but they also show the limits of AI.
AI robots analyzing large databases can send journalists at Bloomberg News an alert as soon as a trend or anomaly emerges from big data.
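The alerting idea described above can be illustrated with a toy sketch. This is not Bloomberg's actual system, just a minimal, hypothetical example of flagging values that deviate sharply from the rest of a numeric series, the kind of anomaly that might prompt a reporter to investigate:

```python
from statistics import mean, stdev

def anomaly_alerts(series, threshold=2.0):
    """Flag data points more than `threshold` standard deviations
    from the mean -- a toy stand-in for the anomaly detection that
    can alert reporters to a potential story lead."""
    mu = mean(series)
    sigma = stdev(series)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [(i, x) for i, x in enumerate(series)
            if abs(x - mu) / sigma > threshold]

# Hypothetical daily trading volumes with one unusual spike
volumes = [100, 102, 98, 101, 99, 100, 500, 97, 103]
print(anomaly_alerts(volumes))  # flags the spike at index 6
```

A real newsroom system would of course use far more sophisticated statistics and stream live data, but the principle is the same: the machine watches the numbers so the journalist can ask why they moved.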
AI can also save reporters a lot of time by transcribing audio and video interviews. AFP has a tool for that. The same is true for major reports on pollution or violence, which rely on vast databases. The machines can analyze complex data in no time at all.
Afterwards, the journalist does his or her essential work of fact-checking, analyzing, contextualizing and gathering information. AI can hardly replace this. In this sense, humans must remain central to the entire journalistic process.
A broken business model
Marconi is quite right when he explains that the media must develop a paid subscription model, get closer to their communities with even more relevant content, develop new products (newsletters, events, podcasts, videos) and new content. AI can facilitate some of this by generating personalized news: recommendations for readers, for example.
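Personalized recommendation can be sketched very simply. The example below is a hypothetical, content-based approach (not any outlet's actual recommender): articles are ranked by how many keywords they share with what the reader has already read:

```python
def recommend(history, candidates, top_n=2):
    """Rank candidate articles by keyword overlap with the reader's
    history -- a minimal content-based recommendation sketch."""
    read_keywords = set()
    for article in history:
        read_keywords.update(article["keywords"])
    scored = [(len(read_keywords & set(a["keywords"])), a["title"])
              for a in candidates]
    # Highest overlap first; ties broken alphabetically
    scored.sort(key=lambda pair: (-pair[0], pair[1]))
    return [title for score, title in scored[:top_n] if score > 0]

# Hypothetical reader history and candidate articles
history = [{"title": "Budget vote", "keywords": ["politics", "budget"]}]
candidates = [
    {"title": "Tax reform explained", "keywords": ["politics", "budget", "tax"]},
    {"title": "Local hockey finals", "keywords": ["sports", "hockey"]},
    {"title": "City council race", "keywords": ["politics", "election"]},
]
print(recommend(history, candidates))
```

Production recommenders rely on machine-learned models rather than raw keyword counts, but even this crude version shows how AI can surface "even more relevant content" for each subscriber.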
In this sense, AI is part of a new business model based on breaking down media silos: a close collaboration between the editorial staff and other media teams such as engineers, computer scientists, statisticians, and sales or marketing staff.
In a newsroom, more than ever before, databases must be used to find stories that are relevant to readers, listeners, viewers and internet users.
And there are already various AI tools available to detect trends or hot topics on the internet and social media. These tools can also help newsrooms distribute content.
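At their core, many trend-detection tools compare how often a topic is mentioned now versus a moment ago. The sketch below is a hypothetical, simplified version of that spike detection, assuming we already have lists of topic mentions from two time windows:

```python
from collections import Counter

def trending(previous_posts, current_posts, factor=2.0, min_count=3):
    """Flag topics whose mentions in the current window are at least
    `factor` times their mentions in the previous window -- a toy
    version of the spike detection behind trend-spotting tools."""
    prev = Counter(previous_posts)
    curr = Counter(current_posts)
    return sorted(topic for topic, n in curr.items()
                  if n >= min_count and n >= factor * max(prev[topic], 1))

# Hypothetical topic mentions from two consecutive hours
last_hour = ["election"] * 5 + ["weather"] * 2
this_hour = ["election"] * 6 + ["wildfire"] * 4 + ["weather"] * 2
print(trending(last_hour, this_hour))  # "wildfire" spikes from 0 to 4
```

Commercial tools add language models, deduplication and spam filtering on top, but the underlying signal is this kind of sudden change in volume.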
Beware of bias
Of course, newsroom size must be taken into account. A small weekly or a hyper-local media organization may not have the means to act quickly in adopting AI. But for the others, it’s important to start taking action right away. Journalists need to be better trained and begin to work with start-ups and universities to get the best out of this. AI is not a fad. It is here to stay.
One of the dangers of AI, however, is algorithmic bias. Because algorithms are designed by humans, there will always be biases that can distort data analysis and lead to serious consequences. Human verification of content before publication will always remain a safeguard against errors. Take the current example of COVID-19. This is an opportunity to analyze public health data, make connections and dig into the numbers neighbourhood by neighbourhood and street by street. AI can help with that. But it takes well-trained data reporters to do this work.
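The neighbourhood-by-neighbourhood analysis mentioned above often starts with a simple aggregation. Here is a hypothetical sketch (the records and field names are invented for illustration) of the kind of grouping a data reporter might begin from:

```python
from collections import defaultdict

def cases_by_area(records):
    """Total case counts per neighbourhood, sorted highest first --
    a first step before a reporter digs into why one area stands out.
    The record format here is hypothetical."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec["neighbourhood"]] += rec["cases"]
    return dict(sorted(totals.items(), key=lambda kv: -kv[1]))

# Hypothetical daily reports
reports = [
    {"neighbourhood": "Centre-Sud", "cases": 12},
    {"neighbourhood": "Plateau", "cases": 4},
    {"neighbourhood": "Centre-Sud", "cases": 9},
]
print(cases_by_area(reports))
```

The machine can do this totalling instantly at any scale; the journalism lies in checking the data's provenance and explaining what the outliers mean.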
AI has also helped develop systems for detecting fake videos (deepfakes) and fake news, efforts that are of course overseen by experienced journalists at Reuters and AFP, for example.
In this sense, the transformation of newsrooms is only just beginning, and Marconi's book is a must-read for identifying survival scenarios for media organizations and journalists. Because that's what it's all about. We need to better equip our newsrooms and completely rethink the workflow to achieve better collaboration and better content that will attract new, paying subscribers.