Artificial intelligence, its place in journalism under discussion
A keyboard is seen reflected on a computer screen displaying the website of ChatGPT, Feb. 8, 2023. (Reuters Photo)


Artificial intelligence has taken the world by storm ever since the release of ChatGPT – an artificial intelligence-driven chatbot developed by research and development firm OpenAI – in November 2022, and it has even found its way into journalism as journalists discuss its potential influence on the news industry and media.

ChatGPT is capable of automatically generating text based on written prompts and engaging in conversational interactions with users by answering their questions.

It has gained immense popularity, reaching more than 100 million users within the first two months.

The chatbot's appeal lies in its detailed and human-like responses in a conversational format, including the ability to challenge incorrect assumptions and answer follow-up questions.

News organizations including The Associated Press, Reuters, The Washington Post, the BBC and The New York Times already use artificial intelligence to produce content, personalize their offerings and improve audience engagement.

The U.S.-based media outlet BuzzFeed is also planning to use ChatGPT to improve its quizzes and personalize certain content for its audiences.

Speaking to Anadolu Agency (AA), Jonathan Soma, who runs a data journalism program at Columbia University's Journalism School, described ChatGPT as a "fantastic tool" for producing ideas and giving suggestions to journalists.

"But just like any suggestions, they are apt to be misleading or incorrect. That is why ChatGPT works best alongside journalists, as a tool to assist their process, not as a standalone product that does the work of a journalist," Soma said in an email interview.

He said that many news organizations are enthusiastic about using GPT-powered tools to publish stories, but noted that doing so well is a higher-cost investment.

"This could be seen when CNET recently published a large number of error-prone articles: even when it's claimed that editors review and revise AI-generated pieces, they probably don't!"

"It's very easy for the business case of 'increasing productivity' to overrule the ability of journalists to take care and produce their best work," he added.

CNET, a U.S.-based tech website, has reportedly published AI-generated content.

Asked about ChatGPT's role in enhancing the quality and efficiency of journalism, Soma said it can do a good job at fact-checking despite its "tendency to hallucinate."

"GPT-based automated tools for analyzing datasets and querying large sets of documents are quickly maturing and can do a lot to improve the accuracy of reporting."

"For example, if I'm writing about a rise in shoplifting, it can automatically query a database to see if this is an accurate portrayal."

'Absolutely accuracy'

In February 2023, OpenAI announced its plan to offer a subscription service called ChatGPT Plus.

The service, which costs $20 per month, gives subscribers benefits such as faster responses and priority access to new updates and enhancements.

ChatGPT has limitations, however, as it can give wrong responses, and may do so repeatedly.

OpenAI itself acknowledges these limitations.

"ChatGPT sometimes writes a plausible-sounding but incorrect or nonsensical answer," the company says.

Soma also agreed that the biggest problem with ChatGPT is "absolutely accuracy," which raises potential ethical concerns about its use in journalism, such as bias and accuracy issues.

"Large language models tend to 'hallucinate,' and give answers to questions that are incorrect but sound accurate," he said.

"Someone who can say 'I don't know' is more trustworthy than someone who always has an answer, and it is unfortunately very difficult to cajole ChatGPT into saying 'I don't know.'"

Asked about the challenges journalists would face in integrating ChatGPT into their workflow, he said that "fear" and a "lack of knowledge" are probably the biggest issues for the news industry.

"The messaging around ChatGPT is one of those things – it's perfect and all-knowing, or it's a biased garbage machine."

"If journalists can take the time – not on deadline, not explicitly for work – to play around with ChatGPT in a guided environment, it could do a lot to help them see its strengths and weaknesses."