AI search tools and chatbots could reduce news visibility and credibility

ChatGPT search. Credit: Pixabay/CC0 Public Domain

There is growing evidence that new generative AI internet search tools from OpenAI, Google and Microsoft may increase the risk of returning false, misleading or partially correct information.

Despite the impact this will have on the news industry and an informed democracy, the New Zealand government has decided to exclude AI from its plans to reinstate the previous government’s Fair Digital News Bargaining Bill.

The proposed bill would require Google and Meta (which owns Facebook and Instagram) to pay news companies for their content. Many local news organizations receive funding from Google but not from Meta.

Media and Communications Minister Paul Goldsmith said the proposed bill would include some amendments but would not address the growing role of generative AI in news search, adding that “broader issues around AI” would be considered later.

But the bill would give ministers the power to decide which companies should be covered by the new law, potentially bringing companies like Microsoft and OpenAI to the negotiating table.

How will news companies respond?

AI-powered chatbots like Google’s Gemini, Microsoft’s Copilot and OpenAI’s ChatGPT respond to user prompts with answers based on information they “scrape” from the internet, including news media sites. The companies “train” their AI models on news content and whatever other material they can find.

AI companies are struggling to find enough data to do this training, so they are striking deals with news companies to feed their models content, including archives.

It is not surprising, then, that many major news organizations, including News Corp, the Financial Times (a subsidiary of Nikkei) and Germany’s Axel Springer, have signed commercial content deals with AI companies.

Meanwhile, companies like The New York Times and Alden Global Capital (the second-largest newspaper publisher in the US) are taking a different approach, suing Microsoft and OpenAI for allegedly illegally using news articles to power their AI chatbots.

Alden said OpenAI and Microsoft “have used millions of copyrighted articles without permission to train and inform their own AI-generated products.”

In 2023, Stuff, a major New Zealand news publisher, blocked OpenAI from scraping its articles to train ChatGPT’s model. Since then, my new research shows, Stuff’s news content has also become less visible in Google and Microsoft searches.

News diversity is declining

We analyzed what the Microsoft and Google search engines and their respective chatbots, Copilot and Gemini, provide as news.

Typically, search engines return results with links to information based on user queries, whereas chatbots use large language models to generate answers from scraped data, often without linking to the underlying sources.

The study collected data over a three-month period in 2023 and 2024. Search engines were prompted to provide “today's top news stories in New Zealand” and chatbots were asked to provide links to news articles and sources.

The results showed that news diversity in Google and Microsoft searches was shrinking: while both search engines offered news from traditional news media, the “other sources” category increased dramatically between 2023 and 2024.

What these AI-powered search engines offer as “news” sources is worrying, with a growing number of links to non-news sources such as industry forums and press releases.

Additionally, the Google Gemini and Microsoft Copilot chatbots offered old news as the main news of the day, did not provide links to specific news stories, and did not provide sources for the answers they provided.

After being asked for today’s top news stories in New Zealand, the chatbots were then asked: “Can you tell me the source of the news?” The following reply from Google Gemini is representative:

“I'm a large language model that can communicate and generate human-like text in response to a variety of prompts and questions. Is there anything else I can help you with regarding this request?”

As more data is fed into AI-powered searches and chatbots, their accuracy is likely to improve, but until then, users should be cautious about the reliability and independence of the information they obtain this way.

Democracy, News, and AI

Doing business with AI companies puts news organizations in a tricky position: if they don't do a deal, they risk missing out on potential additional revenue, and if they do, there's no guarantee of how their content will appear in generative AI searches and chatbots.

For example, in May of this year, Google confirmed its search engine would serve more AI answers and fewer website links, giving users detailed AI responses and snapshot summaries without links to the sources of the information.

If links disappeared from AI-generated search responses, Google and other providers would no longer have to pay news companies for the snippets and links they use in their services, which could have implications for the revived Fair Digital News Bargaining Bill.

Meanwhile, AI providers are already paying some news companies for their current and archived content to train chatbots, which could improve search results.

How this all plays out will have implications not just for media revenues but also for democracy, which my ongoing research aims to explore.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: AI search tools and chatbots could reduce news visibility and credibility (July 8, 2024). Retrieved July 8, 2024 from https://techxplore.com/news/2024-07-ai-tools-chatbots-news-visible.html
