'Catch the train or miss it': The relentless rise of artificial intelligence forces journalists to make tough choices



Perugia: The rise of artificial intelligence is forcing a growing number of journalists to grapple with the ethical and editorial challenges posed by the rapidly expanding technology.

At the International Journalism Festival in Perugia, Italy, which concludes on Sunday, one of the central questions was whether AI will merely support newsrooms or completely transform them.

What will happen to employment?

AI tools that mimic human intelligence are widely used in newsrooms around the world to transcribe audio files, summarize text, and translate.

In early 2023, Germany's Axel Springer Group announced job cuts at its Bild and Die Welt newspapers, saying AI could "replace" some journalists.

Generative AI, which can generate text and images in response to simple requests in everyday language, has been breaking new ground and raising concerns for the past year and a half.

One concern is that voices and faces can now be cloned to create podcasts or present the news on television. Last year, the Filipino website Rappler launched a youth-oriented brand that converts long articles into comics, graphics, and even videos.

Media professionals agree that their jobs must now focus on the tasks where they provide the greatest added value.

“You are the ones doing the work,” Google News general manager Shailesh Prakash told the festival in Perugia. “The tools we create will be your assistants.”

Since ChatGPT debuted in late 2022, the cost of generative AI has plummeted and the tool, designed by US startup OpenAI, has become available to even small newsrooms.

Colombian investigative journalism outlet Cuestion Publica has developed a tool that combs its archives to surface relevant background information when news breaks.

However, most media organisations have not built their own language models, the technology at the core of AI interfaces, says Natali Helberger, a professor at the University of Amsterdam.

The threat of disinformation

According to one estimate published last year by Everypixel Journal, AI created as many images in a single year as photography produced in 150 years.

This raises serious questions about how trustworthy news can be distinguished from a tidal wave of content that includes deepfakes.

Media and technology organisations are working together to tackle this threat, notably through the Coalition for Content Provenance and Authenticity, which aims to set common standards.

From the Wild West to Regulation

Media rights watchdog Reporters Without Borders has expanded its press-freedom brief to include defending trustworthy news, unveiling the Paris Charter on AI and Journalism late last year.

“One of the things I really liked about the Paris Charter was the emphasis on transparency,” said Anya Shiffrin, a lecturer in global media, innovation and human rights at Columbia University in the US.

Indian outlet Quintillion Media updates its AI editorial guidelines every three months, said its boss Ritu Kapoor. None of the organisation's articles may be written by AI, and AI-generated images must not depict real life.

Will you resist or cooperate?

AI models are fed on data, and their hunger for this critical commodity is causing unease among those who supply it. In December, The New York Times sued OpenAI and its major investor Microsoft for copyright infringement.

In contrast, other media organisations have signed deals with OpenAI: Axel Springer, the American news agency Associated Press, the French daily Le Monde, and the Spanish group Prisa Media, whose titles include El Pais and the sports paper AS.

Collaborating with the new technology is attractive because the media industry's resources are stretched thin, explains Emily Bell, a professor at Columbia University's journalism school. She senses growing outside pressure to “catch the train, don't miss the train.”

Published in Dawn, April 21, 2024
