'Facts In, Facts Out': An international campaign urges leading AI companies to prioritize the trustworthiness of information in their decision-making. More and more people are using AI tools to access news, which often reaches them distorted or stripped of context compared with the original source. That is why media organizations are calling on AI giants to take responsibility for their role in the digital revolution.
Alessandro Gisotti
It’s time for AI companies to engage in dialogue with the media about source transparency and the use of journalistic content. This is the main request made by the world's leading media associations through the campaign “Facts In, Facts Out.” The initiative is promoted by the European Broadcasting Union (EBU), of which Vatican Radio is a founding member, together with the World Association of News Publishers (WAN-IFRA) and the international media association FIPP. The campaign draws in particular on the News Integrity in AI Assistants report produced by the BBC and the EBU. The study, published in June 2025, highlighted how AI tools systematically alter, decontextualize, and even misrepresent news from trusted sources such as media websites, regardless of geographic location, language, or platform.
AI tools are not yet reliable sources of news
EBU News Director Liz Corbin explained the reason for the campaign: “Despite their power and potential, AI tools are not yet a reliable source of news and information. But the AI industry has not made this a priority.” “Facts In, Facts Out” calls for urgent attention to source transparency, because the credibility of journalism is at stake. As WAN-IFRA CEO Vincent Peyrègne put it, when AI assistants take in facts published by trusted news providers, facts should come out the other side, but this is not currently the case.
Campaigners say more people are using AI platforms as a channel to access news. When these tools distort, alter, or even falsify information, the result is a serious erosion of trust in mass media, which is an essential component of democratic systems. That's why the campaign emphasizes the urgent need to address this issue, given that the use of AI to access information will only increase in the coming years, especially among younger generations.
Five principles for information transparency in AI
The “Facts In, Facts Out” campaign is part of a broader initiative, News Integrity in the Age of AI, which outlines five basic principles that AI companies should follow. 1) No consent, no content: news content should be used in AI tools only with the publisher's permission. 2) Fair value: the value of trusted news content must be recognized when it is used by third parties. 3) Accuracy, attribution, and sources: the original sources behind AI-generated content must be visible and verifiable. 4) Pluralism and diversity: AI systems must reflect the diversity of the global news ecosystem. 5) Transparency and dialogue: technology companies need to engage openly with news organizations to develop common standards of safety, accuracy, and transparency.
Based on these principles, the goal is to work together to ensure the accuracy and reliability of information. The EBU's Liz Corbin said: “This is not a condemnation. We are calling on tech companies to engage in meaningful dialogue with us. The public rightly wants access to quality, trusted journalism, no matter what technology they use.”
