This editor’s blog explains our journalism and what’s happening at CBC News. You can find other blogs here.
“Choose news over noise.”
That’s the tagline of a new CBC campaign aimed at reminding Canadians how our journalism can provide safe harbor from the turbulent sea of misinformation, fake news and AI-generated content floating around in our feeds.
Numerous studies suggest that Canadians feel overwhelmed by this flood of problematic content.
According to new research from a Canadian Journalism Foundation (CJF) survey, 88% of respondents expressed concern about AI deception in news. Nearly half of those surveyed said they encounter misleading or false information every day or several times a day.
Our goal is to position CBC News as an antidote to this growing problem: a place where you can be sure to find fact-based journalism that is always produced, verified and overseen by humans. Where we are publicly accountable for our work through an independent ombud. And where we clearly admit when we’re wrong. (We are human, after all.)
Two and a half years have passed since CBC News published its first set of guidelines on the use of artificial intelligence. As the technology advances and new tools emerge, we feel it’s time to update these guidelines and give teams current examples of how to use AI responsibly for the benefit of journalism, and how to avoid potential pitfalls that can undermine public trust.

In the interest of full transparency, we are sharing this new internal guidance for staff with you. Perhaps these updated guidelines will also be of interest to others grappling with how to use this technology in newsrooms, workplaces, schools and beyond.
The basics of our guidelines remain the same. Human oversight is required. AI is a tool, never a creator. Final editorial decisions, fact-checking and accountability always rest with journalists. Our journalistic standards must always be met. We will be open and transparent about all significant uses of AI in our journalism. And the public we serve should never have to wonder whether what we create is real or AI-generated.
CBC News distinguishes between AI-generated content, which is primarily or entirely created by AI tools with minimal human intervention, and AI-assisted content, a collaborative process in which AI is used as a tool to enhance, accelerate and facilitate human journalism.
We believe that using generative AI to create original public-facing content (text, video, audio or images) poses the greatest risk to public trust if not carefully managed. We will be very conservative in using it, and only with close human oversight and transparency. As stated in the original guidelines: “No surprises: audiences will know about AI-generated content before they hear, watch, or read it.”
However, we believe there is an opportunity to leverage AI in a support role. For example, these tools can quickly examine large amounts of data. Recently, we analyzed a number of town council meetings to identify stories that local media hadn’t yet told (subject to our own verification, of course).
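To make that support role concrete, here is a toy sketch of the kind of pre-filtering step described above. It is not CBC’s actual tooling: a simple keyword scan stands in for an AI model, and every name and transcript below is hypothetical. The point is the workflow shape — machines flag, humans verify.

```python
# Toy transcript triage: flag council-meeting transcripts that mention
# watchlist topics so a human journalist can review (and verify) them first.
# A real pipeline would use an AI model here; a keyword scan keeps the
# example self-contained. All names and data are made up.

def flag_transcripts(transcripts, watchlist):
    """Return (meeting_id, matched_terms) pairs for human review."""
    flagged = []
    for meeting_id, text in transcripts.items():
        lowered = text.lower()
        matches = sorted({term for term in watchlist if term in lowered})
        if matches:
            flagged.append((meeting_id, matches))
    return flagged

sample = {
    "2024-03-12-riverside": "Council debated the rezoning application and a budget shortfall.",
    "2024-03-19-riverside": "Routine approval of meeting minutes.",
}
print(flag_transcripts(sample, ["rezoning", "budget shortfall", "lawsuit"]))
```

The flagged list is only a starting point: under the guidelines below, anything it surfaces would still be fact-checked by a journalist before publication.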
For the record, below are the guidelines we have shared with our staff.
Introduction
These guidelines establish a framework for the responsible and ethical use of artificial intelligence (AI) at CBC News. Our core philosophy is that AI empowers our staff and strengthens our journalism. By carefully integrating AI tools, we aim to increase productivity and free up more time for demanding journalistic work, while ultimately improving the audience experience. Compliance with these guidelines is mandatory for all staff working in News. CBC News is committed to expanding comprehensive AI literacy and ongoing training for all journalists.
Because AI is a rapidly evolving field, these guidelines will be continually reviewed, updated, and communicated.
Core principles and goals
Use of AI is governed by basic principles and strategic objectives that align with our Journalistic Standards and Practices.
Principles
- Mandatory human oversight: Journalists must always be involved in the editorial process. AI is a tool, not a creator. Final editorial judgment, fact-checking and accountability rest with our staff.
- Accuracy and reliability: To maintain audience trust, all content, AI-assisted or not, must meet rigorous standards for verification and accuracy. This is achieved through a commitment to journalistic standards.
- Transparency: We will be open with colleagues and audiences about how we use AI in our work, especially when it has a significant impact on content.
Goals
- Increase productivity and save time (improving employee experience): Automate and streamline routine tasks responsibly, freeing up journalism production staff for higher-value work.
- Improving viewer experiences: Use AI responsibly to, for example, deliver content in more accessible formats across different platforms.
Categories of AI use
It is important to distinguish between two main ways that AI can be involved in content creation.
At CBC News, our focus is on AI-assisted work.
- AI support: This is a collaborative process in which journalists use AI tools to enhance, sharpen, and accelerate their work. Humans remain the primary authors and retain full control over the creative and editorial direction. AI acts as a collaborator.
- AI-generated: This is content created primarily or entirely by AI tools with minimal human intervention. As we committed in 2023: “We do not use or present AI-generated content to our audiences without full disclosure. It should come as no surprise. Audiences will be aware of AI-generated content before they hear, watch, or read it.”

Creative support functions allowed
With the above core principles and goals in mind, AI tools are approved for specific functions that aid workflows and increase efficiency. Normal editorial and vetting processes apply in all scenarios. The goal is to uphold the unique value of human journalistic judgment and creativity.
Approved uses fall into several categories.
1. Research and story development
- Brainstorming and Outlining: Generate suggestions for story ideas and structure.
- Investigation and data analysis: Use AI to find information, identify trends in data, and gather background context.
2. Support and Reviews
- Suggest headings and questions: Generate a wide range of choices for users to consider and narrow down.
- Audience engagement suggestions: Generate a variety of options for consideration and adjustment, including social media text, program descriptions, titles, chyron text, radio/TV billings, search engine optimization (SEO), push alerts, and more.
- Summary: Create a concise version of an article or transcript for internal and audience use, always vetted by a human.
- Feedback: Use tools for grammar checking, style consistency, and general story feedback.
- Translation: Produce a first-pass translation, which must be verified by fluent human speakers for nuance and accuracy.
A note on approved experiments: In order to innovate responsibly, the CBC News AI steering committee may approve limited, short-term experiments to explore new AI tools and use cases. These projects receive dedicated oversight and are designed to support our learning. Participants will be clearly defined, and the results will inform future versions of these guidelines.
A note about content conversion: Certain approved instances where AI is used to transform content, such as text-to-speech or closed captioning, do not require human involvement prior to publication.
Accountability and responsibility
Our guiding principle is that journalists, not AI, are responsible for our journalism. This means:
- Own your output: The final work is ours, and we are responsible for its accuracy and completeness regardless of the tools used.
- Be able to explain your process: Keep track of how you used AI tools and be prepared to explain it if asked during the editorial review process.
- Be transparent: We will be open and honest with our colleagues about how we use AI in our work.
Usage guidelines for news teams
Mandatory practices:
- Employees must use a corporate AI account approved by CBC/Radio-Canada for internal use. Do not use personal AI tools, to prevent leaking sensitive data or having our content used as training data.
- Drafting content is only allowed in corporate AI accounts or approved internal CBC applications.
- Employees must fact-check, verify and cross-reference all AI output.
Prohibited uses:
- News department staff may not use AI to create articles or manuscripts.
- News department staff may not use image and video generators to create content for the public.
- News department staff must not use any generative AI features in photo or video software.
Audience disclosure
Transparency is key to maintaining audience trust. However, not all uses of AI require disclosure.
When to disclose
Disclosure is required if the contribution of generative AI materially impacts the content, or if the content could not have been created without AI.
“We do not use or present AI-generated content to our audiences without full disclosure. It should come as no surprise. Audiences will be aware of AI-generated content before they hear, watch, or read it.”
We encourage discussion when deciding whether to include disclosures. If you need clarification on whether disclosure is required, please consult with your respective leadership team.
Examples of recommended disclosures:
- Analysis of huge data sets that would be impossible for humans alone.
- Automated text-to-speech.
- Automatic subtitles.
An important question to ask is: Is there a risk that your audience will be misled if you don’t disclose your use of AI? If the answer is yes, disclosure is required.
If disclosure is required, please include the following details:
- What did the AI tool do?
- Why journalists used AI, ideally explaining how AI benefited and improved the reporting.
- How humans were or were not involved in the process and/or reviewed the content before publication.
- A description of how the content meets the newsroom’s ethical and accuracy standards. Link to newsroom standards.
Disclosure example:
In this story, we used (AI tool/description) to help (what the tool did or helped you do). When using (AI/tools), we (fact-checked and human-reviewed the output to ensure it meets our journalistic standards). This allowed us to (do more of X, dig deeper, offer content on more platforms, etc.).
When disclosure is not necessary
There is no need to disclose the use of AI for routine assistance tasks that do not materially shape the final editorial product. Examples include, but are not limited to:
- Using generative AI tools for background research.
- Using generative AI for brainstorming.
- Using standard spell check, grammar checking, or general story feedback software.
- Using AI tools for audio restoration/repair or color correction.
