How does this AI process work for your story?
We're not trying to replace student journalism at all. We're trying to fill a gap that hasn't been covered before. Anything that city government or a local organization like the RDU Airport Authority puts out in a press release, we can rewrite as a news article by running it through a chatbot.
What inspired you to generate local news through AI?
Local newsrooms don't have the resources to cover all the stories that matter to their readers. AI projects allow them to cover smaller stories that newsrooms (including ours) don't assign reporters to, like road closures, traffic jams, and event announcements. Some of these stories may go uncovered due to a lack of funding or manpower, but we still believe they are valuable news for our readers. AI allows us to produce stories we otherwise couldn't, quickly and with less effort.
How can we trust what AI generates?
We are aware of the dangers that AI poses. In the AI world it's called "hallucination": the AI makes something up and states it as fact. That's where my job comes in. Once the AI generates something, part of my job is to go back and fact-check it to make sure that what the AI is saying is accurate and reflects what was originally released in that statement or press release. There is value in reporting what governments and officials say in press releases. If someone finds a falsehood, there is value in knowing that the government said something false.
How do you know if your project is successful?
I think a good way to find out is to see what people are reading. Over 100 people signed up within the first day or two of the project going live, which shows that people want content they can't get anywhere else. If Blackwell Street near the Bulls Ballpark downtown is being closed, people need to know that, and that's information they're not going to get by reading the Raleigh News & Observer or other news outlets.
How can we build a healthier relationship between AI and our readers?
Disclosure is important to us. The other thing is, we have fun. One of the things we do is use our AI image generator to create funny story mashups. We take some AI breaking news stories and tell our image generator, "Make something that represents these stories." For example, a section of NC 751 was renamed Coach K Highway, so we asked the AI to create an image of Coach K Highway with people dribbling basketballs all over the road. Of course, that's not what Coach K Highway actually looks like, but it's a fun image that hints at what AI can do and at the limitations it still has.
What concerns do you have about this whole project?
I think a big concern with AI and journalism is the hallucination part. Can we be sure that what we're putting out is factual? That's why we emphasize the need for human editors to review all of our AI-created articles. And in terms of disclosure, we tell our readers that the article was created by AI and reviewed by a human editor. That's an important part of communicating to our readers that they're getting the human touch, even if the article is AI-generated.
