Journalists Face an Identity Crisis With AI

Will generative AI replace journalists? Many of those in the
business think
so, if preventative measures are not taken. Others say it’s a
tool for brainstorming and increased productivity. In June,
Insider listed media jobs (including journalism) among the top 10
fields that may soon be replaced by AI, emphasizing that
mid-career, mid-ability workers will be the first on the chopping
block.

There’s no question that large language model-based chatbots like
ChatGPT, while notoriously
imperfect in their fact-checking, can write more quickly
than a journalist with a beating heart. Humans have limitations
in the form of attention bandwidth, processing speed and fingers.
In the time it has taken to formulate and type the previous
sentence, a well-worded prompt to ChatGPT could have generated a
less biased article with a vastly broader frame of reference, if
the software's training data were up to date (it's
currently limited to September 2021). Once updated, chatbots can
mine data from this very article to generate an arguably better
one, using this exact writing style if prompted. 

But perhaps the most startling revelation is that citations are
not required. Copyright law currently allows the scraping of
massive amounts of data from publication websites under the
umbrella of "fair use," which lets readers bypass paywalls and
diverts traffic away from websites that rely on clicks for
advertising revenue.

“There’s obviously no law that makes anyone cite their sources.
But readers usually won’t trust information that doesn’t have
adequate attribution,” says Phillip Reese, data specialist at The
Sacramento Bee and associate professor of journalism at
Sacramento State University. “It’s well known by now that
chatbots hallucinate. I don’t think anyone would or should trust
LLM (large language model) generated news articles at this
moment. But AI will get better, likely as AI companies pay news
publishers to scan reported, sourced content.”

Publishers are pivoting

Some media mammoths, including IAC and News Corp, the Rupert
Murdoch-owned parent of the Wall Street Journal, are
seeking to change copyright law "if necessary" and to
litigate against those who recycle their content without
permission. Others are striking deals. Last month, the Associated
Press, the world’s largest news gathering
organization, reached
an agreement with ChatGPT developer OpenAI to “share
access to select news content and technology as they examine
potential use cases for generative AI in news products and
services.” An AP article describes the collaboration as part of
nearly a decade-long effort to “use automation to make its
journalism more effective, as well as help local news outlets
integrate the technology into their operations.” 

The agreement demonstrates that publishers do have leverage.
Chatbots rely on data from reputable sources like news outlets to
improve their own accuracy, Reese says. “I don’t think news
publishers will stand by as chatbots crawl their sites and take
the spoils of their original reporting. I expect publishers will
push for agreements with technology companies that will share
revenue from stories generated by AI,” he adds. 

But will this really “save” the journalist if AI is the one
writing the article, or will the print media landscape become a
barren rehashing of bot-generated content? The AP says
it does
not use generative AI in its stories, but as data
accuracy improves through collaboration, it isn’t hard to imagine
a world where most news articles do. In fact, we’re almost there.
We’ve all seen the “slightly off” social media post or read the
Buzzfeed listicle that may or may not have been written
by a robot. With this kind of click-baity content, the reader
is often none the wiser — it’s the would-be human writer who pays
the price. (Buzzfeed laid
off 12% of its workforce in 2022.) At higher levels of
professional writing, editors are
dealing with opportunist hacks clogging their inboxes
with poorly written computer-generated content. 

But as chatbots continue to learn and to write better, the talent
gap between human and bot will inevitably narrow and perhaps one
day close. Journalists not only face a crisis of utility but also
a tsunami of seemingly infinite content through which they must
distinguish themselves. This raises the most existential
questions: What does a journalist do, and how can a human do that
better than a robot?

Back to basics

Before generative AI, the rise of the internet rocked journalism.
Not only did it upend the business, forcing print publications to
fold or find new ways to monetize; it also expanded the breadth
of online "content" writers could produce. Ironically, these
roles will be the first to go.

Traditional, boots-on-the-ground journalists are far more
difficult to replace. Chatbots can’t wear boots, at least not
yet. A good reporter exists in the field and on the phone,
piecing together stories long before they’re written and coaxing
reluctant sources to go on the record, then asking the right
questions, the right way. 

According to the oracle itself, ChatGPT, “Jobs that are least
likely to be replaced by AI are those that involve complex human
interactions, creativity, emotional intelligence, physical
dexterity, and high-level decision-making. These types of jobs
require a level of human intuition, empathy, and adaptability
that AI and automation currently struggle to replicate.”

In journalism, there is also the element of human drive. Most
journalists are motivated by their curiosity, a perceived
injustice, a moral imperative to report the facts or even a
competitive spirit. Robots can’t know the feelings that arise
while interviewing the mother of a murder victim or a player on
the winning team, and that matters. 

Good journalism is defined by famed Watergate scandal
investigative reporter Carl Bernstein as a medium that “should
challenge people, not just mindlessly amuse them,” and by TV news
pioneer John Chancellor as “to take information and add value to
it.” It is difficult to challenge and to add value without
tasting the multiplex organic salad of sensory input and human
behavior.

“AI can’t go into the field and do reporting, which is how great
journalism happens. So I hope this puts renewed focus on news
gathering,” says Reese. “I do think the days of human journalists
sitting at a computer and writing eight stories a day based on
aggregation of other people’s original reporting are nearing an
end. AI will do that work soon. I hope those journalists are
given the chance to use their skills to go out into the world and
do original reporting.”

It may also be wise for journalists to incorporate AI tools into
their work to boost efficiency and become more competitive. And
they should probably lean into their editing skills, too. 

But for now, Reese wouldn’t trust chatbots with anything more
than cleaning up grammar. 

“I’m sure it will get more reliable,” he adds. “It’s not hard to
imagine a reporter recording interviews, feeding them into AI and
getting a decent story template that they can build upon, but we
aren’t there yet. If and when we get there, we need to hold to
basic journalism practices; namely, citing our sources and making
it clear when and how AI is used.”

