(CNN) The future of artificial intelligence promises technology with a human touch. But are you human enough to deliver a compelling spiritual message?
Pastors and rabbis recently discovered that ChatGPT, an AI language model that can spit out passable prose from just a few prompts, isn’t so bad at creating sermons, the speeches that form the foundation of many religions’ worship services.
Historically, these speeches draw on generations of knowledge, keen textual analysis and scholarship, combined with the unique charisma and experience of each worship leader. Sermon writing is considered an art, even a divine calling.
Watching computers approximate similar works in seconds has prompted religious leaders to grapple with an interesting problem: Can AI reproduce a truly human, spiritual message? And if it can, does that mean the computers are that good, or that the human messages need some work?
What does an AI sermon sound like?
In December 2022, Rabbi Joshua Franklin delivered an unusual message during a service at the Hamptons Jewish Center in East Hampton, New York.
“I’m going to plagiarize a sermon,” he told the congregation. “And you have to guess who wrote it.”
Next, Franklin delved into a lesson on Parashat Vayigash, the portion of the Torah that tells the story of a reconciliation between brothers.
“This is a powerful lesson for all of us, as it shows the importance of willingly reaching out to others, even when it’s difficult, in order to build intimacy and connection,” he continued.
At the end of the sermon, which lasted about two minutes, Franklin revealed the true author to thunderous applause: ChatGPT.
“You are clapping,” he said. “But I’m scared!”
Franklin added that he had given the language model very specific prompts, and that its output missed details in the text that an experienced scholar would pick up on. Still, the experiment was received with half awe and half terror: Will the work of religious leaders, so closely tied to the human experience, ultimately be overshadowed by computers?
How AI language modeling works
Popular entertainment does little to assuage human fears of an AI-generated future in which computers realize consciousness, ethics, souls, and ultimately humanity. In reality, artificial intelligence tools like ChatGPT are just convincing imitations.
When CNN asked how it works, ChatGPT provided the following response:
“ChatGPT works using a deep learning algorithm called a transformer neural network, which was trained on a huge corpus of text data from the internet, books, and other sources. During training, the model learned to identify patterns in the data and use them to generate text that resembles human speech.”
“The way ChatGPT creates its responses is by using a technique called language modeling,” it continued. “Language modeling is the task of training a model to predict the probability of the next word in a sentence given the previous words.”
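To make that idea concrete, here is a minimal sketch of next-word prediction in Python. It is an illustration only, not how ChatGPT itself is built: a toy bigram model counts which words follow which in a small sample text and then samples the next word from those counts, whereas ChatGPT relies on a transformer neural network trained on vastly more data. The sample corpus is arbitrary.

```python
# A toy illustration of "language modeling": predict the next word given the
# previous words. This bigram counter is NOT how ChatGPT works internally;
# ChatGPT uses a transformer neural network trained on vastly more text.
from collections import Counter, defaultdict
import random

# A tiny sample corpus; any longer text would work the same way.
corpus = (
    "in the beginning was the word and the word was with god "
    "and the word was god"
).split()

# Count how often each word follows each preceding word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    words = list(following[prev].keys())
    weights = list(following[prev].values())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation, one predicted word at a time.
word = "the"
generated = [word]
for _ in range(8):
    word = next_word(word)
    generated.append(word)
print(" ".join(generated))
```

The same predict-one-word-at-a-time loop, scaled up enormously, is what produces text that can sound convincingly human.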
Ultimately, the response may sound human. Perhaps even human enough to satisfy a crowd expecting religious enlightenment.
Some experts say it’s still clear something is missing.
“It lacks soul.”
However, ChatGPT is “trained” on, which is to say fed, a myriad of texts, so it can reference the sacred writings in its training data in a matter of seconds and do some of the work of a scholar.
When asked what kinds of texts it would reference in its Christian sermons, ChatGPT was initially hesitant, stating that as an AI language model it “doesn’t hold personal beliefs or religious affiliations.” A slightly reworded query produced the following:
“When a language model like ChatGPT is asked to produce Christian sermons, it typically relies on its training data and knowledge of Christian theology and practice. That training data includes a wide range of Christian texts, including theological writings and books of sermons by other preachers.”
“To create Christian sermons, language models are usually given a specific topic or theme to focus on,” it continued. “The vast database of knowledge is then searched to find relevant information and examples related to the topic.”
How human sermons are crafted
Sermon writing is an important skill for most worship leaders. Effective preachers follow in the footsteps of notable religious orators such as Martin Luther King Jr. and Joel Osteen.
“These messages are what bring people in the door and contribute to the character of a particular church,” says Stu Strachan, founder of the Pastor’s Workshop, a resource for Christian preaching and teaching.
Preaching is so important, in fact, that there’s a thriving industry devoted to making it a little easier by giving worship leaders tools and resources.
“Pastoral positions are very demanding,” Strachan told CNN. “Even something as important as the sermon of the week can get lost in other assignments. Having this inspiration helps with the basics and allows the pastor to focus on putting together a message.”
However, some practices can stretch the ethics of that pursuit. Sermon ghostwriting is not uncommon, and things can get murky if the preacher doesn’t reveal where their great ideas came from.
“Ghostwriting is definitely frowned upon when it misleads the congregation, is used fraudulently, or substitutes for actual scholarship,” says Strachan.
Where AI – and humans – fall short
Proponents say AI can help shoulder some of the burden of the creative process, but many faith leaders say something is always missing from AI’s writing.
ChatGPT is free for public use, so it’s easy to test its possibilities and limitations. A basic input such as “Write a sermon about X topic” returns basic results with little detail or insight. The model returns more sophisticated results when requests include detailed themes, references to specific texts, and common narrative devices such as personal anecdotes, cultural references and quotes.
In other words, the quality of ChatGPT’s output is only as good as the prompts it receives. Even then, it has some serious drawbacks: ChatGPT and other language models may return inaccurate, incomplete or biased information, a weakness clearly stated in ChatGPT’s own interface.
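As a rough illustration of that point, the sketch below sends both a bare prompt and a detailed prompt for comparison. It assumes the openai Python package (v1 or later) and an API key in the OPENAI_API_KEY environment variable; the model name and the prompt wording are placeholders chosen for the example. The article itself describes using the free ChatGPT web interface, so this is simply one way to run the same experiment programmatically.

```python
# A minimal sketch comparing a bare prompt with a detailed one, assuming the
# openai Python package (v1+) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A basic prompt like the one described above tends to yield generic results.
basic_prompt = "Write a sermon about forgiveness."

# A detailed prompt: a theme, a specific text, and narrative devices requested.
detailed_prompt = (
    "Write a five-minute sermon on forgiveness drawing on the story of "
    "Joseph and his brothers in Genesis 45. Open with a personal anecdote "
    "about a family disagreement, reference one contemporary cultural "
    "example, and close with a direct charge to the congregation."
)

for prompt in (basic_prompt, detailed_prompt):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # model name is an assumption for this sketch
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the first few hundred characters of each result for comparison.
    print(response.choices[0].message.content[:300], "\n---")
```

Running both prompts side by side makes the difference easy to see: the detailed request returns something closer to a usable draft, while the bare one reads like boilerplate.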
Ken Sundet Jones, a professor of theology and philosophy at Grand View University in Des Moines, Iowa, summarized other shortcomings he sees in an article for the Christian nonprofit 1517.
“Even if the AI pastor ChatGPT were able to move beyond preaching as information distribution, it would still miss the preaching boat, because it only works in the realm of a hypothetical audience,” he wrote. “There is nothing special about what it produces, so it is not ‘for you.’”
Jones argues that because AI is not human, and is inherently unburdened by life and death, it cannot preach or create anything that truly nourishes the human soul.
That kind of spiritual nourishment is the result of preaching at its best. Yet like any other rhetorical art, sermon writing, even when done by a human, can be lazy, unreliable and inhuman.
One Twitter user illustrated this to interesting effect by asking ChatGPT to write a devotional full of Christian buzzwords: “Daddy God,” “Desired Prayer,” “Protective Hedge,” “God’s Tool” and the like.
The result was a chaotic salad of clichés and vague advice.
“I know AI is dangerous, because this reads like something I would have definitely heard from a Christian influencer this week,” someone jokingly replied.
Exercises like this have led some faith experts to flip the logic: AI preaching lacks an essential humanity, so if it can produce something that passes for someone’s actual message, that message probably lacks an essential humanity as well.
“Creativity requires independence, insight and imagination, all abilities that ChatGPT lacks,” writes Chaim Steinmetz in The Jewish Journal. “But ChatGPT isn’t the only one pushing clichés and deceptions; many people do as well. If our own opinions sound like ChatGPT, that’s not ChatGPT’s flaw. It’s a product of our own failures. Like a parrot, ChatGPT forces us to hear what we really sound like.”
There is another element of powerful religious rhetoric that artificial intelligence will always lack, regardless of what it can produce: the actual human being delivering the message.
“Language models like ChatGPT can provide something close to sermons, but they can’t replicate the full experience of a live sermon given by a human preacher,” ChatGPT said in a conversation about how sermons are created. “The human element of a sermon, such as vocal inflection, emotional expression and audience interaction, is an important part of the sermon experience that cannot be replicated by a language model.”