
According to one of the UK's industry-leading technology practitioners, audiences will decide how the production community responds to the use of AI.
Speaking on the "Regulation, Ethics, and AI-ah!" panel at the Banff World Media Festival earlier this week, Benjamin Field, Deep Fusion Films boss, producer and ethical AI expert, said the growth of the creator economy and social video platforms (which can operate outside industry-specific regulations and guardrails) is letting consumers decide how the technology is used.
"The audience dictates how we respond [to AI-infused programming]. We like to think that we are in control of what people see. I don't think we are anymore," Field told the panel.
"Whatever regulation comes, we will need to adapt to it. We are at a point where it's adapt or die, and we need to be able to self-regulate while still responding to audience trends."
Keeping pace
Field's comments came during a panel that also included the Authors Guild's AI chief Devlin Karlson Smith, Dark Slope executive chair Raja Khanna and Writers Guild of Canada executive director Victoria Shen.

Shen, who has vehemently advocated for regulations and guardrails to protect writers, criticized companies that have used the lack of regulation to "justify theft" of existing creative work.
She said that producers looking to monetize and protect their content "want to engage writers who don't use AI," but agreed that the pace at which content grows on social video platforms has changed viewing habits, and that this will spark conversations, especially as animation and children's programming production is being hit hard in Canada.
"We're losing eyeballs, especially in the younger generation, as an industry that competes with YouTube and TikTok," she said. "We have to think about the type of content we're creating for the younger generation. What they're used to seeing and what they want to see has changed."
Shen also pointed out that not all of the debate is speculative, because seemingly effective and reasonable uses of AI, such as reducing costs, have already had negative knock-on effects.
"What if you don't need as much money to create content?" she asked. "If there's then a constant expectation of creating the same content with the same or less money [that's an issue]. That's what I've seen in production: we reduced our costs, so now they pay less."
Defining appropriate use

Appropriate use of AI will remain a key area of discussion for Khanna, whose Dark Slope is a virtual production and gaming company. He said the train had "left the station" on regulation, because no one can regulate content on the internet or dictate who uses AI.
"In our bubble and in our industry, what are we going to do about it?" he asked. "For me, it comes down to each company deciding where it wants to play in terms of what work is worthwhile from a creative, story and narrative perspective.
"Do you want to be [creators] pumping out AI-generated TikTok content, or do you want to do something with more craft, story and art? Regulators are not going to dictate which part of the spectrum [you want to be in]; you need to decide."
Field, who has written a policy paper on AI and shares concerns about government plans for the technology, argued that the key is to ensure creative communities are involved and stay up to date with the technology.
"People tend to lump AI together as a binary thing, but AI has good and bad actors. It's up to the individual to work out who they are," he said.
"The industry is moving faster than ever and, yes, it can feel like a lot, but our creators and creative economy don't have to feel cut off and alone."
