This is how it’s done – The Hollywood Reporter



Creators everywhere are feeling the pull of AI.

On social media, TikTokers rack up views by tailoring content to algorithms meticulously designed to trigger the release of dopamine. In Hollywood, producers land lucrative deals by developing projects that feed the black-box AI of studios and streaming platforms: viewing data built through a feedback loop in which the recommendation engine shapes viewer behavior, and that behavior in turn enriches the recommendation engine. Value creation is increasingly machine-driven, and the space for human-first innovation, from TikTok to the streamers, is narrowing.

The Writers Guild is right to seek protections against AI, but nowhere are those protections more urgently needed than in documentary and nonfiction, the fields where I have worked as both a producer and a writer.

The stakes are high, and creative careers hang in the balance. But the biggest threat the machines around us pose to the wider culture isn't the bottom-up, AI-generated art filling social media (think "Wes Anderson directs Star Wars"). It's the top-down, AI-powered platforming of art we're already seeing across the media world, where algorithms decide how stories are told, and which ones, on a global scale. In nonfiction, the picture is particularly grim.

"The danger isn't AI in documentary making, the actual production. It's curation," says Amit Dey, executive vice president of nonfiction at MRC, which is producing untitled documentaries about Sly Stone and Rudy Giuliani. "If human-made films were competing with robot-made films in the marketplace, that would be one thing. It's a whole different thing when it comes to shaping decisions about what gets bought, and when. What gets platformed, and where. Which stories get told."

Media veteran and producer Evan Shapiro, who recently keynoted MIPTV, says outsourcing accountability is a long-standing Hollywood tradition. "From dial testing to focus groups to 'my kid didn't like it,' a certain type of TV executive has long had a variety of third-party safety nets that protect him from having to make his own choices. We've been delegating decisions for a long time," Shapiro says. "These devices let executives take credit when a show does well and easily pass the blame when it doesn't. AI is just the hot excuse of the moment."

Nonetheless, AI is already hard at work at every level of filmmaking.

At XTR, the studio behind the Magic Johnson documentary They Call Me Magic for Apple TV+, CEO Bryn Mooser built a proprietary algorithm named "Rachel" to help guide the development process. He calls it a "zeitgeist machine," combing social media for trends and focusing development around those signals.

"I got a lot of shit about it," Mooser says. "Then ChatGPT came along and the world changed overnight. We always thought of it as a tool, built to be layered with historical data about what works and what doesn't in the documentary business. Its application as a tool to enhance what filmmakers can do is incredible. It's as powerful and important as ever, and I hope it's embraced."

It's true that human executives still make the final greenlight decisions at these platforms. But as the wealth and power of AI-generated data insights grow (insights proven, for better or worse, to drive audience engagement), fewer executives will die on the hill of their own human opinions. If the data shows that true crime is a surefire hit factory, why risk a more novel concept? It's human nature, and I don't blame any one of them. But with Hollywood's rampant CYA culture now leveraging AI, the executive can hide behind the machine.

Without the intervention of wise (human) executives, we as viewers are left to our basest instincts, tapping relentlessly on puppy videos. That may be fine for TikTok. But from an aesthetic standpoint, the unbridled race to maximize audience engagement is a race to the bottom. Worse, from the standpoint of journalistic ethics, in nonfiction it is a race toward ignorance and delusion.

In 2021, filmmaker Morgan Neville famously used AI to recreate Anthony Bourdain's voice in his documentary Roadrunner, and the move drew a backlash. Neville pulled only real quotes from Bourdain's actual print interviews and used deepfake technology to "bring them to life." And last year, the Netflix documentary series The Andy Warhol Diaries ventured into similar terrain when it recreated Warhol's voice for the narration. Controversies like these will only grow more incendiary in 2023, as AI technology advances to the point where entirely fake audio, video or photos are indistinguishable from the real thing.

Much has been said about the shifting goalposts of documentary ethics these days, with or without AI as a filmmaking tool. But a more sinister force is at work, one that amounts to a widespread ethical violation: the ceding of human curation to algorithms, leveraging data to decide which projects to buy, and potentially even how to shape them, on the basis of viewer behavior. Yes, we've had focus groups and dial tests in the past. Yes, we had Nielsen data. But the process behind those insights was transparent, and there was human accountability. As the industry relegates more of these decisions to black-box AI, the technology becomes less a tool for streamlining development and maximizing profits and more the decider itself.

And I don't think we need a Black Mirror episode to sketch the horror of this scenario, particularly in documentary.

Nonfiction storytelling shapes our understanding of the real world, which makes preserving human curation more urgent in documentary than in any other genre. Hollywood has always tried to strike a balance between commercialism and artistic expression, and historically that tension produced its own brand of art for the masses. But now, more than at any point in history, our relationship with reality is at stake. A plague of disinformation is already rife on social media, driven largely by curation algorithms.

Moreover, to fulfill its duty to truth, nonfiction requires trust from its audience, a trust rooted in transparency and integrity, and one that can only be built through end-to-end human control.

Take deepfake technology: a film loses its power when viewers cannot trust the veracity of the images they see or the sounds they hear. If its integrity as a work of nonfiction cannot be relied upon, a documentary falls apart. "Joe Hunting. The Ross brothers. Jessica Beshir. They're filmmakers making a difference with their artistry," Mooser adds. "It will be a long time before AI can compete."

When it comes to accountability, the same goes for the nonfiction executive suite, the editorial role played by film executives and increasingly shaped by AI-generated data insights. With humans at the helm, audiences can at least question the motives of the studios and platforms greenlighting films, whether those motives are commercial, political, or both. When the decider is a black box, there is no one to question; the content is simply "popular."

For Josh Braun, co-president of documentary sales powerhouse Submarine, a deep-seated hunger for rule-breaking is part of what defines us as human, and it manifests as a constant appetite for something fresh and new. "That's the potential savior in the nightmare scenario. People have an instinctive reaction to things, no matter how you slice it, and that pushes the most interesting docs back to distribution companies. A24. Magnolia. IFC. Those are the places where we're seeing deals," Braun says.

And the indie market could be a bulwark. "The more esoteric titles that people want will be the ones that revitalize the theatrical market," Braun adds. "No algorithm-driven platform gives you the same level of choice."

As the industry integrates AI into every aspect of the business, the technology must remain a tool, not a replacement for human judgment and accountability. That is what the Writers Guild is pushing for right now in its standoff with producers and studios. Nonfiction storytelling is one of the few realms where trust in truth, and a shared understanding of the world, remains sacred. Ultimately, everything done under the influence of AI must be guided by principles of honesty, transparency, and respect for the dignity of the people whose stories are being told, and carried out with a strong moral compass.

But it may already be too late.
