Media and entertainment are undergoing structural changes due to audience fragmentation, new content formats, and the growing role of AI in production workflows. Traditional broadcast models built around scheduled programming and horizontal 16:9 viewing no longer reflect how viewers consume content.
Platforms like TikTok, Instagram Reels, and YouTube Shorts are normalizing vertical video as a primary format, especially among younger audiences. For organizations like Fox Corp., this creates the dual challenge of delivering content to multiple platforms while managing cost and operational complexity.
At the same time, AI is more than an experiment. The real value now lies in inference: AI operating in real time within a production environment, shaping output as content is created.
Expand your live production across formats
Live sports production, in particular, is one of the most complex areas of media. Content needs to be captured, formatted, and delivered instantly across multiple channels.
Previously, adapting a 16:9 broadcast feed to a vertical format required a dedicated team, manual framing, and post-event editing. This approach makes it difficult to meet audience expectations for real-time highlights and engagement during live events.
Fox addressed this issue by building a dedicated “vertical control room” to support social content. While effective, this model was labor-intensive and limited how much storytelling the team could take on.
Across the industry, pressure points are consistent.
- Delays reduce engagement
- Manual workflows limit scale
- Fragmented processes reduce revenue
Meeting demand now requires real-time transformation built directly into the production pipeline.
Incorporate AI into your workflow
Amazon Web Services Inc. and Fox’s collaboration focused on integrating AI inference into live production rather than adding it as a separate step.
The goals are:
- Generate multiple formats from a single live stream
- Identify key moments in real time
- Enable instant distribution across platforms
- Reduce manual effort while maintaining editorial control
This approach positions AI as a set of support agents that handle repetitive tasks, freeing production teams to focus on creative decision-making.
How Elemental Inference works
AWS introduced Elemental Inference as an extension of its Elemental media services, integrating directly into tools such as MediaLive. Inference becomes part of the operational stack rather than a standalone system.
The main features are:
- Real-time vertical transformation: Dynamically reframe live video by tracking subjects and movement to create a natural vertical viewing experience
- Live highlight detection: Identify critical moments as they occur, enabling rapid clipping and publishing
- Parallel output: Generate horizontal and vertical formats simultaneously, reducing turnaround time
- Integrated delivery: Support editing and publishing during live events to drive instant engagement
Although sports is the primary use case, the same model can also be applied to news and talk shows through speaker and scene detection.
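The real-time vertical transformation described above boils down to continuously choosing a 9:16 crop window that follows the tracked subject. A minimal sketch of that framing math (the function name and defaults are illustrative, not part of any Elemental Inference API):

```python
def vertical_crop_window(subject_x: int, frame_w: int = 1920,
                         frame_h: int = 1080, aspect: float = 9 / 16) -> tuple:
    """Return (left, top, width, height) for a vertical crop of a 16:9 frame.

    The crop keeps the full frame height, centers on the tracked subject's
    horizontal position, and clamps to the frame edges so the window never
    leaves the picture.
    """
    crop_w = round(frame_h * aspect)             # 9:16 width for the frame height
    left = subject_x - crop_w // 2               # center window on the subject
    left = max(0, min(left, frame_w - crop_w))   # clamp to frame bounds
    return left, 0, crop_w, frame_h
```

In a live pipeline, logic like this would run per frame, with the subject position smoothed over time to avoid jittery reframing.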
From AI efficiency to profit
Elemental Inference improves operational efficiency by reducing manual workflows and speeding time to publish. More importantly, it increases engagement by enabling real-time content distribution across platforms.
It will also change the way teams work. By offloading repetitive tasks, production staff can focus on storytelling and audience engagement. At the same time, tight integration of content and audience data creates a feedback loop of creation → distribution → analysis → real-time optimization.
Bigger changes
At NAB 2026, inference has emerged as a foundational layer for media production. AI is no longer an add-on. It’s becoming embedded infrastructure.
For media organizations, the implications are clear.
- Inference is becoming the control layer for live content
- Speed and format flexibility have become competitive requirements
- AI-driven workflows enable both efficiency and creative scale
This pattern extends beyond media. As AI is integrated into real-world systems and workflows, its value is realized across a variety of industries.
For more insights, visit theCUBEresearch.com.
Support our mission of keeping content open and free by joining theCUBE community. Join theCUBE’s Alumni Trust Network, a place where technology leaders connect, share intelligence, and create opportunities.
- Over 15 million viewers of theCUBE videos, powering conversations across AI, cloud, cybersecurity, and more
- 11.4k+ theCUBE Alumni — Connect with over 11,400 technology and business leaders who are shaping the future through our trusted, unique network.
About SiliconANGLE Media
Founded by technology visionaries John Furrier and Dave Vellante, SiliconANGLE Media has built a dynamic ecosystem of industry-leading digital media brands that reach more than 15 million elite technology professionals. Our new, proprietary theCUBE AI Video Cloud leverages theCUBEai.com neural networks to deliver breakthrough advances in audience interaction, helping technology companies make data-driven decisions and stay at the forefront of industry conversations.
