Intel's 'Software Defined Broadcasting' helps OBS meet growing digital demands
The 2024 Paris Olympics is shaping up to be the latest global sporting event to prove that the future of live event production lies not only in traditional linear needs, but also in growing digital demands. Speaking last month at NAB 2024, Yannis Exarchos, CEO of Olympic Broadcasting Services (OBS), noted that OBS will broadcast approximately 3,400 hours of competition, yet nearly three times that amount of content will be produced. “About 11,000 hours of content, or 450 days' worth of content, will be produced in just 17 days. Why? To feed the insatiable digital beast.”
OBS expects broadcasters to release more than 500,000 hours of content and about half of the world's population to experience the Olympics in some form. Content of every type will be produced at extremely high quality for traditional television, streaming, digital, social, and other mediums. All of it is meant to serve niche audiences underserved in far-flung areas, as well as younger generations.
“If you're talking about young people, demographics, the proliferation of digital media, obviously you have to look at it from a different perspective,” Exarchos said. “The Olympics have a big problem: scale.”
He described the engineering feats to be accomplished at this summer's Paris Games: 450 equipment racks, 50 broadcast vans, and 70 galleries installed across 45 venues in a very demanding configuration. Those galleries must produce highlights for thousands of events, cut within seconds, or even while an event is still underway.
“In collaboration with Intel, we've been using their Geti AI platform,” he said. “It's not about saving labor; we simply don't have the resources to cut this many highlights at so many events. We couldn't do that before because we didn't have the people.”
Yannis Exarchos, Olympic Broadcasting Services: “It's not just about efficiency. It's about providing new opportunities for the creative part of our industry.”
Exarchos also touched on volumetric replay, the 360-degree, “Matrix”-like playback capability. The ability to quickly switch to these types of highlights for instant use is only possible through efficient, low-latency IP transmission.
Across four events at three venues, all live production, including 8K workflows, as well as post-production in connected facilities, will run on Intel Xeon Scalable processors, off-the-shelf hardware, and what Intel calls Software Defined Broadcasting.
“We are using Intel processors to transform broadcast,” Exarchos said. “Ultra-fast, ultra-low-latency content transmission out of the venues improves the fan experience by enabling real-time content production and highlight creation.”
“And I would like to say something more,” he continued. “We're allowing storytelling to become more active, more interesting, more creative, and to go in different directions. It's not just about efficiency; it's about providing new opportunities for the creative part of our industry and doing things that were previously impossible. In fact, we are expanding our creative possibilities many times over.”
Software Defined Broadcasting is Intel's “Polaris Point”
Much of what Exarchos detailed on the content side was outlined from a solution perspective by Nagesh Puppala, general manager of edge/cloud video at Intel. Puppala laid out the infrastructure framework for an open-source virtual outside broadcast (OB) van built on off-the-shelf hardware, a solution codenamed “Polaris Point.”
Nagesh Puppala from Intel discussed the Polaris Point solution framework.
Polaris Point has three core parts, Puppala said. “It is an open-source implementation of ST 2110 combined with Kubernetes and Helm [software], running on standard IT servers. It includes the Intel Media Transport Library, an open-source implementation of ST 2110; Media Communications Mesh, a media-optimized open-source microservice for low-latency communications; and the JPEG XS codec, which was recently open sourced.”
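To make the shape of such a stack concrete, here is a rough sketch of what a Helm values file for a virtualized OB node might look like. Every chart key, image name, and value below is an illustrative assumption for this sketch, not Intel's actual Polaris Point configuration:

```yaml
# Hypothetical Helm values for a "virtual OB van" node.
# Names and values are illustrative assumptions only.
mediaTransport:
  # Intel Media Transport Library (IMTL): open-source ST 2110 transport
  image: example.org/imtl-node:latest
  network:
    dedicatedNic: true   # ST 2110 flows typically ride a dedicated, PTP-synced NIC
    dpdkEnabled: true    # kernel-bypass packet I/O for low latency
mediaMesh:
  # Media Communications Mesh: low-latency media microservice layer
  replicas: 3
codec:
  # JPEG XS: low-latency, visually lossless mezzanine compression
  type: jpeg-xs
  bitrateMbps: 800
```

The point of declaring the stack this way is that Kubernetes and Helm, rather than bespoke broadcast hardware, handle deployment and scaling of the media services on standard IT servers.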
In Paris, the solution will handle the provisioning, orchestration, and lifecycle management of the virtual OB van architecture. Best of all, it uses the same media-enabled infrastructure across both live production and post-production workflows, regardless of location. “The same architecture used on the production side is also used for post-production activities at the International Broadcasting Center,” Puppala said. “This provides dramatic efficiencies in physical engineering and flexibility in the deployment of human resources.”
Exarchos framed much of his discussion in historical context. “When I talked about cloud in 2018, many of my colleagues raised their eyebrows,” he noted. “They asked: Up in the clouds? High-definition, 4K video, live, via the cloud? Well, here's the latest: 40% of Paris's international transmissions will be done in the cloud.”
