A hybrid, multicloud approach is needed to meet broadcasters' goals: commissioning and evolving scalable software-based broadcast systems composed of elements from diverse vendors, running on Commercial Off-The-Shelf (COTS) hardware, and avoiding lock-in both at the facility and in the cloud. These software elements need to support common control, integration with enterprise management systems, and a sufficient level of automation.
To that end, broadcast vendors’ software needs to run everywhere; use standards and open specifications for control and transport interoperability; support fine-grained resource allocation; and leverage IT best practice, e.g., for monitoring, security, and orchestration. We propose an open-source container orchestration platform, virtualized infrastructure layers, and common software APIs for control interoperability, allowing vendors to focus where their value and revenue lie: the application and user experience (UX).
As a case study, we construct a scalable edge platform for transcoding, AI inference, and other video and audio processing that can reduce the cost, latency, and power footprint of cloud-based media production.
Our multi-architecture containerized applications are deployed and managed with Kubernetes, which provides fine-grained allocation of hardware resources, including Graphics Processing Units (GPUs) and ST 2110-capable network interface controllers (NICs). Service discovery and connection management are achieved using the Networked Media Open Specifications (NMOS). The example video and audio processing pipelines are based on the GStreamer open-source multimedia framework, leveraging the high-performance capabilities of the GPU and NIC. Prospective services include transcoding, video and/or audio clean-up, super resolution, automatic closed captioning, content moderation, object identification, and compositing of chat or data feeds.
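As a minimal sketch of the fine-grained resource allocation described above, a Kubernetes Pod can request a GPU and NIC capacity through extended resources exposed by device plugins. The resource names, image, and Pod name below are assumptions for illustration; the exact names depend on which device plugins are deployed in the cluster.

```yaml
# Sketch: a media-processing Pod requesting one GPU and one SR-IOV virtual
# function on an ST 2110-capable NIC via Kubernetes extended resources.
# Resource names are assumptions that vary with the installed device plugins.
apiVersion: v1
kind: Pod
metadata:
  name: gst-transcoder                        # hypothetical workload name
spec:
  containers:
  - name: transcode
    image: registry.example.com/gst-transcode # hypothetical multi-arch image
    resources:
      limits:
        nvidia.com/gpu: 1                     # one GPU via the NVIDIA device plugin
        intel.com/intel_sriov_netdevice: 1    # one NIC VF (plugin-dependent name)
```

The scheduler places the Pod only on a node advertising both resources, so GPU- and NIC-accelerated pipeline elements land on suitably equipped COTS hardware without manual pinning.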
Keywords: Software-based broadcast, Hybrid, Multicloud, Edge, Virtualization, Containers, Kubernetes, GPU, NIC, DPU, ST 2110, AMWA NMOS.