The 2024 Media Technology Summit
About Us
The Media Technology Summit is SMPTE’s annual conference focusing on media innovations, solutions, and technologies. One of the few peer-reviewed, non-commercial conferences of its kind, the Summit offers four days of programming, two days of exhibits, a slew of networking events, and even several tours. Some of the most influential members of the industry attend the Summit, including C-suite executives from the largest companies in the world. The Media Technology Summit is the perfect way to expand your network, learn something new, and advance your media technology career.
Featured Sessions
Monday, October 21, 2024
3:00 PM - 4:30 PM | This 90-minute session dives deep into the critical aspects of color rendition in digital cameras and LED lighting, essential for accurate color imaging in the entertainment industry. Attendees will explore how various digital cameras and LED lights can be effectively tested, documented, and aligned to ensure accurate color reproduction. The session will address the impact of human psychology and diverse skin tones on color representation. Key topics include understanding varying color spectrums, creative exposure decisions, differences between tungsten and LED lighting, the significance of the HS Scope, chroma responses, and the importance of a calibrated color reference chart. This session is ideal for professionals seeking to enhance their technical skills and ensure consistent, high-quality visual output across various media platforms.
3:30 PM - 4:00 PM | Explore the shift from fixed-function appliances to general-purpose COTS hardware in live media production, emphasizing the flexibility and scalability of software-driven workflows. This track delves into optimizing software performance on COTS systems to maximize media processing density, reducing costs in data centers and cloud services. Learn about offloading media processing to GPUs and high-bandwidth NICs, and discover a novel approach to incrementally process SMPTE ST 2110-20 streams as frames are received. The session highlights how this method lowers latency and increases the number of streams per system, enhancing overall efficiency in live media workflows.
5:30 PM - 6:00 PM | Software and IP technologies have revolutionized live event broadcasting over the past decade. However, evolving media consumption habits have outstripped the capabilities of traditional hardware-centric broadcast infrastructures. Broadcasters face increased market fragmentation and pressure to produce more content with fewer resources. IT and cloud computing innovations offer promising solutions, but transitioning from hardware to IT-based architectures poses challenges. Unlike broadcast's clock-driven real-time synchronization, IT equipment operates asynchronously, necessitating a shift in how live video is managed. This paper explores a new paradigm in software-based broadcast infrastructure, capable of bridging broadcast and IT domains while meeting high broadcast standards. It will cover: synchronous vs. asynchronous operations; system architecture, including framework design, media microservices, timing, and application control; empirical data showing time savings from asynchronous processing; and benefits for live production, such as scalability, reliability, agility, and composability.
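The incremental-processing idea in the 3:30 PM ST 2110-20 session, namely handling scanlines as packets arrive instead of buffering whole frames, can be sketched in a few lines of plain Python. The packet structure and function names below are purely illustrative assumptions for this sketch; a real ST 2110-20 receiver parses RTP headers and sample-row (SRD) layouts, none of which is modeled here.

```python
# Illustrative sketch (not a real SMPTE 2110 implementation): process each
# scanline as soon as its packet arrives, rather than waiting for the frame.
from dataclasses import dataclass

@dataclass
class Packet:
    line_no: int           # scanline carried by this (hypothetical) packet
    payload: bytes         # pixel data for that line
    frame_complete: bool   # stands in for the RTP marker bit (end of frame)

def process_line(line_no: int, payload: bytes) -> bytes:
    """Per-line work (e.g., pixel-format conversion); here a toy transform."""
    return bytes(b ^ 0xFF for b in payload)

def incremental_receive(packets):
    """Process scanlines as packets arrive, yielding each completed frame."""
    frame = {}
    for pkt in packets:
        frame[pkt.line_no] = process_line(pkt.line_no, pkt.payload)
        if pkt.frame_complete:
            yield [frame[n] for n in sorted(frame)]
            frame = {}

# Simulated 3-line frame arriving one packet per line:
pkts = [Packet(0, b"\x00", False), Packet(1, b"\x0F", False), Packet(2, b"\xFF", True)]
frames = list(incremental_receive(pkts))
```

In a production system the per-line work would typically be offloaded to a GPU or NIC; the point of the sketch is only that latency drops because the first lines are processed before the last packet of the frame lands.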
Tuesday, October 22, 2024
3:00 PM - 3:30 PM | This panel discussion addresses the evolving landscape of color grading and color science, focusing on preserving the craft's richness amid contemporary homogenization trends. As visual styles increasingly converge, there's a risk of losing the nuanced diversity and historical context that have long defined color grading. The panel will explore strategies to bridge generational gaps, emphasizing education, mentorship, and interdisciplinary collaboration to maintain and enhance the unique visual qualities of color grading from film to digital formats. Through anecdotal cases and theoretical color science pipelines, the discussion will highlight how blending technological advancements with traditional aesthetic principles can sustain visual distinctiveness. Attendees will gain insights into creating color grading practices that balance innovation with heritage, ensuring that future films and television shows retain their unique visual impact. The goal is to empower colorists and color scientists to craft compelling visual experiences that resonate with audiences while honoring the craft's legacy.
5:00 PM - 5:30 PM | This case study explores the transition of the Eurovision Song Contest's video system from traditional SDI to the SMPTE ST 2110 standard, showcasing its advantages for large-scale live events. Eurovision, a hybrid of arena-sized live performance and broadcast show, upgraded to ST 2110 over a 100 Gb dual-redundant backbone. The transition addressed challenges unique to its production, such as managing vast amounts of content for 37 delegations and transporting it to LED screens and display devices. ST 2110's robust network-based infrastructure supports large-scale, high-redundancy systems, reducing potential delays and points of failure compared to SDI. This study highlights Eurovision’s successful migration, illustrating how other live events can leverage ST 2110 for enhanced performance and reliability. Attendees will gain insights into the benefits of network-based solutions for large LED display systems and how to apply these learnings to future productions.
5:30 PM - 6:00 PM | Broadcasters are increasingly adopting IP infrastructure to enhance live sports coverage with HDR video and immersive audio. The publication of ST 2110-41 (Fast Metadata Framework) has streamlined real-time IP-based metadata workflows, supporting Dolby Vision and Dolby Atmos in live linear services. Ateme, Dolby, and Jünger Audio are leveraging ST 2110-based systems to enable end-to-end metadata communication, driving Dolby Vision, Atmos, and AC-4 distribution. ST 2110-41 facilitates the transmission of HDR video and Next Generation Audio (NGA) metadata through S-ADM (serialized Audio Definition Model) from live sources to encoders, automating workflows for ATSC 1.0/DVB and advanced ATSC 3.0/DVB-T2 use cases. France TV will use this during the 2024 Olympics, and major European broadcasters will soon offer UHD Dolby Vision and Atmos services. In North America, ATSC 3.0 services will also benefit. This integration ensures precise HDR video and customizable audio experiences, simplifies UHD signal flow, and enhances scalability.
Wednesday, October 23, 2024
3:00 PM - 3:30 PM | Sky Sports launched its live HDR service for English Premier League football in August 2022, expanding to over 6,000 hours of UHD HDR sport production across multiple platforms. This paper explores the end-to-end considerations for integrating live HDR while maintaining HD delivery. It details the choice of Hybrid Log-Gamma (HLG) for HDR broadcasts and how Sky uses a single UHD and HD workflow to deliver HDR and Wide Color Gamut (WCG) benefits cost-effectively. The paper examines the impact of HDR and WCG across different sports, the necessity of UHD for HDR, and the operational demands, including training for engineers. Key areas covered include the design of workflows for editing, archiving, and encoding, as well as LUT-based conversions and broadcast graphics workflows. Additionally, the paper discusses critical viewing and management of end devices in assessment environments.
5:00 PM - 5:30 PM | Vector-based image search technology, which encodes images and queries as vectors in a high-dimensional space, is increasingly used for efficient image retrieval. This approach involves several key steps: extracting embeddings through convolutional neural networks, developing an indexing workflow that includes preprocessing, extracting, and compressing these embeddings, and inserting them into a searchable index. The search engine utilizes approximate nearest neighbor search (ANNS) and compression methods to enhance efficiency. Traditionally, search engine optimization for images relies on keywords either embedded in metadata or placed on hosting pages. However, vector-based searches face challenges in adaptively enhancing image discoverability due to the static nature of image vectors. This paper introduces a novel method to optimize image discoverability for vector-based search engines while minimizing visual impact. The approach frames the problem as an optimization task, using an iterative process to adjust images based on intended and unwanted search queries. It employs backward propagation with loss functions to fine-tune images, generating embeddings that align with target queries and minimizing visual deviations through perceptual loss. Segmentation masks can further direct visual adjustments to specific areas of an image. This technique can be applied to newly uploaded images or existing asset libraries to improve search ranking.
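The optimization framing in the 5:00 PM session can be illustrated with a toy gradient-descent loop: treat the (flattened) image as the free variable, push its embedding toward a target query embedding, and penalize pixel deviation. The linear "encoder" below is a stand-in for the CNN embeddings the paper uses, and the plain pixel-distance term stands in for its perceptual loss; all names, dimensions, and coefficients here are illustrative assumptions, not the authors' actual method.

```python
# Toy sketch of image-discoverability optimization: adjust an image so its
# embedding aligns with a query embedding, while a deviation penalty (a crude
# stand-in for perceptual loss) keeps the image visually close to the original.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))   # toy linear "encoder": 16-pixel image -> 8-dim embedding
x0 = rng.random(16)                # original image, flattened
q = rng.standard_normal(8)         # embedding of the intended search query
q /= np.linalg.norm(q)

def cos_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

lam, lr = 0.5, 0.01
x = x0.copy()
for _ in range(200):
    # loss = -q . (W x) + lam * ||x - x0||^2  (alignment term + deviation penalty)
    grad = -W.T @ q + 2 * lam * (x - x0)     # analytic gradient of that loss
    x -= lr * grad

before = cos_sim(W @ x0, q)   # similarity of the original image to the query
after = cos_sim(W @ x, q)     # similarity after optimization
```

With a real CNN the gradient would come from backpropagation rather than a closed form, and `lam` would weight a perceptual metric; the mechanics of trading alignment against visual deviation are the same.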
Keynote Speaker: A Conversation with Lawrence Sher on His Film Career and ShotDeck's Impact on the Industry
Lawrence Sher, ASC
Lawrence Sher is an award-winning cinematographer and director and the founder of the popular filmmaking tool ShotDeck. As a cinematographer, Mr. Sher has shot more than 35 feature films, numerous pilots, and hundreds of commercials and music videos over his 30 years in the entertainment industry. Known for films such as The Hangover trilogy, Garden State, Due Date, and most recently the Joker films, his work has grossed over $4 billion at the worldwide box office. For Joker, his cinematography was nominated for an Oscar and a BAFTA and won the Golden Frog at the Camerimage Film Festival in Poland. As a director, he helped launch the Peacock original series Rutherford Falls, directing multiple episodes including the pilot, and directed the 2017 feature Father Figures, which starred Owen Wilson, Ed Helms, Glenn Close, and J.K. Simmons. Most recently, as cinematographer, he completed the Joker follow-up Joker: Folie à Deux, which arrives in theaters October 4th, and Maggie Gyllenhaal's The Bride, starring Christian Bale and Jessie Buckley, which comes out in fall 2025.
Moderated by: Carolyn Giardina
Carolyn Giardina is senior entertainment technology and crafts editor at Variety and Variety Intelligence Platform (VIP+).
She has devoted her career to covering the art and science of entertainment, bringing to her work a wealth of knowledge of entertainment technology and the creative arts, as well as film history. She has covered such industry inflection points as the DTV and digital cinema transitions; entertainment technology, including production, post and exhibition; the creative arts including cinematography, editing, and sound; and related topics such as labor issues.
Carolyn has been honored with American Cinema Editors’ Robert Wise Award for journalistic contributions to film editing; the International Cinematographers Guild’s Technicolor William A. Fraker Award for journalistic contributions to cinematography; and the Advanced Imaging Society’s Distinguished Leadership Lumiere Award.
She joined Variety and Variety VIP+ in February, following more than a decade at The Hollywood Reporter. Earlier, she worked in the U.S. and abroad on titles including SHOOT, Film & Video, and the SMPTE Motion Imaging Journal. She also co-authored a book about stereoscopic 3D filmmaking, titled “Exploring 3D” (Focal Press, 2013).
Sponsors
Media Partners