Image-Based Lighting in practice - current limitations and future capabilities
Date & Time
Tuesday, October 25, 2022, 3:30 PM - 4:00 PM
The widespread adoption of Virtual Production workflows in general, and of ICVFX (In-Camera Visual Effects) with LED video screens in particular, undoubtedly represents one of the biggest technological advancements in film and television production in recent years. With color management workflows and frameworks such as ACES and OCIO, and with underlying infrastructure providing mechanisms for metadata exchange between devices, we can gain full colorimetric control over our entire video/real-time pipeline and guarantee that a scene is rendered on an LED video screen with the colors and brightness levels its creator intended.

However, Virtual Production is not based on LED video screens alone. Image-based lighting (IBL), a computer graphics lighting method well understood in visual effects production, has now evolved into a principal photography lighting workflow in which the LED video screens are augmented with traditional lighting fixtures, and the video content driving the LED video screens also drives those fixtures. The advantages of this workflow are many, but although both lighting fixtures and media servers are becoming increasingly intelligent, the infrastructure that currently makes IBL practical still largely depends on archaic lighting data communication protocols that cannot carry the color data essential to fully robust IBL implementations. Today's lighting fixtures can produce accurate, beautiful, high-quality light for magnificent skin tones, but these protocols must evolve to communicate all the color and device parameters needed to do so: video colorimetry (color gamut, white point, and electro-optical transfer functions), spectral definitions for colorimetry, beam characteristics and orientation, and a host of other advanced fixture control parameters.
In this paper, we will review and evaluate existing control protocols and data exchange formats, and propose a unified, device-independent control and metadata infrastructure for both video and lighting systems that will generate a predictable outcome regardless of manufacturer, device type, color space, media source, and control system.
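As a rough illustration of the kind of metadata such a unified infrastructure might carry, the following Python sketch groups the parameters listed above (colorimetry, spectral data, beam characteristics, and orientation) into a single device-independent payload. All field and class names here are hypothetical, chosen for this example only; they are not drawn from DMX512, sACN, or any other existing protocol or standard.

```python
from dataclasses import dataclass, field

@dataclass
class Colorimetry:
    """Video colorimetry parameters: gamut, white point, and EOTF."""
    primaries_xy: dict       # CIE xy chromaticities of R/G/B primaries
    white_point_xy: tuple    # CIE xy white point, e.g. (0.3127, 0.3290) for D65
    eotf: str                # electro-optical transfer function tag, e.g. "PQ"

@dataclass
class FixtureState:
    """Hypothetical device-independent fixture payload (illustrative only)."""
    fixture_id: str
    colorimetry: Colorimetry
    spectral_samples: list = field(default_factory=list)  # optional SPD samples
    beam_angle_deg: float = 0.0
    pan_deg: float = 0.0
    tilt_deg: float = 0.0

# Example: a fixture driven by BT.2020/PQ video content
state = FixtureState(
    fixture_id="fixture-001",
    colorimetry=Colorimetry(
        primaries_xy={"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)},
        white_point_xy=(0.3127, 0.3290),
        eotf="PQ",
    ),
    beam_angle_deg=25.0,
)
```

A payload along these lines could be serialized and exchanged between media servers, lighting consoles, and fixtures so that each device interprets color identically, independent of manufacturer.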