Uniting physical and virtual lighting to create compelling concert experiences

Live concerts consistently wow audiences with increasingly immersive, jaw-dropping moments. Convincing users to attend a virtual show means delivering extra visual magic that is only possible through game engine technology. For us, that means playing to the strengths of the medium - immersing users in hypnotic, evolving worlds that are grounded in reality but free of practical constraints such as physics and the budgets required for physical fixtures and FX.

At least for now, an XR concert may lack the intensity of a stack of Funktion-One subwoofers or the electrifying reaction of a crowd after a pyrotechnic blast, but we can surely compensate for those in other ways.  

In a virtual production context, this philosophy requires a suite of lighting tech that lets designers rapidly iterate on virtual cues that simultaneously drive physical fixtures on stage. This matters because, while filming against an LED wall gives the talent some ambient light, it can’t provide the plausible, high-intensity, interactive lighting that physical fixtures deliver. Virtual lighting pushes the fantastical for the pixels on screen, while physical lighting unifies the image and keeps one foot grounded in reality.

Dynamic procedural lighting in Sebastian Yatrá’s performance of Traicionera, shot at the Verizon 5G Lab in Los Angeles. Credit: Verizon

Lighting design with math

Many of the existing resources for virtual lighting design rely on recording the DMX output - the standard data protocol for lighting hardware - from a dedicated console. This is a fantastic method for accurately simulating realistic fixtures (perfect for previs!), but it pushes the design and iteration phase upstream to the console, which isn’t always desirable in a collaborative, version-controlled studio environment. It also makes adjustments difficult: once frame-by-frame DMX keys are recorded into Unreal Engine, retiming or tweaking the animations means wrangling hundreds of densely baked keyframes.

Since we were prioritizing rapid iteration, we built a system within Unreal Engine to procedurally animate large groups of lighting fixtures using template sequences. Rather than keyframes, fixture groups have custom parameters - speed, wavelength, offset, etc. - which plug into a suite of purpose-built math function libraries.

As one example, a Wave Rotation function could produce a cascading wing effect.

An early prototype of the lighting tool. 

Wave Rotation: Sin(index / offset + time * rate) * rotationAmount - displacement. Credit: MAGNOPUS
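To make that concrete, here is a minimal standalone C++ sketch of a function like Wave Rotation, evaluated per fixture index at a given time. The struct and names are illustrative stand-ins for the group parameters described above, not our actual Unreal implementation.

```cpp
#include <cmath>

// Illustrative parameter set for one wave effect. In practice these map to the
// custom parameters exposed on a fixture group (speed, wavelength, offset, etc.).
struct WaveParams {
    float offset         = 4.0f;   // spreads the phase across fixture indices
    float rate           = 1.0f;   // playback speed of the wave
    float rotationAmount = 45.0f;  // peak rotation in degrees
    float displacement   = 0.0f;   // constant rotation bias
};

// Wave Rotation: sin(index / offset + time * rate) * rotationAmount - displacement
float WaveRotation(int index, float time, const WaveParams& p) {
    return std::sin(index / p.offset + time * p.rate) * p.rotationAmount
           - p.displacement;
}
```

Sweeping the fixture index across a row of moving heads while time advances produces the staggered offsets that read as a cascading wing.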

In Unreal Engine, template sequences act as containers for generic animation data that can be applied to multiple classes. In our use case, we generated template sequences using a custom editor UI that automatically placed functions into the master level sequence. Rotation, position, dimmer (intensity), color, gobo (texture masks), and zoom (beam width, where applicable) can each be controlled separately by these sequences, in a modular fashion, to achieve unique permutations of light.
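The sketch below is a loose, engine-agnostic way to picture that modularity: each channel - rotation, dimmer, color, and so on - is driven by its own function of fixture index and time, and a fixture’s look at any moment is simply the combination of whichever channels are active. The type names are hypothetical and stand in for the template sequences themselves.

```cpp
#include <functional>
#include <map>
#include <string>

// One independently animatable channel, e.g. "Dimmer" or "PanRotation".
// Each channel maps (fixture index, time) -> a scalar value.
using ChannelFunction = std::function<float(int index, float time)>;

// A "look" is just a named set of channel functions; two looks that touch
// different channels compose without conflict.
struct LookTemplate {
    std::map<std::string, ChannelFunction> channels;
};

// Evaluate every active channel for one fixture at one point in time.
std::map<std::string, float> EvaluateFixture(const LookTemplate& look,
                                              int index, float time) {
    std::map<std::string, float> values;
    for (const auto& [name, fn] : look.channels)
        values[name] = fn(index, time);
    return values;
}
```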

Combining color, dimmer, and rotation effects. Credit: MAGNOPUS

Designers can utilize the unique traits of the templates to control the look of the scene. Overlapping sequences crossfade between each parameter of the function, giving the designer fine control over the length of transitions. Sequences can be looped, sped up, or slowed down - a configurable strobe effect across all the fixtures can be created with just two keyframes (dimmer = 1 to dimmer = 0) and a speed-controlled loop.
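Conceptually, the crossfade is as simple as the sketch below: wherever two sequences overlap, each parameter is blended by the overlap progress, so the transition length is just the length of the overlap. This is a standalone approximation rather than the sequencer’s actual blending code.

```cpp
#include <algorithm>

// Blend one parameter between an outgoing and an incoming look.
// overlapStart/overlapEnd bound the region where both sequences are active;
// outside it, one look fully owns the parameter.
float CrossfadeParameter(float outgoingValue, float incomingValue,
                         float time, float overlapStart, float overlapEnd) {
    float t = (time - overlapStart) / (overlapEnd - overlapStart);
    t = std::clamp(t, 0.0f, 1.0f);   // 0 = fully outgoing, 1 = fully incoming
    return outgoingValue * (1.0f - t) + incomingValue * t;
}
```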

Below is the arrangement of template sequences for the tunnel animation. Unreal’s native sequencer tools allow designers to easily adjust timing and transitions. The three rows correspond to color, dimmer, and rotation, respectively.

Master level sequence for the tunnel demo. Credit: MAGNOPUS

When designing this functionality, we took notes from the traditional lighting console workflow - fixtures can belong to multiple groups, so they can be addressed as ‘Stage Left’, ‘Strobe’, etc. Within each group, fixtures are dynamically sorted into an order of indices using the relative transforms of each element. If fixtures are moved or reordered, effects will update accordingly.  
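A rough sketch of that dynamic sorting: project each fixture’s position onto the group’s primary axis (say, stage left to stage right), sort, and assign indices. If a fixture moves, re-running the sort reassigns indices and every index-driven effect sweeps correctly again. The vector and fixture types here are simplified stand-ins for actor transforms.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

struct Fixture {
    Vec3 position;   // relative to the group's origin
    int  index = 0;  // assigned by the sort below
};

// Order fixtures along a chosen axis and assign indices, so index-based
// effects like the wave above sweep spatially across the rig.
void AssignIndices(std::vector<Fixture>& group, const Vec3& axis) {
    auto along = [&](const Fixture& f) {
        return f.position.x * axis.x + f.position.y * axis.y + f.position.z * axis.z;
    };
    std::sort(group.begin(), group.end(),
              [&](const Fixture& a, const Fixture& b) { return along(a) < along(b); });
    for (std::size_t i = 0; i < group.size(); ++i)
        group[i].index = static_cast<int>(i);
}
```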

Animating rotation and dimmer based on worldspace position. Credit: MAGNOPUS

To ensure that transitions between looks are always smooth and grounded in realism, each fixture’s parameters constantly interpolate towards target values. Moving heads pan and tilt towards their target locations. When switching between vastly different lighting scenarios, the animator doesn’t have to invest time keyframing each element and can instead focus on composing the look as a whole.
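The per-frame smoothing is close in spirit to the rate-limited interpolation sketched below: every animatable parameter chases its target at a capped speed, so even an abrupt cue change resolves into a plausible pan or tilt move. The function and its parameters are illustrative, not the exact curve we ship.

```cpp
#include <algorithm>

// Move 'current' toward 'target' at a maximum speed (units per second).
// Called every frame for each parameter: pan, tilt, dimmer, zoom, etc.
float InterpToward(float current, float target, float maxSpeed, float deltaTime) {
    float delta   = target - current;
    float maxStep = maxSpeed * deltaTime;
    return current + std::clamp(delta, -maxStep, maxStep);
}

// Example usage per frame:
//   panDegrees = InterpToward(panDegrees, targetPan, 180.0f /*deg per sec*/, dt);
```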

Prototype showcasing automated transitions. Credit: MAGNOPUS

In order to achieve the kind of surrealism that works so well in XR, we made this system agnostic by using generic components. The designer can turn any object into a ‘fixture’ by adding our custom LightController component - the emissivity of a group of 500-foot-tall icicles can be animated to follow a bouncing wave pattern and effortlessly synchronized to a riff in the music. Walls of laser light can animate open and closed in just a few clicks, allowing designers more time to focus on the creative aspect of sequencing and less on the tech.

Any actor can be a fixture. Credit: MAGNOPUS

Additionally, this powerful combination of template sequences and components allows us to reuse effects across scenes. As one example, a preset built for a traditional moving-head setup can be dragged into another level to control an array of floating drums. Since these objects are universally described by their transform and an emissive color on a material or light, template sequences apply their behavior without additional effort from the designer.
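One way to picture why this works is the hypothetical, engine-agnostic interface below: anything that exposes a transform and an emissive color or intensity can receive the same template output, whether it’s a moving head, a floating drum, or a 500-foot icicle.

```cpp
struct Color { float r, g, b; };

// Hypothetical minimal contract for a 'fixture': a transform to rotate and an
// emissive color/intensity to drive.
class ILightControllable {
public:
    virtual ~ILightControllable() = default;
    virtual void SetRotation(float pitch, float yaw, float roll) = 0;
    virtual void SetEmissive(const Color& color, float intensity) = 0;
};

// The same evaluated frame of template data can drive any implementation,
// so a preset authored for a moving-head rig transfers to other actors as-is.
void ApplyFrame(ILightControllable& fixture, float rotation,
                const Color& color, float dimmer) {
    fixture.SetRotation(0.0f, rotation, 0.0f);
    fixture.SetEmissive(color, dimmer);
}
```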

Physical lighting reactivity

For LED walls, brighter is not always better, and low nit values can be necessary to balance exposure correctly between the talent and the scene. However, this comes at the cost of spatially accurate ambient light from the wall. When shooting elaborate dynamic environments in volumes, we augment the LED light with pixel-mapped physical fixtures. Depending on the needs of the project, we use systems that map 2D render targets or 3D color intensity levels to real-world space. This ensures that the talent, physical props, and set design integrate with the colors of the virtual environment in the frame.

Pixel-mapped MAC Auras and KL Panels. Credit: MAGNOPUS

In the case of sparse fixtures, pixel mapping is a process of taking sections of a 2D image, averaging the color within that section, and patching it to certain properties of a physical light. 
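A bare-bones version of that sampling step might look like the following sketch: average the pixels in the screen region patched to a fixture, then hand the result off as that fixture’s color. The image type is a placeholder for a CPU-side readback of a render target.

```cpp
#include <vector>

struct Color { float r = 0, g = 0, b = 0; };

// Placeholder for a readback of the 2D pixel map.
struct Image {
    int width = 0, height = 0;
    std::vector<Color> pixels;   // row-major, width * height
    const Color& At(int x, int y) const { return pixels[y * width + x]; }
};

// Average the region of the pixel map assigned to one physical fixture.
// The result is then patched to the light's color/intensity channels.
Color SampleRegion(const Image& map, int x0, int y0, int x1, int y1) {
    Color sum;
    int count = 0;
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x) {
            const Color& c = map.At(x, y);
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
            ++count;
        }
    if (count > 0) { sum.r /= count; sum.g /= count; sum.b /= count; }
    return sum;
}
```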

Generic pixel mapping setup for a 4x4 matrix fixture. Credit: Epic Games

Using one or multiple virtual cameras in the scene, we send spatially correct color data to the lights through the pixel map. If the virtual environment should light the talent from the back-right, fixtures in the back-right of the LED stage will fire with their correct colors and intensities. If the scene orientation is adjusted by an operator in a multi-user session, lights will accurately react in real-time. 

Dynamic positional rim lighting. Credit: Verizon

This effect can be strengthened by using directional spotlights such as Arri Orbiters or multi-engine fixtures such as Arri S360s to pump out even more dynamic positional light.

Upward-angled spotlights. Credit: Verizon

Exposing controls to the director

We’ve learned to expect (and prepare for!) dramatic creative changes on shoot days. One of the key benefits of a virtual production workflow is the ability to provide the director or DP with dials and knobs to adjust, so they can fine-tune the look of an environment or the timing of a sequence to their liking. Since we designed the tools to avoid baking any explicit lighting data, every aspect of our lighting pipeline is configurable in real-time.

Collection of controls for pixel mapped physical fixtures. Credit: MAGNOPUS

When pixel mapping physical fixtures, we apply a set of material transforms before the data for that screen region is sent to the light. If directors want punchier contrast between light and dark areas of the setup without crunching the color, we increase the exponential power node. If they want a smoother distribution of light across the volume, we increase the blur amount of our image sampling. If directors like the look of a light at 70% intensity but still want it to react somewhat dynamically to the brightness of the scene, we set the low and high clamp to 65% and 75% respectively.
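In spirit, those dials amount to a short chain of per-channel transforms applied before the sampled color leaves the engine, roughly as sketched below. The struct and names are ours for illustration; the blur mentioned above happens earlier, at the image-sampling stage.

```cpp
#include <algorithm>
#include <cmath>

struct FixtureControls {
    float exponent  = 1.0f;   // >1 gives punchier contrast between light and dark
    float lowClamp  = 0.0f;   // floor on output intensity (e.g. 0.65)
    float highClamp = 1.0f;   // ceiling on output intensity (e.g. 0.75)
};

// Shape one normalized channel (0-1) of the sampled color before it is sent
// to the physical light.
float ShapeChannel(float value, const FixtureControls& c) {
    float shaped = std::pow(value, c.exponent);            // contrast
    return std::clamp(shaped, c.lowClamp, c.highClamp);    // keep reactivity within a band
}
```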

In a Disguise workflow, these controls can be exposed and controlled in real-time using Dynamic Material Instances. In a pure Unreal workflow, multi-user sessions allow for one operator to drive lights without even being in the chain of render nodes. It is also possible to map these parameters into physical hardware controls such as a dimmer board or an iPad. This flexibility allows us to react to the ever-changing needs of an on-set environment and to the preferred working style of the creative team. 

Future interactions

At its bare bones, virtual production boils down to a bunch of data streams flying around a warehouse, with the end goal of making something meaningful. But looking forward - how can we intercept those data streams with Unreal, Disguise, TouchDesigner, Stage Precision, and the like, and transform them into something completely different?

Tools like Move AI or Stype tracking beacons could be used to allow performers to interact with procedural lighting and shaders as they dance. Launchpads and other MIDI controllers could provide an intuitive way for artists to compose live with stems while triggering cues in AR around them. In a full VR environment, we can construct intensely tactile art installations with wearable subwoofers like Woojer or Subpac, spatial audio, and entirely new light effects. 

I find the most exciting areas of concert production to lie in these connections between performers, procedural design systems, and hardware - symphonies of technology creating some kind of cybernetic synesthesia. 

Testing out procedural geometry with pixel mapping in a volume. Credit: MAGNOPUS

Addison Herr

Senior Creative Technologist, XR at Magnopus
