MAKING LIVE PERFORMANCE TAKE FLIGHT

BEST PRACTICES FOR MIND-BLOWING CONVERGED REALITY™.

By Robb Wagner


Converged Reality™ is cutting-edge narrative created by merging live performance with technology such as LED screens. You’ll see it on broadcast events like “America’s Got Talent,” on concert tours and in the digital creator space.

For Carnival Cruise Lines we use Converged Reality™ to simulate interactivity between the live, in-person performers onstage and the LED screens that are part of the scenic design. Featured here, “Epic Rock” is an award-winning live stage production we created with Carnival.

THERE ARE 5 KEY STAGES IN THE PROCESS.

STAGE 1. CREATIVE DEVELOPMENT.
STAGE 2. PIPELINE DEVELOPMENT.
STAGE 3. LIVE ACTION WORKSHOP.
STAGE 4. MEDIA DEVELOPMENT.
STAGE 5. LIVE INTEGRATION.

STAGE 1. Your process will begin with Creative Development. What is the big idea and how can you merge technology with live performance to capture the imagination? The final deliverable of your creative development process should be a strong storyboard that shows as much detail as possible. A best practice is going one step further to create an animatic, but you shouldn’t do that until after you’ve completed the next stage. Here’s why.

STAGE 2. Figuring out the entire pipeline before you begin making any assets is more than a best practice. It is mission critical, and there’s a lot to figure out. You need to make sure your frame rates are correct, and you have to nail down your final LED pixel pitch, latency, scaling, media deliverable specs and any limitations of the media display systems, including the screens themselves. The danger in not figuring out your pipeline is that you can go through the entire process and, if you have overlooked one small detail, that alone is enough to bring down the project.
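One way to make this concrete is a pre-flight check that fails before any assets are made. A minimal sketch in Python; every spec value and name below is a hypothetical illustration, not a figure from any real production:

```python
# Hypothetical pipeline pre-flight check. All values are illustrative
# assumptions; substitute the real specs for your venue and media server.

MEDIA_SERVER_FPS = 30.0   # rate the media display system plays back
CAMERA_FPS = 30.0         # rate of the workshop camera recordings
RENDER_FPS = 30.0         # rate the media elements are rendered at

SCREEN_RESOLUTION = (3840, 1080)   # native pixel dimensions of the LED wall
RENDER_RESOLUTION = (3840, 1080)   # dimensions the media is rendered at

def preflight():
    """Return a list of pipeline problems; fail loudly before production, not after."""
    problems = []
    if len({MEDIA_SERVER_FPS, CAMERA_FPS, RENDER_FPS}) != 1:
        problems.append("Frame rates do not match across the pipeline.")
    if SCREEN_RESOLUTION != RENDER_RESOLUTION:
        problems.append("Render resolution differs from LED wall (scaling needed).")
    return problems

assert preflight() == [], preflight()
```

The point of the sketch is the discipline, not the code: write the specs down once, in one place, and check every asset against them before work begins.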

STAGE 3. With your Creative and Pipeline Development complete you’re ready to record a precise reference of what the live, in-person performance will look like. The Live Action Workshop is where the exact movements of the live, in-person performance are choreographed, rehearsed and captured. There are a handful of technologies you can use to capture the data, including mo-cap. A simple way is just using a video camera.

EVEN WITH THE SIMPLEST TECHNOLOGY THERE ARE PITFALLS YOU NEED TO WATCH OUT FOR.

A best practice is making sure the frame rates of everything you do match the final media delivery system, including your music, camera recordings and media renders. If any of these is off, you can find yourself in deep trouble down the line. Even the difference between 29.97 and 30 FPS is enough to cause major problems.


STAGE 4. When your Live Action Workshop is done you can move ahead to Media Development. By using the captured workshop data as a precise reference for timing and positioning, you can track the movements of the live in-person performance on a timeline and design the media elements to match those movements exactly.
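Tracking the workshop footage on a timeline amounts to logging each performance cue as a timecode and converting it to an absolute frame number so media keyframes can land on the exact same frame. A minimal sketch; the cue names and timecode values are hypothetical, invented for illustration:

```python
# Hypothetical cue log: maps workshop timecodes onto the media timeline.
# All cue names and times are illustrative assumptions.

FPS = 30  # must match the music, camera recordings and media renders

def timecode_to_frame(minutes: int, seconds: int, frames: int, fps: int = FPS) -> int:
    """Convert a MM:SS:FF timecode from the workshop recording to an absolute frame."""
    return (minutes * 60 + seconds) * fps + frames

# A performer movement and the media element designed to match it,
# keyed to the same frame of the shared timeline.
cues = {
    "guitarist_hits_mark": timecode_to_frame(1, 12, 15),
    "screen_lightning_strike": timecode_to_frame(1, 12, 15),
}

assert cues["guitarist_hits_mark"] == cues["screen_lightning_strike"] == 2175
```

Because both the live action reference and the media share one frame rate and one timeline, a cue logged once in the workshop pins the media keyframe exactly.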

TOTAL CONTROL OF THE PROCESS IS REQUIRED TO DEVELOP THE MEDIA WITH PRECISION AND CONFIDENCE.

Even the smallest oversight can throw your project off track. If your elements don’t sync up properly, it will be painfully difficult, if not impossible, to recover.

STAGE 5. The final stage is Live Integration, which in live entertainment means the final product. For Carnival it’s the live event where the in-person performers appear on stage along with the LED screens. The media elements, the display technology and the live performance all converge to form the final picture.

IF YOU DO THIS WORK WELL THE AUDIENCE WILL SEE IT AS ONE-BIG-IMPOSSIBLE-PICTURE AND YOU’LL BLOW THEM AWAY.

The video below offers a glimpse into the making of “Epic Rock”.
