Photo: Suzanne Teresa, photographed at the Hollywood Bowl, courtesy of the L.A. Phil
The Los Angeles Philharmonic kicked off its 100th anniversary season on September 30 in a big way. The day began with a free festival designed to showcase the city’s creative spirit; it stretched the eight miles between the Walt Disney Concert Hall and the Hollywood Bowl and featured 1,800 musicians, artists, and dancers, plus art installations, food trucks, screen printing, and more.
But the real highlight of the celebrations came after sunset, when Katy Perry, Herbie Hancock, John Williams, and other performers gave a free concert for 18,000 fans at the Hollywood Bowl.
It wasn’t a typical show at the Bowl, though. The concert marked the debut event from Xite Labs, a new project from content creator Greg Russell (formerly of Tandem Digital Entertainment) and visual artist Vello Virkhaus (formerly of V Squared Labs).
The duo, who recently merged their companies, teamed up to create projection mapping that responded to the music in real time. More than half of the visuals were improvised during the performances, driven by technology built to react to the orchestra’s audio signals.
“We were not working with a timecoded show or click-track,” explained Russell and Virkhaus in an email. “Instead, the actual tempo and purposefully subtle fluctuations were going to be reliant upon [L.A. Phil music director] Gustavo Dudamel’s and John Williams’ internal clocks. Even though they are masters, we knew that we would need to be able to be smart and flexible with our visual experience.”
A team of 12 computer animators, programmers, and digital artists spent two months creating 16 looks to accompany the performers, using more than 17.4 million pixels. The result? Colorful, high-energy projections on the Bowl’s famous proscenium that moved perfectly in time with the music, fully immersing the audience in the performances.
“Using Notch Designer and Playback tools, we were able to design looks for songs like Katy Perry’s ‘Firework,’ which used a real-time generative particle system—and then have those looks react to the orchestra’s audio signals,” said Russell and Virkhaus, who called out a performance of the Star Wars theme as a particular highlight.
“In Star Wars, there were moments when the whole Hollywood Bowl looked like the interior of the Millennium Falcon, Han Solo’s ship,” noted the team. “In those segments the audio from the orchestra, divided into eight subgroups, fed into our equipment racks and into our pre-programmed design from Cinema 4D—which had been ported into Notch and programmed to react to sample audio that we used as reference tracks before the show.”
To put it more simply: The team created a 3-D design that lit up in certain areas and at certain speeds and intensities in time with the music. “You could literally see the orchestra playing the physical structure of the Hollywood Bowl, which we designed as a transportation device for the audience,” they explained. “So much came together in those moments and we felt like we were bringing the vision and the team—which included the audience—together to share in the magic and glory of John Williams’ epic masterpiece.”
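The idea the team describes, splitting the orchestra’s audio into subgroups and letting each subgroup’s loudness drive the brightness of one region of a pre-built 3-D design, can be sketched in a few lines. This is an illustrative toy in plain Python, not Xite Labs’ actual Notch and Cinema 4-D pipeline; every name and parameter below is hypothetical.

```python
import math

def rms(samples):
    """Root-mean-square level of one audio buffer (sample values in -1..1)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

class RegionEnvelope:
    """Smooths raw audio level into a brightness value so visuals don't
    flicker: fast attack when the orchestra swells, slow release as it fades.
    Attack/release coefficients here are made-up illustrative values."""
    def __init__(self, attack=0.6, release=0.1):
        self.attack, self.release, self.level = attack, release, 0.0

    def update(self, raw):
        coeff = self.attack if raw > self.level else self.release
        self.level += coeff * (raw - self.level)
        return min(1.0, self.level)

# One envelope per audio subgroup, i.e. per projection region.
envelopes = [RegionEnvelope() for _ in range(8)]

def frame_brightness(subgroup_buffers):
    """Map this frame's eight audio buffers to eight region brightness values."""
    return [env.update(rms(buf)) for env, buf in zip(envelopes, subgroup_buffers)]

# Example frame: region 0 gets a loud brass hit, the rest are quiet strings.
buffers = [[0.8, -0.7, 0.9, -0.8]] + [[0.05, -0.04, 0.05, -0.05]] * 7
levels = frame_brightness(buffers)
```

Run per frame, the loud subgroup’s region brightens quickly while quiet regions stay dim, which is the effect of the structure appearing to be “played” by the orchestra.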
Due to the improvised nature of the projections, Xite Labs was not able to participate in a formal rehearsal with the orchestra. Other challenges included working around the physical constraints of the Bowl’s stage while positioning the interactive camera system, and finding a way to run an 800-foot fiber cable to the top of the Bowl. Another hurdle: the team was given only 90 minutes to align the mesh needed to map the visuals precisely onto the stage.
Despite the challenges, Russell and Virkhaus are happy with the result. “We love working on projection mapping because of the sheer scale and immersion it provides to creators and audiences,” they said. “It’s not new at this point, but pushing the quality and scale of these types of shows is exciting and still gets everybody’s adrenaline going.”
They think the future of the medium lies in finding new ways to blend projection mapping, sculptural lighting, LED, practical lighting, and sound design.
“We are starting to integrate higher-level embellishments to our mapping projects, such as live body tracking, camera tracking, and interactive elements that bring performers into the mapping shows,” they said. “These environments will become extensions of our digital reality, embedded with machine learning, A.I. attributes, and interactive interfaces. We will be able to participate collectively in concert experiences through our devices.”