All the World’s a Stage
Assumed by many to be far out of reach, in-camera real-time visual effects will soon be available to Canadian productions of all sizes. Technology experts and new converts explain why, for virtual production, the sky’s no limit.
If you’ve taken part in a Zoom call at any point since March 2020, you’re well aware that the pandemic has resulted in the rapid shifting of countless paradigms — and traditional models of film and television production have not been spared. When COVID-19 arrived, shooting on location became impractical for many productions due to travel restrictions and the closure of public spaces. In response to these unprecedented challenges, industry innovators focused their efforts on developing new ways to create content, one of the most significant being virtual production.
LED to believe
Like video conferencing, virtual production predates COVID-19, but the pandemic has served to fast-track its evolution and adoption — and Sheridan College’s Screen Industries Research & Training (SIRT) Centre has played a significant role in facilitating its advancement in Canada. Funded in part by the government, SIRT is located at Pinewood Toronto Studios in Toronto and bills itself as “a premier destination for training, collaboration and creation for the screen-based industry in Ontario and around the world.”
According to Jason Hunter, production lead at SIRT, virtual production is a broad term, but “what everyone is talking about right now, à la The Mandalorian, is a stream of virtual production that is in-camera visual effects — using large-scale LED walls to project your backgrounds, and then filming live-action people in front of that.”
It’s impossible to talk about this technology without referencing The Mandalorian — the Disney+ series set in the Star Wars universe that depicts the adventures of an interstellar bounty hunter. Creator Jon Favreau worked with famed visual-effects (VFX) company Industrial Light & Magic to render other-worldly landscapes on enormous walls composed of light-emitting diodes (LEDs). These backgrounds appear photorealistic, but what makes them feel downright magical is how they dynamically interact with the cameras, via motion capture, to convey the illusion of depth and distance. One of the linchpins of this achievement was video-game technology. David Dexter, director of SIRT, explains that “over the last couple of years, Epic Games has jumped in with both feet, releasing their virtual-production toolkit.” That toolkit is called Unreal Engine. It’s free to use and has quickly become the gold standard. “We’re super excited about the advancement in technology to enable real-time final-pixel visual effects and shots,” says Dexter.
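For the curious, the depth illusion described above comes down to parallax: as the tracked camera moves, near objects in the virtual scene should shift across the screen more than distant ones, so the wall's imagery must be re-rendered for each camera position. The toy sketch below (not production code; all distances and the simple pinhole model are illustrative assumptions, not anything from Unreal Engine) shows the effect with two points at different virtual depths.

```python
# Toy illustration of why camera tracking sells the illusion of depth:
# a "near" point shifts more on screen than a "far" point when the
# camera moves, so a static backdrop would instantly look flat.

def project(point, camera_x, focal_length=1.0):
    """Pinhole projection of a point (x, depth) onto a 1D screen,
    for a camera at horizontal position camera_x looking down +z."""
    x, z = point
    return focal_length * (x - camera_x) / z

near = (0.0, 2.0)   # a rock 2 m into the virtual scene (made-up numbers)
far = (0.0, 50.0)   # a mountain 50 m away

# Slide the camera 0.5 m to the right and compare apparent shifts.
shift_near = project(near, 0.5) - project(near, 0.0)
shift_far = project(far, 0.5) - project(far, 0.0)

print(f"near-object shift: {shift_near:+.3f}")  # large on-screen shift
print(f"far-object shift:  {shift_far:+.3f}")   # barely moves
```

Re-rendering the wall from the tracked camera's perspective, frame by frame, is what lets a flat LED surface read as a landscape with real distance in it.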
Setting and sharing the stage
While these breakthroughs in virtual production throw open the doors of possibility, they also raise the question of access. Most content creators don’t have the financial backing of a media conglomerate like Disney, so the option of constructing a multi-million-dollar LED stage — often referred to as “the volume” — to simulate cosmic travel is about as economically feasible as building a rocket and launching it into space.
This is where businesses like Pixomondo come in. The VFX company, which owns studios on three continents (including three in Canada), was an early adopter of Unreal Engine and collaborated on the first season of The Mandalorian. Fully convinced of virtual production’s advantages, they set about constructing Canada’s first LED stage in Brampton — and promptly pre-booked it to CBS Studios for three years. It’s since become the home of Star Trek: Discovery.
“That’s when I realized, okay, this is the way,” says Mahmoud Rahnama, head of studio and VFX supervisor for Pixomondo’s Toronto and Montreal operations. “So we built our second one very close to Pinewood Studios, and we’re now building the world’s largest LED stage in Vancouver.” These facilities have been constructed in partnership with Canada’s homegrown production equipment-rental company William F. White International.
With the prohibitively high up-front costs already shouldered, virtual production becomes decidedly more affordable for smaller production companies, which can simply book the stages for as many days as they require. But where are they going to find crew members with volume-studio experience? SIRT and Pixomondo are two steps ahead of this problem: they’re both developing hands-on virtual-production training programs that will dramatically increase the number of industry professionals who are capable of assisting with these projects.
Masterclass in magic
The Directors Guild of Canada (DGC) is also doing its part to upskill its members. In February, the BC chapter held a two-day virtual-production workshop, which was sponsored by the CMPA, among other industry organizations. According to Zach Lipovsky, chair of the National Directors Division of the DGC, “The ambition of this was to throw eight directors at you with eight different ideas of what they want to do with eight different films — and they shot the films in two hours each.” (Lipovsky’s upcoming film, Deadpoint, is a rock-climbing thriller that will make abundant use of LED panels.)
Heather Hawthorn Doyle, one of the directors who had the opportunity to take the volume studio for a test drive, describes how game-changing the experience was: “In our world, for us to be able to shoot something in two hours that is a complete micro-short is unheard of. By changing the background, you go from the Arctic to a deserted bridge to a forest. And it is so easy to do.”
For workshop participants, the many advantages of virtual production became stunningly apparent: the use of green screens can be significantly reduced, along with the faint green light they cast on surfaces. In addition to rendering completely convincing environments — doing away with cumbersome on-location shoots — the LED screens act as light sources and can simulate dawn or dusk in perpetuity (“I never want to shoot a sunset scene the traditional way again,” laughs Hawthorn Doyle). All actors and crew members can see and respond to the effects in real time, rather than having to interact with blank screens. And all of this serves to speed up the production process, which translates into significant cost savings.
This is not to say that the technology is perfect. Some difficulties did present themselves during the workshop, but none of the challenges were insurmountable, and many gave rise to new opportunities.
Lipovsky notes that reverse shots are captured by rotating the virtual environment, rather than the camera, which “is very tricky on everyone’s brain, because all the geography that’s around them hasn’t moved.” But Hawthorn Doyle points out that the time saved on setup compensates for this brief disorientation. “The ability to have more shoot time and have the actors give you different choices or a more nuanced performance is a huge step forward,” she says.
Where she sees a challenge is in the seeming inflexibility of working with digital environments in real time. “We sometimes have a joke: you can fix it in post,” Hawthorn Doyle explains. “But when you’re shooting in a volume studio with LED screens, you need to fix it in prep.” In other words, while in-camera effects significantly reduce the need for post-production work, VFX artists now need to create those effects in pre-production, essentially inverting the traditional process. But as Dexter notes, recent innovations in virtual production — such as “the introduction of artificial intelligence and machine learning, and the capability to change seasons” — are already beginning to address this issue.
Having had the opportunity to take part in hands-on training sessions, many industry players are eagerly putting their virtual-production knowledge into practice. Ingo Lou and his business partner, Amy Fox, own Trembling Void Studios, an independent film and TV production company in Vancouver that has been developing an “optimistic” science-fiction series called Synthesis since 2017. Lou attended the DGC-BC workshop, which is where he was introduced to Showmax, the event-services company that provided the LED panels and other hardware for the event.
“Showmax was really looking to get into the virtual-production scene in a very big way,” Lou remembers. “And what I think they found attractive in talking to us was the idea that there would be a hard science-fiction project that offered the ability to fail-test their system. They essentially said, ‘Come up with the most complicated and most ambitious project for us to try to do.’ And we said, ‘Hold my beer.’”
Trembling Void assembled an LED stage at Emily Carr University of Art + Design and got to work on putting together a proof-of-concept trailer for Synthesis. The result was eye-opening for Lou. “For the first time, I dared to believe that this is actually on the same footing as any major network television series on any platform,” he says.
Adjusting the volume
Since the present state of virtual production already seems so advanced, it feels redundant to ask what the future holds for the process. But the researchers at SIRT are confident that they have some idea. Hunter notes that it’s becoming easier to scale the technology to better suit the scope and budget of each project. “Whatever surface you have to project onto — whether it be a 30-inch TV or 85-inch TV, or 10 LED panels or 3,000 LED panels — the core of the pipeline is still the same,” he says.
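Hunter's scaling point can be made concrete with some back-of-the-envelope arithmetic: whatever the display, the pipeline only changes in the size of the canvas it has to render. The panel figures below are hypothetical (roughly a 500 mm square panel at 2.8 mm pixel pitch), not specs from any of the stages mentioned in this article.

```python
# Rough sketch of how render resolution scales with an LED wall's size.
# 176 px per panel approximates a 500 mm panel at ~2.8 mm pixel pitch
# (assumed values for illustration only).

def wall_resolution(panels_wide, panels_high, px_per_panel=176):
    """Total render resolution for a wall of square LED panels."""
    return panels_wide * px_per_panel, panels_high * px_per_panel

# A modest 10-panel test wall versus a 3,000-panel volume (60 x 50).
small = wall_resolution(5, 2)
large = wall_resolution(60, 50)

print(f"10-panel wall:    {small[0]} x {small[1]} px")
print(f"3,000-panel wall: {large[0]} x {large[1]} px")
```

The render target grows from under a megapixel to well beyond 4K, but the tracking, rendering and playback steps stay the same, which is what makes the technology scalable to smaller budgets.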
What stands out about the possibilities of virtual production for Lou is how transformational the cost and time savings can be for the health of film and TV industry professionals. “We work way too long and way too hard in this business,” he explains. “Twelve to 15 hours is very standard, and that takes a toll on our health, our relationships, everything. I envision a future where we can actually work eight- to 10-hour days, like every other frickin’ industry in the world.”
Now that would be one post-pandemic paradigm shift that everyone could get behind.