Avatar: The Way Of Water – how gaming tech helped bring Oscars nominee to life and could change filmmaking forever
By LJ Strachan
Avatar: The Way Of Water was the highest-grossing film of 2022 and is one of the front-runners in several categories at the Academy Awards. It has already been heralded for its visual effects, developed as artists got to grips with new tech that could revolutionise filmmaking.
From the Indiana Jones-esque adventures of Lara Croft to the increasingly Pixar-quality cartoon visuals of Super Mario, video games have long looked to Hollywood for inspiration.
But recent years have shown that the relationship is becoming increasingly two-way.
While you don’t have to look far these days for a film or series based on a popular video game (The Last Of Us and Sonic The Hedgehog are just two, with Mario himself in cinemas soon), it goes much deeper than you might think.
“These worlds have been converging for a decade now,” says Allan Poore, a senior vice president at Unity, a video game development platform increasingly turning its hand to films.
“And for the most part, the core principles are actually the same.”
Indeed, modern video games look so good that the technology behind them is quite literally changing the way blockbusters are made – including the very biggest of them all.
Avatar: The Way Of Water was comfortably the highest-grossing film of 2022 – fitting, given it’s the sequel to the highest-grossing film ever made.
James Cameron’s latest blockbuster is up for best picture at this year’s Academy Awards – and is a front-runner in the best visual effects category.
Many of the tools used to bring The Way Of Water to life came from Unity’s Weta Digital division.
Unity bought the tech assets of Weta, the New Zealand-based visual effects firm founded by Lord Of The Rings director Peter Jackson, for some $1.6bn in 2021. Jackson still owns a now separate company called WetaFX – a more traditional visual effects house that, somewhat confusingly, also worked on Avatar.
But what Unity’s deal did was bring a team of talented engineers, used to working on films, under the umbrella of a company best known for its accessible video game engine.

Think of a game engine like a recipe kit: it contains everything you need to make a game. Some are designed to help build specific types of game – like a shooter or sports title – while others are more broad-brush.
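To make the recipe kit analogy concrete, here is a toy sketch in Python – every name in it is invented for illustration, and nothing here is Unity’s actual API. The engine supplies the ready-made ingredients (rendering, physics, input handling); the developer writes only the part that is unique to their game.

```python
# Toy illustration of a game engine as a "recipe kit".
# All names are hypothetical – real engines such as Unity expose far
# richer versions of these building blocks.

class ToyEngine:
    """Stand-in for a general-purpose engine: the ready-made ingredients."""

    def draw(self, thing):                 # rendering
        print(f"drawing {thing}")

    def apply_gravity(self, thing):        # physics
        print(f"{thing} falls under gravity")

    def read_input(self):                  # input handling
        return "jump"                      # pretend the player pressed jump


class MyPlatformer:
    """The game-specific code a studio actually writes."""

    def __init__(self, engine):
        self.engine = engine

    def tick(self):
        if self.engine.read_input() == "jump":
            print("player jumps")          # game logic unique to this title
        self.engine.apply_gravity("player")
        self.engine.draw("player")


MyPlatformer(ToyEngine()).tick()
```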
Unity has been used on everything from indie titles to entries in the Call Of Duty and Pokemon franchises.
Jackson said the fusion of expertise, known as Weta Digital, would be “game-changing” for creators.
What makes video games tick is that the rendering of the worlds players explore is done in real time. That’s because a game can play out differently depending on what the player does – it’s not fixed like a film or TV. Just think of that scene in The Wrong Trousers where Gromit is building the train track as he moves along it and you will get the idea.
That’s hugely different to how films have traditionally handled visual effects, where the rendering all happens during post-production – it’s why you’ll see behind-the-scenes footage of actors standing in big green rooms, or talking to tennis balls on the ends of sticks. All the computer wizardry was done after the fact.
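For the technically minded, the difference can be sketched in a few lines of Python. This is a deliberately crude illustration – real engines do this work on the GPU, dozens of times per second – but it captures the core constraint: a game must deliver each frame before the next screen refresh, while a film frame can take as long as it likes.

```python
import time

FRAME_BUDGET = 1 / 30  # a real-time engine must finish each frame in ~33ms

def render_frame(player_input):
    # Stand-in for drawing the world; in a real engine this is GPU work.
    return f"frame showing the player moving {player_input}"

def realtime_loop(inputs):
    """Game-style rendering: the picture is made as the player acts."""
    for player_input in inputs:
        start = time.monotonic()
        frame = render_frame(player_input)          # must beat the deadline
        elapsed = time.monotonic() - start
        time.sleep(max(0, FRAME_BUDGET - elapsed))  # hold a steady frame rate
        print(frame)

def offline_render(shots):
    """Film-style rendering: frames are fixed, so each can take hours."""
    return [f"final-quality {shot}" for shot in shots]  # no per-frame deadline

realtime_loop(["left", "forward", "forward"])  # responds to whatever happens
print(offline_render(["shot 1", "shot 2"]))    # computed after the fact
```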
And while The Way Of Water still leaned heavily on those techniques, parts of the production were powered by new real-time techniques that let Cameron and his cast and crew paint a picture of the finished product as they were working on set.
“How do you speed up film making? You do it by showing artists and directors, as quickly as you possibly can, a representation of what that frame is going to look like,” says Poore, who worked on hit animated films Ratatouille, Incredibles 2, and Coco during his time at Pixar.
“Directors will use a screen that is actually showing real-time components, so they can see what the scene and surroundings will look like as they film.
“Hopefully they’re going to help make film production smoother, easier, and faster.”
With Avatar 3 less than two years away – rather than the 13-year gap between the first two films – that assessment may well prove correct.
Unity’s rivals have also looked to take advantage of just how photorealistic real-time visuals have become to make moves into filmmaking, in some cases taking things even further.
The Mandalorian, the hit Star Wars series that returned for its third series this month, uses an immersive soundstage called The Volume to put its actors into whatever fantastical scenarios its writers can dream up.
Rather than relying solely on green screens, with effects added during post-production, The Volume boasts an enormous wall of screens showing digital environments, made using Epic’s Unreal game engine (which powers the popular shooter Fortnite), in real time.
It means the actors know where their characters are supposed to be, and changes can be made on the fly.
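The geometry involved can be sketched simply. The toy Python below uses made-up numbers and a single point – the real pipeline tracks the physical camera and re-renders entire Unreal scenes – but it shows why the wall must be redrawn whenever the camera moves: the same virtual object has to land on a different spot on the screen for the parallax to look right through the lens.

```python
def wall_projection(cam, point, wall_z=0.0):
    """Where on the LED wall (the plane z = wall_z) a virtual point must be
    drawn so it lines up correctly from the tracked camera's position.

    cam and point are (x, y, z) tuples; the virtual point sits in the
    digital set behind the wall. Simple similar-triangles perspective.
    """
    cx, cy, cz = cam
    px, py, pz = point
    t = (wall_z - cz) / (pz - cz)  # where the camera-to-point ray meets the wall
    return (cx + t * (px - cx), cy + t * (py - cy))

# A point deep in the digital set, e.g. a distant mountain peak.
mountain = (0.0, 5.0, 100.0)

# As the camera dollies sideways, the mountain must be redrawn at a new
# spot on the wall to keep the parallax correct through the lens.
print(wall_projection((0.0, 2.0, -5.0), mountain))  # camera centred
print(wall_projection((2.0, 2.0, -5.0), mountain))  # camera moved right
```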
Two recent comic book films have also used it – last year’s The Batman and last month’s Ant-Man threequel.
Star Wars actor Ewan McGregor worked with The Volume during his return to the franchise last year, and hailed its transformative impact compared with the films he made 20 years ago.
“It was so much blue screen and green screen, and it’s just very hard to make something believable when there’s nothing there,” he said. “And here we were [on Obi-Wan Kenobi] in this amazing set where if you’re shooting in the desert, everywhere you look is the desert, and if you’re flying through space, the stars are flying past you. So cool.”
While Poore doesn’t see the need for traditional digital effects techniques evaporating any time soon, the idea of a “virtual production space” where visuals can be generated on the fly is only going to grow.
At the UK’s National Film and Television School, there’s already an entire course dedicated to just that.
Ian Murphy, head of the school’s visual effects MA, says: “The main change that’s really exciting is it takes what was post-production, firmly at the end of the process, and gets us involved right at the beginning.
“VFX people are quite techy, but this pushes them into having conversations with production designers and cinematographers on set – and that’s a huge change.
“If you’re shooting on green screen, you’re having quite odd, nebulous conversations. The idea of this tech is that the changes are fairly instant. And they might not be the finished pictures – there’s still visual effects work to do – but something from that process is sort of a blueprint that takes you into full production.
“And with the images you get from a game engine now… the trajectory is certainly all moving towards it eventually being the actual images people see in the cinema.”
We’ve certainly come a long way from Pong.
You can watch the Academy Awards on Sunday 12 March in the UK from 11pm exclusively on Sky News and Sky Showcase. Plus, get all the intel from our Oscars special Backstage podcast, available wherever you get your podcasts, from Monday morning.