While the 1999 movie influenced everything from videogames to action movies to the metaverse, the latest installment shows some restraint. From a report: The jaw-dropping visual effects in “The Matrix” transformed the quest to prove what is possible on screen. The franchise returns this week to find out whether there is anything more that can be done. For the 1999 original, filmmakers invented a way to make Keanu Reeves’s hero, Neo, defy physics while dodging bullets on screen. The effect blew enough minds to earn a nickname, “bullet time,” changed the look of action movies, and influenced mediums from animation to videogames. For the new sequel, “The Matrix Resurrections,” filmmakers deployed much-higher-caliber technologies, including three-dimensional imagery made using artificial intelligence. But after 22 years of digital evolution, high-end movie effects are approaching a plateau near perfection. “We went from pulling off what seemed to be impossible, to a sort of inability to create surprise” in the movie industry, says John Gaeta, who helped craft the bullet-time effect. He was a visual-effects designer on the first three “Matrix” films; now he is making things for the metaverse.
This year the movies presented us with a car slingshotting from cliff to cliff (“F9”); Ryan Reynolds running amok inside a videogame (“Free Guy”); and giant monsters crushing the Hong Kong skyline (“Godzilla vs. Kong”). Any viewers who paused to ask themselves “How did they do that?” likely came up with the same answer: “Computers.” Human characters that are totally computer-generated and believable are still on the frontier, “but I’m not sure if there is anything else that can’t be done given enough money or time,” says Ian Failes, editor of befores and afters, a magazine covering visual-effects artistry. Despite any numbness among viewers to digital spectacles, Hollywood’s demand for them has only increased. Visual-effects houses have raced to compete in a global production boom and fuel the streaming wars with flashy content. Some directors are reacting to the VFX arms race by practicing more restraint. Denis Villeneuve’s “Dune” depicts settings such as the desert planet Arrakis with a naturalistic look. Instead of zooming viewers into a fleet of attacking spaceships, the director presented the nighttime ambush in silhouette at a distance, conveying a somber sense of scale. “He was just showing the reality of the world,” says Namit Malhotra, chief executive of DNEG, a visual-effects company that worked on “Dune” and “The Matrix Resurrections.” He adds: “When you’re spending that kind of money, it’s hard for filmmakers to control the desire for more, a little more oomph.”
In the new “Matrix” release, director and co-writer Lana Wachowski plays with expectations that the sequel must level up. Spoiler alert: In the movie, Mr. Reeves’s character is reintroduced as a videogame designer whose big hit was called, yes, “The Matrix.” The events in the film franchise supposedly happened within the world of his videogame — including that signature action sequence in which Neo bends time and space. As a group of videogame developers brainstorms ideas for a sequel to “The Matrix,” one declares, “We need a new bullet time!” The original bullet time was “a borderline hack,” as Mr. Gaeta recalls it, that started with 120 still cameras firing off film photographs of Mr. Reeves dangling on wires. Those images were stitched together with software to simulate a swooping camera move in slow motion. The successor to that technique is known as volumetric capture. A camera array captures people or spaces from every angle, and then A.I. meshes this video into 3-D footage that can be viewed and manipulated from any perspective.
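The core idea behind the original rig, stepping through an arc of simultaneously fired stills while stretching time to fake a swooping slow-motion camera move, can be sketched in a few lines of code. This is purely an illustrative model, not the production pipeline described in the article; the function name and frame counts (120 cameras, 480 output frames) are assumptions for the example.

```python
# Illustrative sketch of "bullet time" frame stitching: 120 cameras on an
# arc each fire one still, and the virtual camera sweep is produced by
# stepping through those stills while stretching time. All names here are
# hypothetical; real pipelines also warp and blend the actual images.

def bullet_time_sequence(num_cameras=120, output_frames=480):
    """Map each output frame to a pair of source cameras and a blend weight.

    With more output frames than cameras, each output frame falls between
    two neighboring stills; the fractional weight says how far toward the
    second still the interpolation should lean.
    """
    sequence = []
    for f in range(output_frames):
        # Normalized position along the camera arc for this output frame.
        t = f / (output_frames - 1)
        idx = t * (num_cameras - 1)
        lo = int(idx)                          # still just before this moment
        hi = min(lo + 1, num_cameras - 1)      # still just after it
        blend = idx - lo                       # weight toward the later still
        sequence.append((lo, hi, blend))
    return sequence

frames = bullet_time_sequence()
print(len(frames))            # 480 output frames from 120 stills
print(frames[0], frames[-1])  # sweep runs from camera 0 to camera 119
```

Because 480 output frames are spread across only 120 camera positions, the sweep plays back at roughly quarter speed, which is the slow-motion half of the effect; the changing camera index supplies the apparent camera motion.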