AI Movie ‘Our T2 Remake’ Offers Artistry, but No Threat to the Industry

The rain was relentless outside, but inside the lobby of the NuArt Theater in Los Angeles there was joy. At the sold-out March 6 premiere of the crowdfunded “Our T2 Remake,” a feature-length parody of “Terminator 2: Judgment Day” produced entirely with AI tools, the energy recalled Sundance circa 1995.

Attendees included Caleb Ward, co-founder of AI filmmaking education platform Curious Refuge and creator of last year’s viral AI short “Star Wars by Wes Anderson”; Dave Clark, who made another viral AI video with his Adidas spec spot and whose short film “One More” preceded the feature; Neem Perez, founder of AI storyboarding mobile app StoryBlocker Studios and director of “T2 Remake”; and Jeremy Boxer, a former Vimeo creative director who now runs the consultancy Boxer and co-founded the community group Friends With AI.

With the audience seated and ready to admire the AI creations, host Perez and executive producer Sue Molina took the stage for the first order of business: thanking the creators, and tempering any hopes that the movie could pass for big-screen cinema.

“Let’s set some expectations here, shall we?” Perez said with a sheepish laugh. “Hopefully some of you aren’t here to see a ‘Terminator’ remake. It’s far from it, folks, because we really didn’t make one. It’s an experimental film -“

“Highly experimental,” Molina said.

“A highly experimental film, using experimental technology -“

“Highly experimental.”

On screen, the soft sell proved to be no false modesty. AI creators from around the world were each assigned one of 50 segments of the film to produce in two weeks, with only a few creative parameters (mock ChatGPT’s global domination as Skynet in 2050; use no images or dialogue from the original) and the filmmaking process left up to them. Given the varying levels of talent and the loose handling of the familiar plot, uneven results were a foregone conclusion. Some of the shots were impressive. Others were undercut by the rapid advancement of AI: although the scenes were only six months old, more than a few looked as dated as “Flash Gordon” in a post-“Star Wars” world.

After the screening, Perez told IndieWire that the film achieved its goals as a showcase and struck a blow against what he called the “doomsday scenario” that can dominate AI discussions.

“Every time someone brought up the subject of AI, it was Skynet, which is the AI in ‘Terminator,’ and how it was going to end the world and we’d all lose our jobs,” he said. “Meanwhile, there were strikes and it was a very polarizing time. A lot of people just weren’t seeing the positive aspects of AI: how it was bringing this community together, and how people could create really amazing things.”

Each artist worked in a different mixed-media style (from glitchy T-posed models to Funko Pop figures), with the look shifting every two to three minutes. (To maintain a coherent story, the producers added short interstitials from the original “T2” in post to orient viewers.)

‘Our T2 Remake’ (StoryBlocker Studios)

An open call recruited artists, primarily from X (formerly Twitter). Among them were AI rock stars like The Butcher’s Brain, Uncanny Harry, Jeff Synthesized, and Brett Stewart (who impressed with his polished-looking Teddy Bear Terminator built with a custom LoRA model). Most artists used the Midjourney and Runway platforms, along with generators from Pika Labs (video), Kaiber (animation), Leonardo (image), and Suno (music).

Perez devised a collaborative workflow in which artists claimed scenes through a web store, like limited-release NFT drops, with assigned channels where they posted their progress. The sound was, for the most part, pulled from a licensed stock website. Artists were also invited to a private Discord for post-production collaboration.

The scenes were dominated by live-action and animation parodies (including a “Star Wars” riff on “Law & Order”) and various stylistic interpretations of Arnold Schwarzenegger’s iconic T-800 Terminator unit. Another reason the film looked disjointed: the producers ran two waves of artists with a lull in between, during which the rapidly evolving toolset improved noticeably.

‘Our T2 Remake’ (StoryBlocker Studios)

Perez and Molina also contributed scenes of their own. “I was thinking about ‘Robot Chicken’ and parodying the sitcom with Funko Pop toys,” Molina said. “And the first thing that came to mind was Mr. T as the Terminator. It starts with a Mr. T-800 jingle, where his head is exploding in all directions; that was actually inspired by a childhood memory of the intros to the classic ‘Donald Duck’ shows. I was writing and editing and creating images at the same time. That’s kind of the beauty of AI.”

Perez contributed the opening drone and spaceship attack, the final scene, and the climactic encounter between the T-800 and T-1000. “For the opening, I did a lot of visual effects compositing and 3D work,” he said. “I added some 3D spaceships and hunter drones flying into the scene, because motion is difficult for AI and the only way to do that was with 3D. Then I used compositing on the background so the fire could move a little more vividly, and the smoke and fog effects added a little more life to each shot.”

He also added his own performance. “In the Terminator fight scene, I used something called Gen-1, where you put effects on top of footage you shoot,” Perez said. “I dressed up like a T-800 and a T-1000, bought a leather jacket and a police uniform on eBay, and played both parts on green screen. I’m a big video game nerd and I grew up in the ’90s playing ‘Mortal Kombat,’ so it was a play on the video game where the two faced off.”

Perez said AI is still in its wild-west early stages, and that films will become increasingly sophisticated as artists experiment with increasingly refined tools. Meanwhile, his StoryBlocker app uses scripts to help filmmakers visualize their stories.

‘Our T2 Remake’ (StoryBlocker Studios)

“Personally, I don’t think films like ‘T2’ will replace traditional filmmaking,” he said. “The biggest [weakness] is character consistency: ensuring that your characters look exactly the same in every shot, with their wardrobe and hair. That’s tricky because the AI interprets the prompt differently every time you run it.” He described the film’s lip-syncing as “brutal” given the tools available at the time; they have since improved rapidly.

“The other thing is camera control, which didn’t exist when we started,” Perez said. “It’s mostly slight motion shots, either push-ins or pans, but you can’t do a crane-down shot into a dolly push or anything too complicated. But it’s getting better. Runway has introduced more camera controls, but you’re still limited to four seconds per shot. Anything after that breaks the character.”

Even on their truncated timeline, Molina said, they were all too aware that the tools under their fingers were becoming obsolete. “We were tempted to make changes, go back and fix things, but we decided not to,” she said. “We’re making history here. This is a time capsule, so let’s leave it at that. As we’ve already seen over the six months of making this film, the improvement in quality and control has been just phenomenal.”
