In a spotlight blog post published today to mark the project's completion, the Unreal Engine team at Epic Games announced the debut of Slay, a new real-time animated short from Mold3D Studio. The release is made even cooler by the addition of the short's hero, Windwalker Echo, to the downloadable assets in Unreal Engine 4.27 and Unreal Engine 5 Early Access; the character was first featured in the UE5 reveal video, Lumen in the Land of Nanite. Mold3D and Epic will make the full Slay sample project available for download and exploration later this month.
Mold3D was founded in 2016 by CEO Edward Quintero, a 22-year industry veteran, creative leader, animator, and VFX and environment artist whose résumé includes roles at ILM and DreamWorks Animation and credits on projects such as The Matrix trilogy, Avatar, and The Mandalorian. The studio was formed to explore the potential of real-time technology and content creation after Quintero collaborated with Epic on Unreal Engine projects such as Paragon and Robo Recall.
Quintero pitched Slay to Epic after the UE5 demo wrapped. The proposal was to create a finished piece of final-pixel animated content in Unreal Engine. With Mold3D already gaining a reputation for environment art, the team was excited to demonstrate its expertise in story development and character design as well. With the exception of Windwalker Echo, the Slay assets (including her opponent) were all designed and created by Mold3D.
Because the project was greenlit just as the COVID-19 pandemic struck, the studio quickly switched to a remote working environment to create the real-time rendered animation (along with its other projects) using virtual production techniques. Motion capture was shot in Las Vegas, with Quintero's team in Burbank directing the actors over Zoom while viewing the characters' performances in Unreal Engine in real time, making it easy to confirm they had the takes they wanted. After the main motion capture, the team held a second session with the actor just for facial capture, using the Live Link Face iOS app.
“While we probably would have done a lot of the same things if it hadn’t been for the pandemic, luckily we were able to rely on the virtual production aspect of the shoot to save the day,” Quintero said. “We were able to watch her takes with the recording pulled from the iPhone, and on the same day we could also see the camera view of her performance.”
The team modeled assets in Maya and ZBrush, then blocked out the animation in Maya and brought it into Unreal Engine via FBX, where they also blocked in cameras using Sequencer, Unreal Engine's built-in multitrack nonlinear editor. Taking advantage of the engine's real-time rendering, they brought in animation daily, starting in a very rough state, even while the models themselves were still being finalized. For look development, the team used many Unreal Engine materials and shaders, including vertex shaders and decals, not only to achieve a unique effect but also to help maintain real-time performance.
“It was great to see the previews with light and color. You got a taste of what it was going to look like right away, instead of having to wait a few months before things started to come together. It was invaluable, because it helped us visualize and refine the look along the way,” noted Quintero. “It combined tips, techniques, and processes I've learned over my years working in the visual effects and animation industry with the benefit of being able to iterate and quickly visualize the results in real time.”
Additionally, Mold3D used Unreal Engine’s Landscape toolset to create the terrain, and Quixel Megascans, which are free for use with Unreal Engine, to populate the environment. Effects, such as the glowing orb, were primarily created in Niagara, Unreal Engine’s visual effects system. Lighting played a key role in the look of the project, with the team leveraging Unreal Engine’s real-time ray tracing capabilities to produce sophisticated effects. To refine close-up lighting on the characters, they built movie-style lighting rigs in Unreal Engine, allowing them to create beauty lighting, rim lighting, key lighting, and more.
“We had a certain look that we wanted. I think originally we wanted it to be very stylized, like a manga/anime genre, but then it moved toward trying to make it realistic. It ended up being a bit of a hybrid: not super photorealistic, but with a slightly stylized hue. Lighting was a big part of it; we worked with several lighting artists to get the right look,” explained Quintero. The ability to change lighting quickly in real time was one of the things the team appreciated most: “If it doesn’t work for a shot, you can quickly move the direction of the sun, or quickly relight the character and work your way through it, instead of having to set up the lights, render overnight, come back, check your renders, and not be happy.”
Aside from the obvious benefit of being able to render images in fractions of a second rather than minutes or hours, Mold3D also appreciated being able to run multiple aspects of production at once: for example, working on look development in parallel with animation, and being able to make decisions flexibly, in the context of the visuals, on composition, timing, cameras, and lighting.
“This wouldn’t have been possible without the creative collaboration and continued support of the Epic Games team,” said Quintero. “They have been there for us every step of the way. Our business has grown by leaps and bounds on this project, and we look forward to many more great productions in Unreal Engine.”
Watch Slay on YouTube.