Cinematic Pipeline with Unreal Engine

Folks,

Let me start by making it clear that I am not really a gamer or game developer. I am a producer for film and TV, but I have been in and out (mostly out) of the CGI world for a long time, so I understand the basics and can usually find my way around related software. Until recently, though, I wasn't even aware of Unreal Engine.

For no apparent reason, about six months ago, someone sent me a link to Unreal Engine and said, "You're welcome." Being curious, I dug into it. I gave myself a 30-day time frame to see how much I could learn and achieve. The results are here: https://youtu.be/DV5l7d4b9Cs

During that time, I began to wonder whether I could use UE4 purely as a rendering engine for an animated project, specifically a TV show. I had only one previous experience with a GPU-based renderer, which had its pros and cons versus something like V-Ray. It was mostly positive, but the company is no longer around and development has ceased. It was not a game engine, however, and it behaved like, and could be treated almost as, a normal renderer.

Unreal, on the other hand, while extremely powerful and capable of amazing results, is a game engine first, and through the course of my next test I would come across many hurdles. Sometimes I felt the constraints were limiting me and my ability to achieve exactly what I desired, but in the end I got very close to my intended goal. With more time, I could probably get exactly what I wanted; however, as this is just a test, I have to draw a line somewhere and start putting effort into the next phase of the project.

https://youtube.com/watch?v=a3rH2nl3_5k

The first major hurdle/learning experience for me was how to deal with lighting. In a regular renderer, a spot light is a spot light; you don't have to deal with Static, Stationary, or Movable. This juggling of light types, their number, and an assortment of properties that one doesn't encounter in regular rendering engines baffled me. I constantly felt like I was having to cheat or trick Unreal into working for me. In addition, there are the issues one encounters on most platforms: bugs. To be honest, not many plagued me, but one in particular had me almost in tears for about a month. Being new to Unreal, I simply went with the idea that the problem had to be my fault because I had no clue what I was doing. It turns out it was a bug that few people would ever stumble upon. It has since joined the ranks of others in need of attention.
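
For anyone else coming from offline rendering, the distinction lives in each light component's Mobility setting. A minimal C++ sketch of what the three modes mean (the function and names are hypothetical; in practice you just pick one in the Details panel):

```cpp
// Minimal sketch of UE4 light mobility (function and names hypothetical).
// In practice you pick ONE mode; all three are shown here for comparison.
#include "Components/SpotLightComponent.h"

void ConfigureKeyLight(USpotLightComponent* KeyLight)
{
    // Static: fully baked into lightmaps. Cheapest, but the light cannot
    // move or change at runtime.
    KeyLight->SetMobility(EComponentMobility::Static);

    // Stationary: position is baked, but intensity/color can change at
    // runtime. Limited to four overlapping stationary lights per area.
    KeyLight->SetMobility(EComponentMobility::Stationary);

    // Movable: fully dynamic, closest to a light in an offline renderer,
    // but the most expensive and with no baked global illumination.
    KeyLight->SetMobility(EComponentMobility::Movable);
}
```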

I am producing a TV show that is a bit rough around the edges, like many of the shows that air on Cartoon Network's Adult Swim. This isn't Pixar-quality material. The pipeline has to be inexpensive and fast, requiring only a small production team. After a lot of research into the various elements, I decided to make a test video using what I believe will be the final pieces of technology for the production of the actual show. The above video is the result of that test, with a few things missing.

Technical information:

  • Environment: Modo
  • Characters: Character Creator from Reallusion iClone Suite
  • Character Animation: Mocap with Perception Neuron
  • Mocap Cleanup: MotionBuilder & iClone
  • Textures & Materials: Allegorithmic Substance Designer
  • Post: Adobe Premiere, Audition, and After Effects

What’s missing from UE4:

  • Vertex Animation!!!
  • Timeline skeletal bone control
  • Rendering Layers
  • Rendering from a specified camera
  • Script control
  • Better anti-aliasing

What’s missing from the video (my pipeline):

  • Facial Mocap
  • Cloth Sim
  • Better Hair sim (will probably use HairWorks for final production)

When I started, I saw several problems in front of me that normally would have made me shy away from Unreal as a rendering solution, but I was also aware of several UE4 features that were "coming soon." I put that in quotes because I have since learned that the UE4 roadmap is sometimes like looking at a pirate's treasure map; you may never get what you are looking for, even if it is on the map. This couldn't be more true of vertex animation support. When I started, it was on the map for last October. I thought, great, it's right around the corner. You can guess how that went. That was the first limiting factor. I could have painstakingly worked around it, but I didn't see the point for the TEST; in the future, there will either be vertex animation support or I will move on to a different renderer. Since vertex animation has come off the wishlist/backlog, I think we might have it soon.

It would have been possible to produce the above video without the Sequencer, but I wouldn't have wanted to, and I certainly wouldn't pursue the test or any future cinematic without it. I knew for a fact this massive feature was coming to the engine, so I focused on modelling, which took about six weeks and included two other big locations not used in the video.

I always knew the only way to produce the animation in a timely manner was to use mocap, so I had ordered a Perception Neuron rig and was told it would arrive in late November. That came and went, and I was informed it would be sometime in Q1 of 2016. At the time, that could have meant four months away. This didn't make me happy, and I actually put the project on hold. My luck changed a few days before Christmas when the rig showed up. In addition, the master build of 4.11 was on GitHub, which meant I finally had access to the beginnings of the Sequencer.

“Stuttering” John, a well-known stand-up comic, was looking at my work and was intrigued. I had another guy come over, and they both suited up. We did the mocap in about 20 minutes. I spent more time on cleanup than I normally would because I was new to the process, new to the rigs, and hadn't done the mocap properly; it probably would have been quicker to redo the mocap session. The lip-sync was done through iClone's automated system, but it really only gets you about 50-75% of the way to something usable. iClone works at 60 FPS, so I had nearly 20,000 frames of lip-sync to go through TWICE, once for each character. In the final production pipeline, we will use facial mocap. iClone will have the system available toward the end of the year, in theory. If not, we'll go with another system.

Working with the Sequencer was great! Matinee and I never got along, so I didn't bother with it; I knew the Sequencer was on the horizon. Unfortunately for me, 4.12 has many Sequencer features that were not in 4.11, and because I was using the Substance plugin, I could not move to the 4.12 master build. Therefore, I had no Cine Cameras, which would have been great. Again, this was just a test.

EPIC PEOPLE START HERE!

It would have been nice to be able to set up render passes (which objects, which frames, which cameras, and so forth), plus have the ability to use a script to control the engine. This would allow the automation of multiple machines to render out different parts of the animation, unattended. This project took about 3 hours to render out 4 minutes of animation on a GTX 980 Ti. The real reason it took this long was that I was rendering out to EXR files. Render passes of specific layers/objects would make compositing with other sources, such as VFX from Max or Maya, much easier.

There are some things I wish the Sequencer could do, or do differently. In this project, I had to blink the characters' eyes in the engine. The animation of the blink morph was not embedded in the FBX, so I wired up a few nodes in the Level BP and my problem was solved; the blink custom event was called by the Sequencer. For starters, morph animation is not represented when scrubbing through the timeline. You can only see it if you PLAY the level, which I don't do, or render it out.
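
For reference, those Level BP nodes boil down to a single morph-target call; a rough C++ equivalent (component and morph names hypothetical):

```cpp
// Rough C++ equivalent of the blink wiring, assuming the character's
// skeletal mesh has a morph target named "Blink" (name hypothetical).
#include "Components/SkeletalMeshComponent.h"

void SetBlink(USkeletalMeshComponent* CharacterMesh, float Weight)
{
    // Drives the morph directly: 0.0 = eyes open, 1.0 = fully closed.
    // This is effectively what the Sequencer-triggered custom event calls.
    CharacterMesh->SetMorphTarget(FName(TEXT("Blink")), Weight);
}
```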

Since the timeline lets you keyframe assets, it would be nice if you could keyframe bone transformations. I had a problem with intersecting geometry on a character, and the solution was to go outside of UE4, tweak the animation, and then go through the export-import process again, which can be a major undertaking, especially since it is a trial-and-error situation. Had I had access to bone transforms directly from the Sequencer, the problem would have been fixed in two minutes.
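
The closest in-engine workaround I'm aware of is an Animation Blueprint with a Transform (Modify) Bone node driven by a variable; a hedged sketch of the C++ side (class, bone, and variable names hypothetical):

```cpp
// Sketch of an AnimInstance exposing a corrective offset that a
// "Transform (Modify) Bone" node in the AnimGraph can read.
// Class and variable names are hypothetical.
#include "Animation/AnimInstance.h"
#include "BoneFixupAnimInstance.generated.h"

UCLASS()
class UBoneFixupAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Additive rotation applied to, say, an upper-arm bone to push it out
    // of intersecting geometry. In the AnimGraph, wire this variable into
    // a Transform (Modify) Bone node set to "Add to Existing".
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Fixup")
    FRotator ArmCorrection = FRotator::ZeroRotator;
};
```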

Wrap Up! (Does this guy ever shut up?)

In the end, while there were times I was pulling my hair out, it was a great experience. The engine gave me pretty much instant feedback when setting everything up and blocking out my scene, and when it came to camera movement and placement, it was incredible. Moving forward, I will be working on a branch with several GameWorks components, which will go a long way toward filling in several missing features and expediting the pipeline.

Lastly, a HUGE thanks as always to the awesome Unreal community and the amazing Epic staff for help and support through this project. I’ve had the opportunity to meet several folks in person at GDC and the LA meetup. I look forward to building on that network, and collaborating in the future.

Cheers,

Sterling

LMAO!

That second video was great :smiley:

Good work all around though!

Congrats, great work! As a filmmaker myself, I enjoyed reading your report. :slight_smile:

Thank you!

At some point, I thought I was writing way too much about this thing, but I put so much time and effort into the animation that I thought it warranted some serious contemplation. I also thought it might be good for Epic to get a little perspective from film/TV people. I know there are other cinematic efforts out there. Might as well throw my two cents in.

:smiley:

Hahah it’s good for your skin!
You can use morph targets instead of vertex animation, right? :slight_smile:

I created a group for Unreal filmmakers to share tips and tricks, join! :slight_smile: https://www.facebook.com/groups/1045465492195829/

I’ll check out your Facebook group.

Morph targets as a substitute for vertex animation have their uses, but they are severely limited and cumbersome. For many uses, I see them as just another way to try to "trick" Unreal into giving you what you want. Of all the things missing from the engine, this one is the most baffling to me. I am fully aware of the challenges it creates, and I'm also aware that it isn't the most "game friendly" feature when it comes to speedy run-time requirements. However, it is in both Unity and CryEngine, I hear people desperately asking for it, and if Epic wants to expand into non-game areas, it is a requirement. Before starting this project, I knew what was missing and what hurdles I would have to overcome. Unfortunately, I believed vertex animation would be released right around the corner from when I started, so I figured I was in a safe place. Sadly, I was seriously wrong.

-S

Cool :slight_smile: I'm curious, because I've never used vertex animation before. Is it just better for quick fixes on your mesh, or for squash-and-stretch type stuff too? I have to try it out. Thanks for the hint hehe

For one, you could do a lot of intense animation work externally in Max or Maya, where you have better control over things like cloth simulation. For example, say you had a character get out of bed and you want the blanket on top to move appropriately; then the character grabs the blanket and drags it off the bed. This would be a rather complex cloth sim and would probably require some manual tweaking on top of the simulation. Forget about trying to do that with a series of morph targets. With vertex animation, on the other hand, you could export the cache (large files for something this complex) and be all set.
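
To put "large files" in perspective: a raw vertex cache stores a position for every vertex on every frame, so it grows fast. A back-of-envelope sketch (all numbers hypothetical):

```cpp
// Back-of-envelope size of a raw vertex cache (numbers hypothetical).
#include <cstdio>

int main()
{
    const long long Verts     = 50000;     // a blanket-sized mesh
    const long long Frames    = 10LL * 30; // ten seconds at 30 fps
    const long long BytesPerV = 3 * 4;     // XYZ as 32-bit floats

    const long long Total = Verts * Frames * BytesPerV;
    std::printf("%lld MB uncompressed\n", Total / (1000 * 1000)); // ~180 MB
    return 0;
}
```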

-S

Aaah, so you can have vertices move in a sequence, not only 0-1 like in a morph target. Now I get the use of it. Thank you! … And yes, I will need that too in the future hehe

Cheers

Among other issues with using morph targets for vertex animation is that they really only give you an approximation, since you are tweening between morphs; this is useless in many situations. I won't mention names, but someone at Epic told me to manually do some vertex painting and then displace the mesh if I needed fine-tuning. Insert facepalm here! Not to mention that you can't do vertex painting on skeletal meshes.
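
The tweening problem boils down to this: a morph target can only slide each vertex along a straight line between two stored shapes, while true vertex animation stores the actual position of every vertex on every frame. A small illustrative sketch (plain C++, names mine):

```cpp
// Why morph tweening is only an approximation: each vertex travels a
// straight line from base to target, regardless of the real motion path.
#include <vector>

struct Vec3 { float X, Y, Z; };

// Morph target: one interpolated position per vertex, weight in [0, 1].
Vec3 MorphedPosition(const Vec3& Base, const Vec3& Target, float Weight)
{
    return { Base.X + Weight * (Target.X - Base.X),
             Base.Y + Weight * (Target.Y - Base.Y),
             Base.Z + Weight * (Target.Z - Base.Z) };
}

// Vertex animation: the exact position of every vertex on every frame,
// so curved paths (cloth folds, ripples) are reproduced faithfully.
Vec3 CachedPosition(const std::vector<std::vector<Vec3>>& FrameCache,
                    int Frame, int VertexIndex)
{
    return FrameCache[Frame][VertexIndex];
}
```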

-S

I am currently carving these words on a gold plate.

I should mention that I did try using tessellation in my character materials, but it created a problem that I didn't spend any time trying to resolve. As a result, the characters look a bit low-poly; they are between 45-60k polys each. This doesn't solve the AA issue; it's just another factor in smoothing out the assets in the scene.

You can see the low poly on the edge of the character’s head:
[screenshot: low-poly edge on the character's head]

You can see it smoothed out with material tessellation, but you also see the resulting problem:
[screenshot: head smoothed by material tessellation, with artifacts]

I think the problem is a result of the mesh having multiple materials on connected polys, or of it being a skeletal, morphing mesh.
[screenshot: close-up of the tessellation artifact]

RE: the anti-aliasing issue, I have considered rendering out at a crazy resolution and then scaling down in post.
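
One engine-side way to experiment with that is the r.ScreenPercentage console variable, which renders at a higher internal resolution and lets the engine downsample; a tiny sketch (the wrapper function is hypothetical):

```cpp
// Hedged sketch: supersampling via r.ScreenPercentage. A value of 200
// renders at twice the resolution on each axis (4x the pixels) before
// downsampling, which softens aliasing at a heavy GPU cost.
#include "GameFramework/PlayerController.h"

void EnableSupersampling(APlayerController* PC)
{
    PC->ConsoleCommand(TEXT("r.ScreenPercentage 200"));
}
```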

-S