Folks,
Let me start by making it clear that I am not really a gamer or game developer. I am a producer for film and TV. I have been in and out (mostly out) of the CGI world for a long time, so I understand the basics and can usually find my way around related software, but until recently I wasn’t even aware of Unreal Engine.
For no apparent reason, about six months ago, someone sent me a link to Unreal Engine and said, “You’re welcome.” Being curious, I dug into it. I gave myself a 30-day window to see how much I could learn and achieve. The results are here: https://youtu.be/DV5l7d4b9Cs
During that time, I began to wonder if I could use UE4 purely as a rendering engine for an animated project, specifically a TV show. I had only one previous experience with a GPU-based renderer, which had its pros and cons versus something like V-Ray. The experience had been mostly positive, but that company is no longer around and development has ceased. It was not a game engine, though, and it behaved and could be treated almost like a normal renderer.
Unreal, on the other hand, while extremely powerful and capable of amazing results, is a game engine first, and over the course of my next test I came across many hurdles. Sometimes I felt the constraints were limiting my ability to achieve exactly what I desired, but in the end I was able to get very close to my intended goal. With more time, I realized I could probably get exactly what I wanted; however, as this is just a test, I have to draw a line somewhere and start putting effort into the next phase of the project.
https://youtube.com/watch?v=a3rH2nl3_5k
The first major hurdle/learning experience for me was how to deal with lighting. In a regular renderer, a spot light is a spot light; you don’t have to deal with Static, Stationary, or Movable. This juggling of light types, counts, and an assortment of properties one doesn’t encounter in regular rendering engines baffled me. I constantly felt like I was having to cheat or trick Unreal into working for me. On top of that, there are the issues one encounters on most platforms: bugs. To be honest, not many plagued me, but one in particular had me almost in tears for about a month. Being new to Unreal, I simply went with the idea that the problem had to be my fault because I had no clue what I was doing. It turned out to be a bug that few people would ever stumble upon. It has since joined the ranks of others in need of attention.
I am producing a TV show that is a bit rough around the edges, like many of the shows that air on Cartoon Network’s Adult Swim. These aren’t Pixar-quality productions. The pipeline has to be inexpensive and fast, requiring only a small production team. After a lot of research into the various elements, I decided to make a test video using what I believe will be the final pieces of technology for production of the actual show. The above video is the result of that test, with a few things missing.
Technical information:
- Environment: Modo
- Characters: Character Creator from Reallusion iClone Suite
- Character Animation: Mocap with Perception Neuron
- Mocap Cleanup: MotionBuilder & iClone
- Textures & Materials: Allegorithmic Substance Designer
- Post: Adobe Premiere, Audition, and After Effects
What’s missing from UE4:
- Vertex Animation!!!
- Timeline skeletal bone control
- Rendering layers
- Rendering from a specified camera
- Script control
- Better anti-aliasing
What’s missing from the video (my pipeline):
- Facial Mocap
- Cloth Sim
- Better Hair sim (will probably use HairWorks for final production)
When I started, I saw several problems in front of me that would normally have made me shy away from Unreal as a rendering solution, but I was also aware of several UE4 features that were “coming soon.” I put that in quotes because I have since learned that the UE4 roadmap is sometimes like a pirate’s treasure map: you may never get what you are looking for, even if it is on the map. This couldn’t be more true of Vertex Animation support. When I started, it was on the map for last October. I thought, great, it’s right around the corner. You can guess how that went. That was the first limiting factor. I could have painstakingly worked around it, but I didn’t see the point for the TEST; in the future, there will either be vertex animation support or I will move on to a different renderer. Since Vertex Animation has come off the wishlist/backlog, I think we might have it soon.
It would have been possible to produce the above video without the Sequencer, but I wouldn’t have wanted to, and I certainly wouldn’t pursue the test or any future cinematic without it. I knew for a fact that this massive feature was coming to the engine, so I focused on modelling, which took about six weeks and included two other big locations not used in the video.
I always knew the only way to produce the animation in a timely manner was to use mocap. I had ordered a Perception Neuron rig and was told it would arrive in late November. That came and went, and I was informed it would be sometime in Q1 of 2016. At the time, that could have been four months away, which didn’t make me happy, and I actually put the project on hold. My luck changed a few days before Christmas when the rig showed up. In addition, the master build of 4.11 was on GitHub, which meant I finally had access to the beginnings of the Sequencer.
“Stuttering” John, a well-known stand-up comic, was looking at my work and was intrigued. I had another guy come over and they both suited up. We did the mocap in about 20 minutes. I spent more time on cleanup than I normally would because I was new to the process, new to the rigs, and hadn’t done the mocap properly; it probably would have been quicker to redo the mocap session. The lip-sync was done through iClone’s automated system, but that really only gets you 50–75% of the way to something usable. iClone works at 60 FPS, so I had nearly 20,000 frames of lip-sync to go through TWICE, once for each character. In the final production pipeline, we will use facial mocap. iClone will have the system available toward the end of the year, in theory. If not, we’ll go with another system.
Working with the Sequencer was great! Matinee and I never got along, so I didn’t bother with it; I knew the Sequencer was on the horizon. Unfortunately for me, 4.12 has many Sequencer features that were not in 4.11, and because I was using the Substance plugin, I could not move to the 4.12 master build. Therefore, I had no Cine Cameras, which would have been great. Again, this was just a test.
EPIC PEOPLE START HERE!
It would have been nice to be able to set up render passes (specifying which objects, which frames, which camera, and so forth), plus have the ability to control the engine from a script. That would allow multiple machines to be automated to render out different parts of the animation, unattended. This project took about 3 hours to render 4 minutes of animation on a GTX 980 Ti, and the real reason it took this long was that I was rendering out to EXR files. Render passes of specific layers/objects would also make it much easier to composite with other sources, such as VFX from Max or Maya.
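To make the scripted, multi-machine idea concrete, here is a minimal sketch of what such a controller script could look like: it splits a shot's frame range into chunks and builds one unattended render command per machine. This is hypothetical; the `UE4Editor` flags shown are modeled on the engine's later automated Sequencer capture arguments and were not available to me in 4.11, and the project/sequence paths are made up for illustration.

```python
# Hypothetical render-farm controller sketch. The command-line flags are
# illustrative assumptions, not a documented UE 4.11 feature.

def split_frames(start, end, machines):
    """Partition an inclusive frame range into near-equal chunks, one per machine."""
    total = end - start + 1
    base, extra = divmod(total, machines)
    chunks = []
    cursor = start
    for i in range(machines):
        size = base + (1 if i < extra else 0)  # spread the remainder evenly
        if size == 0:
            continue
        chunks.append((cursor, cursor + size - 1))
        cursor += size
    return chunks

def build_commands(project, sequence, start, end, machines):
    """One unattended render command per machine, each covering its own chunk."""
    return [
        (f'UE4Editor "{project}" -game '
         f'-MovieSceneCaptureType="/Script/MovieSceneCapture.AutomatedLevelSequenceCapture" '
         f'-LevelSequence="{sequence}" '
         f'-MovieStartFrame={s} -MovieEndFrame={e} -MovieFormat=EXR')
        for s, e in split_frames(start, end, machines)
    ]

# 4 minutes at 60 fps = 14,400 frames, spread over 3 machines
for cmd in build_commands("Show.uproject", "/Game/Shots/Seq01", 0, 14399, 3):
    print(cmd)
```

Each command could then be dispatched to a machine over SSH or a job queue; the point is simply that script control of the engine would turn a 3-hour single-GPU render into an embarrassingly parallel job.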
There are some things I wish the Sequencer could do, or do differently. In this project, I had to blink the characters’ eyes in the engine. The animation of the blink morph was not embedded in the FBX, so I wired up a few nodes in the Level Blueprint and my problem was solved; the blink custom event was called by the Sequencer. The catch is that morph animation is not represented when scrubbing through the timeline. You can only see it if you PLAY the level, which I don’t do, or render it out.
Since the timeline lets you keyframe assets, it would be nice if you could also keyframe bone transformations. I had a problem with intersecting geometry on a character, and the solution was to go outside of UE4, tweak the animation, and then go through the export-import process again, which can be a major undertaking, especially since it is a trial-and-error situation. Had I access to bone transforms directly from the Sequencer, the problem would have been fixed in two minutes.
Wrap Up! (Does this guy ever shut up?)
In the end, while there were times I was pulling my hair out, it was a great experience. The engine gave me nearly instant feedback while setting everything up and blocking out my scene, and when it came to camera movement and placement, it was incredible. Moving forward, I will be working on a branch with several GameWorks components, which will go a long way toward filling in several missing features and expediting the pipeline.
Lastly, a HUGE thanks as always to the awesome Unreal community and the amazing Epic staff for help and support through this project. I’ve had the opportunity to meet several folks in person at GDC and the LA meetup. I look forward to building on that network, and collaborating in the future.
Cheers,
Sterling