Unreal Engine for film/TV, current state

I’m aware many threads have been created on this, but let’s just fire up another one on where we are right now. :slight_smile:

What are the biggest hurdles right now in getting content already created in Maya into Unreal? The success of UE being used on Star Wars made me want to take another stab at this.

We now have Alembic in Unreal, which is great. USD is coming, and it’s available in Maya via a free plugin from Pixar, I believe. I haven’t tried it, mind you…

The other thing would be UDIMs, which are available via Granite. I haven’t tried that either, so feedback here would be interesting. It would be nice if Epic implemented this natively, though.

Multichannel EXR export at a set frame rate does, I believe, exist and work right now (?).

Has anyone tried doing high-end work with Unreal? Think heaps of 4K UDIMs without popping as mipmaps change, plus plenty of polys; Star Wars battle scenes, say. There’s no need for realtime 24 fps necessarily, as long as UE will output the correctly rendered frames without skipping.

Thanks!

You do realize that the UE4 ILM used to “render” the droid K-2SO was an extremely heavily modified UE, down to the source code, plus a million custom magic tricks and implementations, and very heavy offline compositing with additional render passes NOT rendered in “real time”, in order to get where they got? :) Also, who said it was running in real time, as in real time? It is a render output that makes sure every pixel is super anti-aliased before it exports the frame; don’t expect it to run at 60/30 fps as final, what-you-see-is-what-you-get output. Basically, they turned UE4 into a slightly faster offline GPU renderer.

Also, you think you can get those realtime area shadows (both self-cast and environment-cast), or reflections on moving objects, for free? The fact that the droid is such a simple, smooth model also helps sell the illusion a little better; I doubt they would’ve been able to achieve half as much, if anything, on something a little more complex. Because as soon as things get more complex, the time spent trying to make them render in UE with all the bells and whistles, versus rendering offline on their render farms with far superior results, stops being worth it.

I think this was a nice little experiment to test the early waters in their R&D departments. But this doesn’t mean the current state of film is going realtime anytime soon, or in the foreseeable future. At best, some shots in some areas can be handled this way, like they did a few years back with a custom modification of CryEngine, which was used in Maze Runner and a few other films.

My problem in recent months has been with how misleading the news outlets and PR machines have been about this, making it sound like they are just outputting these images straight out of the stock engine. This couldn’t be further from the truth. I think those who are not aware or don’t have the background when it comes to this sort of information can easily be fooled, and easily impressed into believing that we have reached a point where even Star Wars is being rendered in real time!

It is unfortunate that they can’t put in the extra effort to clarify the details, but I understand the PR machine has to keep turning.

Thanks for the reply.

I’m aware of the things you mentioned. I don’t think they hid the fact that it’s a modified renderer, though maybe some articles/posts did; I didn’t read that many, so you could very well be right.

I don’t care much about full 60 fps. Heck, even if it renders at 1 fps, if it can actually output frames timed at 24 fps, that’s what I’d be interested in knowing. It is assumed you’d have top-of-the-line GPUs as well. A project with spaceships might just be a good option for testing this out: no massive environments 90% of the time, just loads of polys and loads of 4K textures with plenty of UDIMs.

Bottom line, I don’t expect this to be a good enough option right now vs. something like Redshift or whatnot, but I still can’t help but be curious about where UE4 is right now… Maybe I’ll give something a shot.

Thanks.

I think this has always been the case with tech demos vs realtime game performance. To anyone who doesn’t understand game engines it looks like these engines can produce games that look like the demos without any sacrifices (look at Unity’s Adam Demo or UE4’s Kite demo). A real game would have so much more to contend with than just the graphics.

I think it’s the same with rendering for film. Realtime will never be on par with offline renderers, but the question is… how close can they get?
@RumbleMonk - I would give UE4 another try. Sequencer and the new Tonemapper give you a lot of control over creating scenes in UE4 - just don’t expect Pixar-level renders :slight_smile:

While you won’t get Star Wars quality right out of the box, you can get really good results using only UE4; the Hellblade demo is a good example. Especially if you don’t need realtime results, you can do whatever your GPU can handle before crashing. I’ve heard of/seen TV shows (cartoony stuff) and music videos done this way, for example, so it shouldn’t be impossible. Things like UDIMs don’t work without modifications, though.

What you are essentially saying is to turn Unreal into a GPU renderer with extra CUDA cores as a boost.

While this point of view may sound fine on the surface, in practice it is not, and here’s a big reason why:

You have to consider the massive import/export pipeline structure (a huge time sink even once you have it nailed down); special animation bakes and hierarchies every time changes are made or finalized, which also means setting up special rigs to be recognized for Unreal export, another huge time killer on every project; and shader/lighting/reflection limitations (relative to GPU renderers), among so many other factors that require putting gloves on while dealing with any realtime engine, especially for production work of this kind with massive scene elements. Compare that to keeping everything in your 3D package: spend a little extra on hardcore GPUs, get yourself a copy of one of the limitless GPU renderers out there, and get far superior results while still maintaining all your modifications and instant access to amendments in one place, your 3D package of choice.
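
To make the “bakes” point concrete, this is roughly the kind of step I mean, in Maya Python, for the simplest possible case (the “export_root” node name is invented purely for illustration):

    # Bake the export skeleton down to plain keys so the FBX carries no rig
    # logic, since UE4 cannot evaluate Maya constraints or rig nodes.
    import maya.cmds as cmds

    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)
    joints = cmds.ls("export_root", dagObjects=True, type="joint")
    cmds.bakeResults(joints, time=(start, end), simulation=True,
                     sampleBy=1, preserveOutsideKeys=True)

And that is the trivial case; it has to be redone on every change, for every character.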

While all this is good talk and experimentation for UE, it would simply be an utter waste of time to try to do rendering on it in a serious studio outside the usual realtime/semi-realtime requirements (VR, architecture, mocap visualization, etc.), and before you know it, either your TDs or your clients would come knocking.

Also, don’t forget how much extra work the Hellblade guys needed to do to make a scene work, when the same scene would’ve been done in half or a third of the time with a traditional GPU renderer inside a main 3D application.

And at worst, all this for what, a 10 to 20 percent render speed gain in Unreal? You forget you would’ve spent much more than that setting the scene up to render, so the time spent vs. time gained is not ideal at all, especially since user time is always more valuable and expensive than machine time. Get the extra cores and get the job done better, faster, and far more efficiently, as it always has been.

Line by line, and using Franklin Gothic as you are a Star Wars fan.

  1. I feel this is your biggest point, not just because you listed it first, but because Maya is very high up the food chain for people working in 3D animation. I think the concern most have is “To what end?”, as in: okay, you’ve got Maya; why use another toolchain if your only concern is animating something for filmic purposes? See my point?

  2. Yup, it’s all good - but to what end if you are only interested in filmic end results?

  3. It’s there and can be used. Not sure if you mean a licensed copy for free?

  4. I think I get your point on this, but the last thing you want is an endless vindaloop; I guess it depends where your chair is on the post-production factory line. Too many bottlenecks are bad; you want to go smoothly from script to screen.

  5. I think Epic did a great job on the Kite demo; they got the color mix just about right for that, but it would have been better if the final product had been HDR and uncompressed to DCI standards. Maybe that’s just me, but I hold color to higher standards. 4:4:4 is more important to me than poly count and SFX, which, funnily, you did not mention?

I believe the bigger question is: how can we best replicate content from other platforms/processes in the production line with the least effort and the most gain? In other words, if I am working in Maya, I want to hit a button and KA-POW, it’s on my UE4 monitor.

What are you talking about? You’re saying it’s not practical, but the music videos and shows I was talking about already exist. The Hellblade demo was most impressive for the realtime acting, not the graphics in general; those were more “game like” (sure, AAA quality, but anyway). I don’t see why the pipeline would cause issues: import the levels, import the characters, import the animations, and that’s it. Once they’re in the engine, you can just reimport things if they need to change later.

But the biggest issue I have is with the “10 to 20 percent render speed gain”. The Hellblade demo runs at 30 fps. No offline GPU renderer can render that fast, not even one frame per second; you’re looking at minutes at least, hours at most, per frame. And that is assuming you’ve got a render farm/multiple GPUs. When I tested rendering 1080p footage in Unreal, the bottleneck was my HDD (!), yet I still got far faster rendering than I have ever gotten with an offline renderer. And not 20% faster, more like 1000% faster. And that’s probably still pessimistic!
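
For reference, this is roughly how a Sequencer render can be kicked off from the command line (shown as one command, wrapped here for readability; the project, map, and sequence names are placeholders). As far as I understand, the capture runs at a fixed timestep and writes each frame only once it has fully rendered, so you get clean 24 fps output no matter how slowly the scene actually runs, which also answers the 24 fps question earlier in the thread:

    UE4Editor.exe "C:\Projects\MyProject\MyProject.uproject" /Game/Maps/MyMap -game
        -MovieSceneCaptureType="/Script/MovieSceneCapture.AutomatedLevelSequenceCapture"
        -LevelSequence="/Game/Cinematics/MySequence"
        -NoLoadingScreen -ResX=1920 -ResY=1080
        -MovieFrameRate=24 -MovieFormat=PNG -MovieFolder="D:\Renders"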

No one is debating that you can get better quality rendering offline, but no one should be debating either that the possible turnaround speed when rendering with UE4 is way faster. Sometimes that’s what’s important for TV shows and such: if things look almost as good but you get the results way faster, why not do it that way? Whether you want to see it that way or not is up to you. :stuck_out_tongue:

Duplication of effort.

An animator working in Maya is using assets created by the art department; why on earth would someone be interested in doing everything all over again in another program?

Animators animate; that is all they do. They have little to no interest in the before and after, and certainly do not sit back and give ex post facto consideration to using something else to reduce render times. I seriously doubt most animators even know what a render farm actually is, unless they are being groomed for a raise and promotion. They live in a small and extremely niche environment, operating in a keyframe-by-keyframe existence.

A solo animator, sure, I can see that. In truth, they would use an external render farm anyway to submit their work to whoever their client is. Cheaper, faster, more productive. If they needed to produce a PoC, they would render it out in low-poly mode and probably take a few high-end screenshots; those would also be rendered at an external farm.

So - if you have to create the assets in Maya, and the animator does not create those assets but works in Maya for animation…where is the impetus for an animator to use UE4?

I do animation myself, but it doesn’t matter whether it’s for UE4 or Blender or whatever; if it can be exported to .fbx, it works anywhere. There’s no duplication of effort, it’s the same effort. You can use UE4 as a traditional tool by placing animations in Sequencer, or do fancier game-like stuff with logic and such if you want (more suited to realtime than rendering). The latter is what’s used in the game shows that use UE4.

But the impetus, as I said before, is that it’s way faster to render things in an engine that’s optimized for realtime rendering. This will only get better as Sequencer is improved further with every release.

I’m sorry, but some of us disagree. Of course I understand what you are trying to say, and of course we would all love to render things in realtime.

But in its current form, UE is very counterintuitive and just an added layer of unwanted complexity in a pipeline. Maybe for very simple and fast things, but other than that, in a real-world production scenario, it is very hard to see how someone would jump onto this path, unless it is a specific case.

And I still think you are oversimplifying the process of exchange between apps. It is not a simple click-and-export of FBX files we are talking about here (unless you have a chair-and-table turnaround); there are tons of things that need to be configured, and many files exchanged and set up initially so that the exchange can happen well, and that alone takes a lot of time. As I said before, the setup time does not justify the cost of rendering, that is, if it renders at all. And I think you exaggerate the render times of external GPU renderers.

The last time I saw a video clip rendered in Unreal, I can assure you it could’ve been rendered in a scanline renderer with post for better results. I don’t count Hellblade as an example here. We are speaking of work that needs quick turnarounds and quick patches and fixes. Don’t get me started on the Sequencer setup time and horror vs. what you already have ready to go in your 3D app.

And I haven’t even gotten to post-production, render elements, and so on.

I understand you’re saying that some things may work, and I agree they might, but in general I’m afraid we just have to agree to disagree :).

Yes, but why would an animator even care about how long it takes to render something, or whether something is real-time or not?

IF you have to learn another program, that is duplication of effort, and you will have to build the lighting and recreate some elements of what you built in Maya; there is no denying that. So again: to what end?

My export scripts for characters/animations are one-button exports. Do you think I’m exaggerating, or do you know? Because if you have used offline renderers, you would know that they are way slower, even the fastest ones like Redshift.
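
Stripped down, one of those one-button exports boils down to something like this in Maya Python (the root name and path here are placeholders, not my actual setup):

    # Minimal one-click FBX character/animation export for UE4.
    import maya.cmds as cmds
    import maya.mel as mel

    cmds.loadPlugin("fbxmaya", quiet=True)              # ensure FBX plugin is loaded
    cmds.select("export_root", hierarchy=True)          # placeholder rig root
    mel.eval('FBXResetExport')
    mel.eval('FBXExportBakeComplexAnimation -v true')   # bake rig motion to plain keys
    mel.eval('FBXExportSkins -v true')
    mel.eval('FBXExportShapes -v true')                 # blendshapes/morph targets
    mel.eval('FBXExport -f "D:/exports/hero_anim.fbx" -s')  # -s = selection only

Bind that to a shelf button and it really is one click.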

The title of the thread is “Unreal Engine for film/TV, current state”, so that’s what I’m talking about. Wherever you sit in the pipeline, if you can see the finished product earlier, you have more time to polish/edit/change things around. Sequencer is basically an NLA, so if you know how those work, Sequencer is very easy to use. And if you don’t, you probably don’t have much interest in TV/film/other linear media anyway.

But as I said, work done in UE4 already exists, so you can just look at it and judge for yourself whether it’s worth it or not. For some cases it is, for some it isn’t. For stuff like TV shows it’s more likely to be worth it; for blockbuster movies, probably not yet.

You’re right, the animator will not care about the tools so long as the tools work. This includes the assets they have been given to work with and the rendering pipeline. Renders from one application are not the same as renders from another.

For a Maya -> UE pipe to work, I’d imagine lookdev would be done in UE; animation would come in as .abc 98% of the time, and the lookdev work would be applied there. A camera pipe seems a minor hurdle, as do the mentioned hierarchy issues, which I don’t actually think would be an issue at all, though that’s based on my own idea of how to set it up. It could be a problem under certain circumstances, but I really wouldn’t know which.

An animator might not care how long something takes to render, but a budget-minded person certainly does. A two-day turnaround to get your passes out vs. 2 seconds is a huge benefit. Realtime lighting vs. waiting 4 minutes to see what you’ve done is huge too.

I’m looking at it for TV work again, but the fact that there is indeed a new pipe to consider, plus a lack of user knowledge, are problems. Those things take some time, but once they’re ironed out, maaaaybe. The big off-putting factor, to be honest, is the small details that might be very hard to fix in UE, like inaccurate off-screen reflections that clients might poke at, or similar.

Well, as an animator, I do care how things work, more so where video games are involved, as the art of content creation is all about how what I do over here translates to over there, and game engines are notorious for sucking out fidelity in favor of engine optimization. To be honest, when someone says it does not matter, it makes me want to hit them with a wet trout.

As for production use, UE4 is “almost” there, but that means nothing until you can have a digital character walking around an environment and believe that it’s all real.

I’m part of a team about to film a short film in a green-screen studio, using Unreal Engine to superimpose the environment around the actors. I’m no game developer, programmer, or graphic designer, but I have to say this system is just unbelievable. I have no education in computer graphics, yet with a few YouTube videos I’ve been able to figure out how to manipulate the environments and props I bought on the Unreal Marketplace. We start filming on June 5 and plan to shoot 53 pages in two and a half days, with 95% of the roles played by children.

I’ve loved the process of putting these environments together the way I like and adding props everywhere to give them a more lived-in, customized feel. I even put up a few posters and changed one environment’s language from Japanese to English on everything, from books to posters, the chalkboard, papers, everything. It’s been a real gem.

The kicker: I have yet to set foot in the studio and see how all this works. I trust the guys who own and run the studio that this is indeed the future; the exec has put just shy of a hundred grand into the production, and it all happens in 6 days. I didn’t even mention that we’re going to attempt to edit this in one day, the very next day after filming, and premiere it the day after that.

I hope this experience will be as incredible as we’ve imagined. I’ve been told we’ll be the first company to produce a mixed-reality production 100% with Unreal Engine. I’ll be sure to drop in and let you know how it goes.

Some of the issues I’ve found so far have been around compatibility in online marketplaces outside the Unreal Marketplace. Take TurboSquid, for instance: they have some things that are compatible with Unreal, but it seems a majority of designers out there are designing for still-image renders, with incompatible, very-high-poly environments and models. Conversion is never guaranteed, and in these cases it would almost be worth building what you find from scratch. The studio had no one to recommend to create our environments, or to try to convert what we’d find in online marketplaces outside Unreal’s. If any of you are designers, programmers, artists, or the like, I have a feeling the guys over at LA Castle Studios might have some work for the best of you, helping out companies like the one I work for.

We’re making a pilot for a TV show in the same fashion in August. Any suggestions for a newcomer to this wonderland of possibilities would be so rad. Does anyone know a good designer with some Unreal Engine environments up their sleeve? What other outlets could I go to to find compatible environments and props? I haven’t had time to try my hand at making custom props in Blender, but I’ve used it before to do some character modeling, and I ended up making a lamp, a table, and a little set backdrop for the model because it looked a bit boring after rendering. Nonetheless, is that something I could put some time into without hitting major problems importing into Unreal Engine? Can I create props or environments in Unreal Engine the way I would in Blender?

I’m really amazed by how far computer graphics have come, and I’m so excited to be part of this transition from real-life sets to Unreal Engine environments!

Hey there. My question is: what is the suitable format for character animation in Sequencer, .fbx or Alembic .abc? Because with .abc I didn’t find a way to load multiple animations onto my Sequencer timeline. Help needed!

If you want to use Alembics, your best bet is to import the .abc as a character skeleton; this converts the .abc into a morph sequence. A very nice feature to have, but it becomes unusable quickly as topology density goes up. It’s also inaccurate.

Alembic support is still experimental, and I would not use it right now. Rig your character as you normally would for UE4 usage and import that, with the animation, as an FBX.

Well, with Live Link added in 4.19, as well as Unreal Studio, we are getting to that edit-over-here, send-it-over-there workflow, where worrying about format options is going the way of the VHS tape :wink:

The thing to keep in mind is that Unreal 4 is a closed editing environment: once content is imported, or sent to it, the engine handles the other stuff for you.