Unreal Engine Livestream - Blender to UE4 - Sept 7 - Live from Epic HQ

WHAT
The Blender dream team strikes again! Wright, Mike, and Kevin return to discuss a variety of Blender topics. Wright will show how to import and retarget animations so you can take advantage of Marketplace animations using Rigify. Kevin will chat a bit about Epic’s use of Blender in our recent Fortnite Cinematic trailer. Lastly, we’ll get a “state of Blender” update from Mike. He’ll touch on Eevee and the new 3D format, glTF - and why it matters!

WHEN
Thursday, September 7th @ 2:00PM ET

WHERE
Twitch
YouTube

WHO
Wright - Senior Technical Artist - [@anonymous_user_f0fa612d](https://twitter.com/TomTomAttack)
Mike “Blender Mike” Erwin - 3D Graphics Developer, Blender Foundation - [@anonymous_user_1579d1e6](https://twitter.com/DangerCobraM)
Kevin Vassey - Senior Technical Animator - [@kmvassey](http://twitter.com/kmvassey)
Amanda Bott - Community Manager - [@amandambott](http://twitter.com/amandambott)

If you’ve got questions about Blender, add them in the comments below!

ARCHIVE

Hi,
very nice to hear that you used Blender for the Fortnite trailer!

Unfortunately I can’t catch it live, but I’ll definitely watch it later. You’ll record it, right? :slight_smile:

Yes, it’s immediately available as a Twitch archive; then they edit it and upload it to YouTube as soon as possible :slight_smile:

Whaaaa…??? Blender, oh yes! Thank you, Amanda! :cool:

Wish: Cool! It would be great if Unreal didn’t rename my bones; I use underscores and periods in my bone names.
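In the meantime, a minimal pre-export workaround sketch, assuming the offending characters are periods and the rig object is named "Armature" (both are assumptions; adjust for your file):

```python
# Rename bones before export so UE4's import doesn't have to
# sanitize them. Swapping '.' for '_' is one possible convention.
import bpy

rig = bpy.data.objects["Armature"]  # assumption: your armature object
for bone in rig.data.bones:
    bone.name = bone.name.replace(".", "_")
```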


Request: Also, Alembic files from Unreal to Blender would be lovely, as I tend to go back and forth a lot.


Question: One more important one - is there any development on supporting .blend files, the same way Unity does?


I’d like to know what cool tech art wizardry might be accomplished using Blender and UE4 together. Blender’s got a lot of cloth and physics stuff but apart from baked simulations I rarely see other applications of it.

Basically, it would be nice to see what else you could do with Blender for games and UE4, apart from the usually mentioned stuff.

Excited for this. Always good to see Blender getting more coverage.

Yes, Yes, Yes!

Super excited for this stream!

Nice! Related to Blender and UE working together: apart from baking ambient occlusion, normals, displacement… is it possible to bake lightmaps for a whole level directly in Blender and import them into UE4?

As far as .blend file support goes… Unity just runs Blender’s FBX export behind the scenes, so it only saves you a few seconds, and honestly, exporting manually will force you to keep your files cleaner. You don’t want to package a bunch of .blend files with your game.
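For the curious, you can automate the same conversion yourself with Blender’s headless mode. A hedged sketch (the file names are placeholders):

```python
# export_fbx.py - run without opening the Blender UI, e.g.:
#   blender mychar.blend --background --python export_fbx.py
import bpy

# '//' makes the path relative to the .blend file being exported.
bpy.ops.export_scene.fbx(filepath=bpy.path.abspath("//mychar.fbx"))
```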

I need to play with Alembic!

UE4 does such an amazing job at baking lightmaps… I’m not sure why you would want to do that in Blender, but either way, I don’t think it’s possible.

I could do an APEX example next time (real-time cloth).

Sure! That sounds cool - having a way to do cloth from Blender sounds good. Would there be any benefit compared to using the new in-engine NVIDIA solvers included in 4.16? They have some limitations (multi-layered cloth and so forth) and Blender might have ways around those, so a use case that goes beyond a simple cape would be really nice!

There are a few reasons you might not want to use a Rigify-style rig.

  1. The scene units are never set up correctly to avoid issues, so you have to run a script before exporting every animation to create a new rig with the corrected settings (instead of changing the scene unit settings once and being done with it). This also causes problems with custom rigs: for example, a hair rig (or whatever) in the same scene won’t work properly if the root bone is removed.

  2. The above workaround leaves an unapplied 0.01 scale on the mesh, which could lead newbies to think that having unapplied transforms on meshes isn’t a bad thing - something that could screw them over later.

  3. Running the script replaces the armature with the “fixed” armature, so when you want to animate again you have to delete the “fixed” armature (and the empty) and re-add the normal armature in the armature modifier, because it is removed when you delete the fixed rig. In other words, after every animation export the mesh stops responding to the rig.

  4. Because you have to run a script before exporting every animation, you can’t use other scripts - like a normal batch exporter that writes every action in the file to a separate .fbx - or at least they become way more complicated (see the sketch below for what such a batch exporter looks like).

  5. The bones don’t have the right names, so if you want to retarget you have to select the right mapping in the Retarget Manager. That’s a bit strange for a custom rig - it should just autodetect the bone names (by using the correct names in the rig). We saw this in action on the stream. :stuck_out_tongue:

Rigify was never meant as a rig for games, and anything done to make it work like one is more or less a hack (as you can see in the script). It can work, as shown on the stream, but it takes more setup time than should be necessary.
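Here’s a minimal sketch of the kind of batch exporter meant in point 4, assuming Blender 2.7x and an armature object named "Armature" (both assumptions; the available FBX options vary by Blender version):

```python
# Export every action in the file to its own .fbx (Blender 2.7x API).
import os
import bpy

rig = bpy.data.objects["Armature"]          # assumption: your deform rig
out_dir = bpy.path.abspath("//exported/")   # '//' = next to the .blend file
if not os.path.isdir(out_dir):
    os.makedirs(out_dir)

for action in bpy.data.actions:
    # Assign the action so it's the one that gets baked.
    if rig.animation_data is None:
        rig.animation_data_create()
    rig.animation_data.action = action

    # Export only the rig, with only the current action.
    bpy.ops.object.select_all(action='DESELECT')
    rig.select = True
    bpy.context.scene.objects.active = rig
    bpy.ops.export_scene.fbx(
        filepath=os.path.join(out_dir, action.name + ".fbx"),
        use_selection=True,
        object_types={'ARMATURE'},
        bake_anim=True,
        bake_anim_use_all_actions=False,  # only the assigned action
        add_leaf_bones=False,
    )
```

With a clean setup (correct units, applied transforms) something like this runs as-is; with the Rigify workaround you’d have to regenerate the “fixed” rig inside the loop, which is exactly the extra complexity point 4 is about.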

I prefer the UE4Tools rig (http://www.lluisgarcia.es/ue-tools-addon/) because:

  1. It forces you to use the correct scene setup (0.01 metric), because the rig is built at that size.

  2. No unapplied transformations.

  3. No script is required after generating a rig; you can export the same rig (the control rig with “deform only” ticked) for multiple animations instead of having to create a new one each time.

  4. The ability to run batch export scripts or other crazy scripts (see “UE4 animation retargeted(?) to Blender control rig” on YouTube for retargeting UE4/Marketplace animations to the IK control rig).

  5. Correct bone names for retargeting by default, so you don’t have to waste time on that.

It’s also on GitHub (https://github.com/lluisgarcia/UE4-Tools) and uses templates for the rig, so it could easily be expanded by the community in the future to support other kinds of rigs (four-legged rigs and so on). I think it’s a better alternative, all things considered.

So, if I use default settings in Blender, PHAT is totally broken and gives me a garbage ragdoll. But when I use the 0.01 scale settings, sometimes my mesh disappears or flickers, and the preview shows my mesh scaled 100 times. I’m not sure how I fixed that one, but I do know that when working with a non-default scale, a lot of other tools in Blender break (especially modifiers). That’s why I asked on the stream what exact settings one should use, but it seems you’re all just happy with the default settings and don’t care about PHAT. Can someone experienced shed some light on that mess, please? <3

You need to use the 0.01 scale settings with applied transforms, which means the scale should read 1 on all axes for both the rig and the mesh.

The usual way to fix it from a default Blender setup: change the scene unit scale to 0.01 metric, select the rig and scale it by 100, then press spacebar, run Apply Object Transform, and tick Scale in the lower left; repeat for the mesh. After that, things should import fine.
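The same fix as a script, for reference - a minimal sketch assuming Blender 2.7x and objects named "Armature" and "Body" (the names are assumptions; note that if the mesh is parented to the rig, scaling the rig already scales the mesh, so only the apply step is needed for it):

```python
# Apply the 0.01-metric fix described above (Blender 2.7x API).
import bpy

scene = bpy.context.scene
scene.unit_settings.system = 'METRIC'
scene.unit_settings.scale_length = 0.01  # 1 Blender unit = 1 cm, like UE4

def scale_and_apply(obj):
    """Scale an object by 100, then apply so its scale reads 1 again."""
    bpy.ops.object.select_all(action='DESELECT')
    obj.select = True
    scene.objects.active = obj
    bpy.ops.transform.resize(value=(100, 100, 100))
    bpy.ops.object.transform_apply(scale=True)

scale_and_apply(bpy.data.objects["Armature"])  # assumption: rig name
scale_and_apply(bpy.data.objects["Body"])      # assumption: mesh name
```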

I just saw the stream and was curious whether the link to that destruction VFX build of Blender is up anywhere? I didn’t see it in the YouTube links or anywhere in this post. But it’s great to know that Blender is used in games like Fortnite. People always seem to throw in a mention of how Blender isn’t used anywhere professionally when I tell them I use it. Glad to know my preferred tool is used professionally - it gives me hope as a game developer lol.

You said to scale the rig by 100 - will I end up with a 200m character then?