The Blender dream team strikes again! Mike and Kevin return to discuss a variety of topics in Blender. They'll show how to import animations and retarget them to take advantage of Marketplace animations using Rigify. Kevin will chat a bit about Epic's use of Blender in our recent Fortnite cinematic trailer. Lastly, we'll get a "state of Blender" update from Mike. He'll touch on Eevee and the new 3D format, glTF, and why it matters!
Thursday, September 7th @ 2:00PM ET
I’d like to know what cool tech art wizardry might be accomplished using Blender and UE4 together. Blender’s got a lot of cloth and physics stuff but apart from baked simulations I rarely see other applications of it.
Would be nice to see what else you could do with Blender for games and UE4, apart from the usually mentioned stuff, basically.
Nice! Related to Blender and UE working together: apart from baking ambient occlusion, normals, displacement… is it possible to bake the lighting of a whole level directly in Blender and import it into UE4?
As far as blend file support… Unity just uses Blender’s FBX export so it’s only saving you a few seconds and honestly, it will force you to keep your files cleaner. You don’t want to package a bunch of .blend files with your game.
Sure! That sounds cool. Having a way to do cloth from Blender sounds good; would there be any benefit compared to using the new in-engine NVIDIA cloth solvers included in 4.16? They have some limitations, like multi-layered cloth and so forth, and Blender might have ways to get around those, so a use case that goes beyond a simple cape or something would be really nice!
There are a few reasons you might not want to use a Rigify-style rig.
The Blender scene units are not set correctly to avoid issues, so you have to run a script before exporting every animation to create a new rig with the fixed settings (instead of changing the scene unit settings once and being done with it). This also causes problems with custom rigs: for example, if you make a hair rig (or whatever) in the same scene, it won't work properly if the root bone is removed.
The above workaround leaves an unapplied 0.01 scale on the mesh, which could make newbies think that having unapplied transforms on meshes isn't a bad thing, and that can screw them over later.
When you run the script, the armature is replaced by the "fixed" armature. So when you want to animate again, you have to delete the "fixed" armature (and the empty) and re-add the normal armature in the Armature modifier, because the modifier's armature is removed when the fixed rig is deleted; after exporting an animation, the mesh stops responding to the rig every time.
Because you have to run a script before exporting every animation, you can't use other scripts like normal animation batch-export scripts that export every action in the file to a separate .fbx file. Or at least they become way more complicated.
The bones do not have the right names, so if you want to retarget, you have to select the right mapping in the Retarget Manager, which is a bit strange for a custom rig; the mapping should just be auto-detected (by using the correct names in the rig). We saw this in action in the stream.
Rigify was never meant as a rig for games, and anything done to make it work like one is more or less a hack (as you can see in the script). It can work, as shown in the stream, but it takes more setup time than should be necessary.
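On the bone-naming point: since Rigify's deform bones use names like DEF-upper_arm.L rather than the UE4 mannequin names, a small rename pass before export can sidestep the manual Retarget Manager mapping. A minimal sketch — the mapping table below is illustrative, not the full mannequin skeleton, and the function itself is plain Python so it can run anywhere:

```python
# Illustrative partial mapping from Rigify deform-bone names to
# UE4 mannequin bone names; a real rig needs the complete table.
RIGIFY_TO_MANNEQUIN = {
    "DEF-spine": "spine_01",
    "DEF-upper_arm.L": "upperarm_l",
    "DEF-forearm.L": "lowerarm_l",
    "DEF-hand.L": "hand_l",
    "DEF-thigh.L": "thigh_l",
    "DEF-shin.L": "calf_l",
    "DEF-foot.L": "foot_l",
}

def remap_bone_names(bone_names):
    """Return names translated to the mannequin convention.

    Names without a mapping pass through unchanged, so the function
    is safe to run over helper/mechanism bones as well.
    """
    return [RIGIFY_TO_MANNEQUIN.get(name, name) for name in bone_names]
```

Inside Blender, the same dictionary could drive a loop over the armature's bones, renaming them in Edit Mode before the FBX export, so UE4's retargeting picks the names up automatically.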
So, if I use default settings in Blender, PhAT is totally broken and gives me a garbage ragdoll. But when I use 0.01 scale settings, sometimes my mesh disappears or flickers, and the preview shows my mesh scaled 100 times. Not sure how I fixed that one, but I know that when working with a non-default scale, a lot of other tools in Blender break (especially modifiers). That's why I asked on the stream what exact settings someone should use, but it seems you're all just happy with default settings and don't care about PhAT. Can someone experienced shed some light on that mess please? <3
You need to use 0.01 scale settings with applied transforms, which means the scale should read 1 on all axes for both the rig and the mesh.
The usual way to fix it from a default Blender setup is to change the scene unit scale to 0.01 (metric), select the rig and scale it up by 100, then apply the scale (spacebar search for Apply Object Transform and tick Scale in the lower left, or Ctrl+A → Scale), and repeat the same steps for the mesh. After that, things should import fine.
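The manual steps above can also be scripted so the fix is applied once per file. A rough sketch, assuming the 2.7x-era bpy API and objects named "Armature" and "Mesh" (both names are placeholders for your own scene) — this only runs inside Blender:

```python
import bpy

scene = bpy.context.scene

# 1 Blender unit = 1 cm, matching UE4's centimeter convention.
scene.unit_settings.system = 'METRIC'
scene.unit_settings.scale_length = 0.01

# Scale the rig first, then the mesh, compensating for the unit change,
# and apply the scale so it reads 1.0 on all axes before FBX export.
for name in ("Armature", "Mesh"):
    obj = bpy.data.objects[name]
    obj.scale = [s * 100.0 for s in obj.scale]
    bpy.ops.object.select_all(action='DESELECT')
    obj.select = True              # 2.7x API; 2.8+ uses obj.select_set(True)
    scene.objects.active = obj     # 2.7x API; 2.8+ uses view_layer.objects.active
    bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)
```

Run it from Blender's Text Editor or Python console; afterwards both objects should show a scale of 1.0 in the N-panel, which is what the FBX importer wants.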
I just saw the stream and was curious whether the link to that destruction VFX build of Blender is up anywhere? I didn't see it in the YouTube links or in this post. But it's great to know that Blender is used on games like Fortnite. People kinda always seem to throw in a mention of how Blender isn't used anywhere professionally when I mention that I use it. Glad to know that my preferred tool is used professionally. It gives me hope as a game developer lol.