How to import a fluid simulation from Blender?

Blender has a very nice fluid simulator built in. I’ve used it to generate an animated mesh. Although the mesh animates, it is not a skeletal mesh. How can I import this animation into UE4?

The idea is to have some objects in Blender that mirror the world geometry in the scene, run a fluid simulation in Blender to create the fluid mesh flowing over those objects, then export that animated mesh into UE4 to play the fluid flowing through the scene.

If you watch the Infiltrator demo video about the effects, the presenter talks about using some simple geometry for splashes and such inside the particle system.

Blender does create a separate mesh for each frame of the fluid sim, so you could technically rig something up by importing these and swapping them out each frame.
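As a rough sketch of that swap idea: you'd import every per-frame mesh, hide them all, and show only the one that matches the current frame. The frame-to-mesh mapping is simple enough to sketch in plain Python (the mesh names here are hypothetical; in practice the actual hiding/showing would happen via Blender's bpy API or a UE4 level Blueprint):

```python
def visible_mesh_for_frame(frame, mesh_names, start_frame=1):
    """Return the name of the baked fluid mesh that should be visible
    on this frame; all the others would be hidden (one mesh per sim
    frame, named in bake order)."""
    index = frame - start_frame
    if not 0 <= index < len(mesh_names):
        return None  # outside the baked range: show no fluid mesh
    return mesh_names[index]

# Hypothetical per-frame meshes exported from a Blender fluid bake
meshes = ["fluid_0001", "fluid_0002", "fluid_0003"]
```

This makes the data cost obvious, too: one full mesh per frame of animation, which is exactly why the approach gets heavy.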

However, for flowing water I would suggest a different approach. There are a lot of drawbacks to doing it the way you are thinking; mostly, it will be pretty resource-heavy.

What I would suggest instead is to rely on a more texture-driven approach. You can set up some static geometry to represent the water and then, in UE, use animated shaders to drive the water texture and displacement. If you set up the UV maps properly, you can get the feeling of water flowing around objects and such. Particle emitters in the correct places can give you your splashing effects.
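To illustrate the texture-driven idea: the core of an animated water shader is just scrolling the UVs over time before sampling the texture (a minimal sketch; in a UE material this is what a Panner node does, and the speeds here are made-up values):

```python
def panned_uv(u, v, time, speed_u=0.05, speed_v=0.1):
    """Scroll a texture by offsetting its UV coordinates each frame,
    wrapping back into [0, 1) so the texture tiles seamlessly."""
    return ((u + time * speed_u) % 1.0,
            (v + time * speed_v) % 1.0)
```

Layering two of these pans at different speeds and directions, plus a normal map, is the classic cheap "flowing water" trick.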

If you wanted to get really fancy, you could run a fluid sim in Blender and then use it to bake your bump/normal maps for the simpler geometry in UE, but that approach might be more than is needed. Simpler is often better with these things :)

Ah, hmm, interesting.

One thing to point out is that the fluid sim in Blender is NOT one mesh; a different mesh is generated for every frame of the sim. That’s why it is so heavy data-wise.

A totally different approach: if the player is going to be in a fixed position during the event, you could render the scene in stereo with 360-degree cameras in Blender and set it up so the player can look around and get the depth-perception effect you want, but with everything 100% pre-baked.

Another thing to note is that you can drive actual geometric displacement with textures.

Thanks for the suggestion. This is certainly good advice from a traditional approach. However, I’m approaching this from a VR perspective. It’s very hard to do this convincingly when you view the scene through the Oculus Rift. Because of the depth perception, it’s easy to tell when something is not quite right. So in my scene, I want to put the player in the middle of a valley with a huge dam breaking and a wall of water rushing in. I need all the splashing, sloshing, and idiosyncratic surface geometry associated with a fluid. I understand that doing so will create a mesh with a huge number of vertices, but I don’t mind spending the computing resources to do it, because the water is the main feature of the experience. It’s a one-time event: the player doesn’t need to interact with the fluid, and once the animation is over, the game proceeds.

Yeah, I had a feeling that it would be a different mesh at each frame. Darn!

I like the second suggestion as well. However, it would require figuring out how to render a different 360 video for each eye, which would probably mean going down to a pretty low-level part of the UE4 API and doing some programming. Right now this is just a proof of concept, so I’m not ready to take it that far yet, but that certainly seems like the most efficient use of computing resources for this situation.

Also, this wouldn’t work for version 2 of the Oculus Rift dev kit, since it tracks camera translation for increased immersion. Dev Kit 1 only tracks rotation, but DK2 will track both rotation and translation. So if you lean forward, the in-game camera moves with you.

Maybe I can capture a normal map from the sim? Generate a second, lower-resolution simulation to get a simplified mesh? Then play the normal map on the simplified mesh to drive some of the surface geometry? Is that what you’re suggesting?

Actually, you’d want to do the rendering in Blender from two separate cameras, one for each eye. It would require a different setup in UE for that cutscene, though, for sure.
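The two-camera setup boils down to offsetting each eye from the centre camera by half the interpupillary distance along the camera's right vector (a minimal sketch; ~64 mm is a commonly cited average IPD, and in Blender you'd apply these offsets to two camera objects before rendering each eye's 360 video):

```python
def eye_positions(camera_pos, right_vector, ipd=0.064):
    """Return (left_eye, right_eye) world positions for a stereo pair:
    each eye sits half the interpupillary distance (in metres) from
    the centre camera along its right vector."""
    half = ipd / 2.0
    left = tuple(c - half * r for c, r in zip(camera_pos, right_vector))
    right = tuple(c + half * r for c, r in zip(camera_pos, right_vector))
    return left, right
```

As noted above, this only holds up while the head stays at the render position, which is why DK2's positional tracking complicates the pre-baked approach.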

You could use a mix of meshes with interesting water materials and GPU particles to get the parallax effect you are hoping for.

For something like a dam breaking, you are going to need a mix of lots of small particulates and large volumes of more foamy-looking surfaces. The larger surfaces you could create with a morph target, skeletal mesh, or exported simulation, and then mix mesh emitters and tons of GPU particles with depth collision and velocity grids to add natural motion.

You are most likely going to need a mix of techniques to achieve this result; a pre-baked simulation alone won’t do the trick.

If you download the Elemental content, you can see how we mixed mesh emission with vertex animation and sprites for the lava bursting out of the cracks in the snow in the exterior section.

I hope that helps get you going!