Transform data, point clouds, and instances from other DCCs to Unreal?

Hello all!
I’ve been struggling to verify some information and processes for working with Maya or Houdini and UE4. Here’s the kind of scenario I’m dealing with…

In Maya, I have a massive bowl of cereal. Let’s say Lucky Charms or something. I’ve got the breaded X’s and O’s, and I’ve got three types of marshmallows: hearts, stars, and clovers. Each cereal type is highly detailed, like 3000 polys each.

So my asset list is this…


Using some particles or rigid bodies, I’ve dropped 300 instances of each cereal type into the bowl and it looks great: no collisions, a perfect pile of cereal in a big ol’ bowl.

What are my options for getting it into Unreal while maintaining my exact positions and instancing, so it’s renderable?

Going from DCC to DCC, there are a number of methods I would typically use.

  1. Export as FBX with “Preserve Instances” checked. The Unreal docs don’t say one way or the other whether this is supported; from a simple experiment it seems it isn’t, but maybe I did something wrong. I haven’t found any posts claiming that it works, either.
  2. Export a locator or point cloud carrying the transform and orientation data plus which instance to use for each cereal piece, then import the piece types into Unreal and somehow reattach them to the point/locator cloud as instances or HISMs.
  3. Write out a JSON or Python file with the transform and orientation values and which instance each point uses, then somehow interpret that in Unreal to recreate the layout.
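For what it’s worth, option 3’s data file can be tiny. Here’s a minimal sketch of one possible schema in plain Python (the mesh names, transform values, and field names are all hypothetical; the exact format is up to you):

```python
import json

# One record per cereal piece: which mesh to instance, plus a world-space
# transform (location, rotation in degrees, scale). All values hypothetical.
pieces = [
    {"mesh": "SM_Heart",  "location": [12.5, 3.0, 40.2],
     "rotation": [0.0, 90.0, 15.0], "scale": [1.0, 1.0, 1.0]},
    {"mesh": "SM_Clover", "location": [-4.1, 7.7, 39.0],
     "rotation": [22.0, 0.0, 180.0], "scale": [1.0, 1.0, 1.0]},
]

def dump_instances(records):
    """Serialize instance records to a JSON string on the DCC side."""
    return json.dumps({"instances": records}, indent=2)

def load_instances(text):
    """Read the records back, e.g. on the Unreal side."""
    return json.loads(text)["instances"]

# Round-trip check: what goes out should come back unchanged.
assert load_instances(dump_instances(pieces)) == pieces
```

On the Maya side this list could be built from per-instance `maya.cmds.xform` queries; in Houdini, from point attributes.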

Number one would be by far the simplest route, but I’d love to hear of any successful method aside from just redoing all the instancing in Unreal from scratch.

Thanks in advance!


  1. Correct, it isn’t - BUT after you suffer through the import you can replace with the instanced version manually.
    Basically, you can right-click the cereal, choose Replace With, and pick the last mesh you dropped into the level, or something similar.
    You could possibly script this replacement too, but I think it’s more complicated than option 2.

  2. Seems like less work, TBH.
    You can take the FBX into Blender and write a two-minute script to pull out the X/Y/Z locations in world space.
    You can then use that data to move actors in-engine.
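That Blender pass really can be a couple of lines. A sketch (run it from Blender’s Scripting tab; the object name and values in the fallback sample are hypothetical, and only exist so the snippet can be sanity-checked outside Blender):

```python
# Sketch: dump every imported object's world-space location as CSV lines.
# bpy only exists inside Blender; outside it, we fall back to sample data.
try:
    import bpy
    objects = [(o.name, tuple(o.matrix_world.translation))
               for o in bpy.data.objects]
except ModuleNotFoundError:
    objects = [("Heart_001", (12.5, 3.0, 40.2))]  # hypothetical sample

def to_csv_rows(objs):
    """Format (name, (x, y, z)) pairs as CSV lines, 4 decimal places."""
    return ["{},{:.4f},{:.4f},{:.4f}".format(name, x, y, z)
            for name, (x, y, z) in objs]

for row in to_csv_rows(objects):
    print(row)
```

Rotation and scale could be pulled the same way from `matrix_world` if you need the full transform, not just position.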

Option 2 I’m not even considering; it’s essentially the same as option 3 anyway, isn’t it?
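If I do go that route, the Unreal side looks manageable with the editor’s Python plugin. A hedged sketch, assuming a JSON export of transforms (the file name, asset paths, and field names are my own guesses, and the `import` guard lets the parsing half run outside the editor):

```python
import json

def parse_instances(text):
    """Parse exported JSON into (mesh_name, location, rotation, scale) tuples."""
    return [(r["mesh"], r["location"], r["rotation"], r["scale"])
            for r in json.loads(text)["instances"]]

try:
    # Only available inside the Unreal editor with the Python plugin enabled.
    import unreal

    with open("cereal_instances.json") as f:
        records = parse_instances(f.read())
    for mesh_name, loc, rot, scale in records:
        # Asset path is hypothetical; adjust to wherever the meshes live.
        mesh = unreal.EditorAssetLibrary.load_asset("/Game/Cereal/" + mesh_name)
        actor = unreal.EditorLevelLibrary.spawn_actor_from_object(
            mesh, unreal.Vector(*loc), unreal.Rotator(*rot))
        actor.set_actor_scale3d(unreal.Vector(*scale))
except ModuleNotFoundError:
    # Outside the editor: just exercise the parser on a one-record sample.
    sample = ('{"instances": [{"mesh": "SM_Heart", "location": [0, 0, 50],'
              ' "rotation": [0, 0, 0], "scale": [1, 1, 1]}]}')
    print(parse_instances(sample))
```

This spawns one StaticMeshActor per record; for true instancing you’d instead add each transform to a HISM component, but the spawn-then-replace approach above matches suggestion 1.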

Best of luck either way! Sounds like a bit of a pain for a bowl of Lucky Charms :stuck_out_tongue:
(Also, are that many tris really required? Can’t you fake the cereal look by baking the normals/albedo down to a simplified mesh and get the exact same look?)

Thanks for the helpful response; I’ll try your FBX suggestion first!

Regarding the example: it’s hypothetical, and there’s no need for that resolution. But I do plan on pushing some other heavily instanced, high-res static meshes through. I’m currently testing Unreal as a possible augmentation rendering engine and wanted to run some film-level assets through it to see if I can get comparable renders. Real-time is not a requirement for me; 3 fps would be magical for my goals :slight_smile:

I’ll follow up on this thread with whatever solutions I get working.