I have volumetric capture data from an Azure Kinect DK. I’m trying to import my OBJ sequence into UE4 with textures so that I can incorporate my actor’s performance into a virtual scene.
Is there any way to properly import OBJ sequences, or to convert the sequence into an Alembic (.abc) file that UE4 can properly read? All my attempts in Maya, Houdini, Blender, and related plugins have failed.
Perhaps there is a tutorial or methodology that I’m missing.
Goal: import/convert an OBJ sequence with textures into UE4.
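One thing worth checking before any conversion attempt: most DCC importers only recognize an OBJ sequence when the files are numbered with consistent zero-padding and no frame gaps, and inconsistent numbering is a common reason the import silently fails. A quick sanity-check sketch (pure Python; it assumes files named like `frame_0001.obj` — adjust the pattern to whatever your capture tool writes):

```python
import re
from pathlib import Path

def check_obj_sequence(folder):
    """Check that an OBJ sequence has consistent zero-padding and no
    frame gaps, which most DCC sequence importers require."""
    pattern = re.compile(r"^(?P<stem>.*?)(?P<num>\d+)\.obj$")
    frames = []
    pad_widths = set()
    for path in sorted(Path(folder).glob("*.obj")):
        m = pattern.match(path.name)
        if not m:
            return False, f"unnumbered file: {path.name}"
        pad_widths.add(len(m.group("num")))
        frames.append(int(m.group("num")))
    if not frames:
        return False, "no .obj files found"
    if len(pad_widths) > 1:
        return False, "inconsistent zero-padding"
    missing = sorted(set(range(min(frames), max(frames) + 1)) - set(frames))
    if missing:
        return False, f"missing frames: {missing}"
    return True, f"{len(frames)} frames, padded to {pad_widths.pop()} digits"
```

Running this on the capture folder before opening Maya/Houdini/Blender at least rules out the naming issue.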
Hi, Mohit. Yes, you’re right - UE4 does support Alembic .abc files (in an experimental state). The goal I’m trying to accomplish is getting an OBJ sequence into UE4 via any method that proves successful. I tried several Alembic (.abc) conversions before posting my original request for help, but they all failed.
Is there a workflow from an OBJ sequence to UE4? Does anyone have the knowledge and time to develop one? I think it would be extremely useful for content creators like me.
As far as OBJ sequences go, I have yet to see someone do it.
Is your character’s bone hierarchy supported by UE4? If not, create the Alembic file without the bones. Just export the mesh and you should be good to go. This was the only way I found to work when the rig was improper.
Also, if this still fails, try switching to Geometry Cache in the Alembic import options in Unreal. That does the trick.
It’s sort of working. I have a 70-frame animation, but when I import the Alembic into Unreal I get 70 separate one-frame animations rather than a single 70-frame animation. That’s where I’m stuck at the moment. Any ideas? Maybe we can figure this out together.
We’ve been having the same issue. The frame rate drops, and the audio and visuals drift out of sync as well. Not to mention that manipulating the data has been a huge challenge - lots of distortion, artifacts, etc. Did you make any headway? Would love to help each other!
Hello,
I have the same issue.
I’ve recently been working on a project involving volumetric capture data, trying to make it work in UE5.
The thing is that the volumetric capture data is a sequence of OBJ files, and I need to somehow make it work in the Sequencer and render a video of it.
I’ve been searching around the internet for a way to make it work, and there just aren’t enough tutorials or articles that might help. I did manage to play the OBJ sequence using Blueprints, but the problem is that the sequence only plays if I ‘play’ the game, not in the Sequencer.
So I want to ask: what exactly should I do to play an OBJ sequence in the Sequencer? Or is there a way to ‘play’ the game and render what gets played on the screen?
I imported the OBJ sequence into Blender or Maya (as I described above), then exported it as an Alembic and imported that into Unreal (which worked using the updated Alembic importer).
As for materials: I imported the per-frame textures as a PNG sequence into Unreal and played it in the Sequencer to match the frames of the Alembic animation. The PNG sequence drives a material that is assigned to the Alembic actor in the scene. See this page in the documentation: Play an Image Sequence in Unreal Engine | Unreal Engine 5.0 Documentation
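One gotcha with that step: Unreal’s image-sequence playback generally expects frames numbered sequentially with uniform zero-padding, so if your capture tool wrote textures like tex_1.png … tex_70.png, it’s worth normalizing the names first so they line up with the Alembic frames. A small sketch (pure Python; the helper name and the `frame_0000.png` target scheme are my own, and it assumes the frame number is the trailing digit run in each source filename):

```python
import re
import shutil
from pathlib import Path

def normalize_png_sequence(folder, out_folder, basename="frame", pad=4):
    """Copy per-frame PNG textures into a uniformly zero-padded
    sequence (frame_0000.png, frame_0001.png, ...), ordered by the
    frame number embedded at the end of each source filename.
    Returns the list of new file names."""
    src = Path(folder)
    dst = Path(out_folder)
    dst.mkdir(parents=True, exist_ok=True)

    def frame_no(path):
        # Sort numerically, so tex_10.png follows tex_2.png.
        m = re.search(r"(\d+)$", path.stem)
        if m is None:
            raise ValueError(f"no frame number in {path.name}")
        return int(m.group(1))

    new_names = []
    for i, path in enumerate(sorted(src.glob("*.png"), key=frame_no)):
        name = f"{basename}_{i:0{pad}d}.png"
        shutil.copy2(path, dst / name)
        new_names.append(name)
    return new_names
```

Point the Image Media Source at the output folder afterwards; the numeric sort matters because a plain alphabetical sort would put frame 10 before frame 2 and scramble the playback order.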