RGB-D

Hi!

Is anybody working on RGB-D projects for Unreal Engine? I have seen some work for Unity, used primarily for Augmented Reality applications and for holographic visualization inside 3D scenes, with an Oculus headset and a Kinect sensor for image capture.

I think this kind of thing would be very nice to implement in Unreal.

Here is an example for Unity:

What do you think? I hope to see this kind of thing in Unreal soon :slight_smile:

There are already public plugins supporting the Kinect (an RGB-D camera) and Leap Motion. I believe both give access to the depth image and colour stream. I haven’t seen anyone doing exactly that effect, but you should be able to do so relatively easily (I think it’s just a textured heightmap). The bigger issue is the lack of good-quality miniature depth cameras to attach to the Rift (yes, with a DK2 you could use fixed cameras, but that rather limits your working environment).
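To make the "textured heightmap" idea concrete, here is a minimal sketch (not Unreal code) of how a depth image is back-projected to a grid of 3D vertices with a pinhole camera model. The intrinsics are rough Kinect v1 defaults and are an assumption; calibrate your own sensor for real use.

```python
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image (in metres) to a grid of 3D points.

    fx, fy, cx, cy are pinhole intrinsics; the defaults are
    approximate Kinect v1 values (an assumption, not calibrated).
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # (h, w, 3) vertex grid; triangulate neighbouring pixels to
    # get the textured heightmap mesh, with the colour stream as
    # the texture.
    return np.stack([x, y, z], axis=-1)

# A flat wall 2 m away becomes a regular vertex grid at z = 2:
points = depth_to_points(np.full((480, 640), 2.0))
```

The colour image can then be applied as a UV texture over the same pixel grid, which is essentially all the Unity demo is doing per frame.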

3Gears have a very small depth camera on Kickstarter right now, so that should be useful once it ships.

Thank you Tom. I captured a sequence of Kinect 3D points, processed it in Brekel, and then created a demo in UE. If UE supported Alembic, the result would be more straightforward. Meanwhile I will take a look at textured heightmaps and other alternatives.

Here is the result:

Take a look at this guy with three Kinects:

What do you think? This is what Virtual Reality should be!

Has anyone had any luck exporting the Q3D video (the first YouTube example) to UE4 instead of Unity?

Also, iparra, can you share your workflow for using Brekel with a Kinect and UE4?

Thanks

Hi reedandrader,

Sorry for the late response. With Brekel you can record a sequence of meshes and textures. All you need to do is import them into a folder in your Unreal project and then create a Blueprint. In that Blueprint you call a function that plays the animation by swapping the static mesh and the texture (or the material) each frame. It’s not very direct right now, but I hope we will see this kind of thing built into Unreal very soon.
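The swapping logic in that Blueprint boils down to mapping elapsed playback time to a frame index, then calling Set Static Mesh / Set Material with the assets for that index. Here is a hedged sketch of just that mapping in Python (the `fps` value and looping behaviour are assumptions; Brekel recordings may use a different rate):

```python
def frame_index(elapsed_seconds, num_frames, fps=30.0, loop=True):
    """Return which mesh/texture pair to display at a given time.

    In the Blueprint this would run on Tick or a Timer, and the
    result drives Set Static Mesh / Set Material node calls.
    fps=30 is an assumed capture rate.
    """
    idx = int(elapsed_seconds * fps)
    if loop:
        return idx % num_frames   # wrap around for looping playback
    return min(idx, num_frames - 1)  # clamp at the last frame
```

For example, with a 100-frame capture at 30 fps, 1.0 s of playback shows frame 30, and a looping clip wraps back to frame 20 at the 4.0 s mark.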

Cheers

Hello, I’m working on the same problem, except my source is a photogrammetry OBJ sequence, so thank you for your message saying it’s possible. iparra, I see you are from the Basque Country; France or Spain? I live in Socoa :>
It’s a small world.

Put a depth map on the geo shader in Maya and render it through a camera placed at the position of the Kinect (use the colour texture to help align). Then use that render as a displacement map on a new piece of geo in Unreal.
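The displacement step above can be sketched numerically: each pixel of the rendered depth map pushes the matching vertex of a flat grid along its normal, which is what a displacement (or World Position Offset) material does in Unreal. This is only an illustration; `depth_scale` is a made-up constant you would tune to your scene.

```python
import numpy as np

def displace_plane(depth_render, depth_scale=100.0):
    """Displace a unit grid of vertices along +Z using a depth render.

    depth_render: 8-bit greyscale image rendered from a camera at
    the Kinect's position. depth_scale maps the normalized [0, 1]
    depth to world units (an assumed value; adjust per scene).
    """
    img = depth_render.astype(np.float32) / 255.0  # normalize 8-bit
    h, w = img.shape
    xs, ys = np.meshgrid(np.linspace(0.0, 1.0, w),
                         np.linspace(0.0, 1.0, h))
    # One vertex per pixel; z carries the displacement.
    return np.stack([xs, ys, img * depth_scale], axis=-1)
```

Aligning the render camera with the Kinect’s real pose, as suggested, is what keeps the colour texture registered to the displaced geometry.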