kinect color depth

I’ve been reading the forum to find a way to use the Kinect plugin.

I want to show the RGB image combined with the depth so it looks 3D, similar to a colored point cloud, using Blueprints.

I checked and it looks like the depth stream and color stream are well aligned (I guess the plugin uses the coordinate mapper to put both in the same coordinate system).

I’m kind of new to Unreal so I don’t see how this can be done.
I found this:

It’s a bit similar to what I want: just take the RGB plane/material and make it deform based on the depth values.

Some people commented it could be possible using a heightmap.

But I’m lost. What do you think?

Well, I tried using “World Displacement”,
but it’s not working yet; I get some “peaks” (I don’t know if that’s the right word).

The plugin example shows how to use a material instance and expose the texture node as a parameter, so the Kinect stream is inserted as that texture.

What I did is simply add a second texture parameter to the material to receive the depth stream and use
it as the “World Displacement” input.

I had to invert the depth texture using a “One Minus” node.

I’m guessing my problem requires removing the black or undetected pixels around the body and somehow averaging (i.e. blurring) the pixels to reduce the peaks.

Any ideas?
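For what it’s worth, the kind of cleanup I mean (mask the undetected pixels, invert with “one minus”, then blur to flatten the peaks) could be sketched as a CPU-side pass before the texture ever reaches the material. This is just a NumPy sketch of the idea, not the material graph itself, and the kernel size is a guess:

```python
import numpy as np

def preprocess_depth(depth, kernel=5):
    """Clean a normalized depth frame (values 0.0-1.0) before using it
    as displacement: zero out undetected pixels, invert ("one minus"),
    then box-blur to soften the spikes."""
    d = depth.astype(np.float32)
    valid = d > 0.0                    # Kinect gives 0 where nothing was detected
    d = np.where(valid, 1.0 - d, 0.0)  # "one minus" invert; invalid pixels stay flat
    # simple box (mean) blur, done by summing shifted copies of the frame
    pad = kernel // 2
    padded = np.pad(d, pad, mode="edge")
    out = np.zeros_like(d)
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + d.shape[0], dx:dx + d.shape[1]]
    return out / (kernel * kernel)
```

Inside the material the same two steps would be the One Minus node you already have plus some kind of blur on the depth texture before it feeds World Displacement.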


Actually, the depth and color are not aligned. You have to use the Kinect SDK coordinate mapper to align them.
Additionally, as far as I can see, all the existing plugins alter the depth data: they either give you an average, or just the low byte or the high byte of the depth value.
To get the highest accuracy you need both.
Afterwards you need to make sure you stop the compression on the image in UE4, and then convert the image values from 0-1 back to 0-255 to get the real depth.
I had to write my own plugin to be able to do that.
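To illustrate why both bytes are needed: here is a minimal sketch (plain Python, outside UE4, just to show the arithmetic) of packing a 13-bit depth value into two 8-bit channels and reconstructing it after the texture sampler has normalized each channel to 0-1. The function names are my own:

```python
def split_depth(depth_mm):
    """Pack a 13-bit Kinect depth value into two 8-bit channels
    (low byte and high byte)."""
    return depth_mm & 0xFF, (depth_mm >> 8) & 0xFF

def reconstruct_depth(low_01, high_01):
    """After sampling, each channel comes back as a 0.0-1.0 float;
    scale back to 0-255 per channel and recombine the bytes.
    This only works if texture compression and sRGB are off,
    otherwise the sampled values are no longer the raw bytes."""
    low = round(low_01 * 255.0)
    high = round(high_01 * 255.0)
    return (high << 8) | low
```

Keeping only one byte (or an average) throws away either the fine detail or the range, which is the loss I was describing.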

Hey plangton,

For me the depth and color are OK if I align both manually (offsetting the depth texture).
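By “manually” I just mean shifting the depth image by a fixed pixel offset before using it, roughly like this (NumPy sketch; the offset values depend on your setup, and this ignores the per-pixel parallax that the SDK’s coordinate mapper would handle properly):

```python
import numpy as np

def offset_depth(depth, dx, dy):
    """Crudely align depth to color by shifting the whole frame
    by a fixed pixel offset."""
    shifted = np.roll(depth, shift=(dy, dx), axis=(0, 1))
    # zero the wrapped-around borders so they read as "no data"
    if dy > 0: shifted[:dy, :] = 0
    elif dy < 0: shifted[dy:, :] = 0
    if dx > 0: shifted[:, :dx] = 0
    elif dx < 0: shifted[:, dx:] = 0
    return shifted
```

It’s good enough for my use, even if it’s not accurate at every depth.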

About the depth values: it’s a matter of checking the code, but it’s closed source for this plugin as far as I know.

It’s also really sad that it’s not possible to create a standalone packaged app using it without having to ask opaquemedia to do it.

Anyway, I’m giving lion032’s plugin a try.

Thank you for the help.

In the lion032 plugin the issue is the same: he uses an averaging formula to fit 13-bit depth into an 8-bit color channel. If that’s good enough for you, then you can use it.
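For anyone wondering what that loss looks like in numbers: I don’t know lion032’s exact formula, so this assumes a plain linear scale, but the arithmetic is roughly this:

```python
MAX_DEPTH = 8191  # 13-bit maximum, in millimeters

def depth_to_8bit(depth_mm):
    """Squeeze 13-bit depth into one 8-bit channel (lossy)."""
    return round(depth_mm * 255 / MAX_DEPTH)

def depth_from_8bit(v):
    """Invert the scale; each 8-bit step now covers ~32 mm of depth."""
    return round(v * MAX_DEPTH / 255)
```

So after the round trip, two points roughly 30 mm apart can land on the same value, which is exactly why keeping both bytes preserves the full precision.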

Hi @plangton, what is the name of your plugin that can interpret depth data as a mesh offset? Are there any tutorials around it?