Hello! I need to make a video texture that has depth data so that I can move particles in 3D, like in this example: https://www.youtube.com/watch?v=etSfYfIIoSE
I have two videos: a normal RGB video, and a black & white depth extraction of it that I made with Runway.
What would be the best way to extract the depth info from the second video? I tried reading it from the R or A channel of the depth video, but nothing happened… maybe I need some clever AI process here? Any suggestions? Thanks!
Any help here? Thanks!
Hello @crtq,
I would love to try and help you with this, but I think I need more information to do so. The YouTube link you provided is a 30-minute tutorial, so I’m not sure which part of the video you mean to reference.
Is your final goal to place the particles in 3D space along the depth data, or to place your particles in front of or behind a 2D image that has depth data?
Hi @sarahlenker, and thanks for the answer! I would like to extract the depth info from a video (as you can see in the image, it contains the depth info, but I can’t figure out how to extract it)
and then I would like to use this depth info to move a particle system (which is actually textured with the coloured version of the video)
Basically, I would like to recreate this video in 3D as a particle system.
Hope this makes it clearer and thanks for your time!
Crtq
@crtq Alright, I think I have a solution for you!
So for this, we’re going to sample the depth texture and use the color value to set the particle distance. I’m assuming you are already doing a texture sample for the colored version, but I’ll walk you through my setup just in case. Make sure your emitter is in local space for this, for ease of use.
First, I’m setting a new variable, UV Position, with a random 2D vector value.
Then, we’ll add the Sample Texture node. Use your depth video as the texture here, and plug your new UV Position attribute into the “UV” input. Make sure to check “Transform UV Into Position” and set your UV scale to however big you want your particle system to be.
What we’re doing here is sampling the texture at the particle’s stored UV, so we can accurately get the particle’s position within the texture.
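If it helps to see it as code, here’s roughly what that node setup computes, written as a plain HLSL sketch. This is not literal Niagara syntax: `DepthTexture`, `DepthSampler`, and `SampleDepth` are made-up names, and `RandomUV` stands in for the random float2 stored in the UV Position attribute.

```hlsl
// Plain HLSL sketch of the texture-sample step (not the actual Niagara API).
Texture2D    DepthTexture;   // the black & white depth video
SamplerState DepthSampler;

float SampleDepth(float2 RandomUV)
{
    // Sample the depth video at the UV stored on the particle. Since the
    // video is grayscale, R, G, and B all hold the same depth value, so
    // reading R alone is enough (this is what ends up in "SampledR").
    float4 Color = DepthTexture.SampleLevel(DepthSampler, RandomUV, 0);
    return Color.r;
}
```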
We’re not going to set the particle’s color, because I don’t want to mess with your colored version. If you look at your parameter writes, though, you’ll notice the module is still writing to parameters called “SampledR/G/B…” We’re going to use that value later.
Now, we’ll create a Scratch Pad. Call it whatever you want; I’ve named mine “Set Depth”. Now that we have the sampled color value, we’ll use it to offset the particle position. I’m taking my SampledR value (since it’s a black and white image, it doesn’t really matter that it’s R) and multiplying it by whatever I want my range of depth to be. Then I’m adding that value to the particle’s Z position (the particle system will, by default, be facing upward when sampling a texture).
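The scratch pad math boils down to one line. Here’s a sketch, again in plain HLSL rather than the actual node graph, where `DepthRange` and `ApplyDepthOffset` are placeholder names for whatever depth span and module you set up:

```hlsl
// Sketch of the "Set Depth" scratch pad arithmetic.
float3 ApplyDepthOffset(float3 Position, float SampledR, float DepthRange)
{
    // Brighter pixels in the depth video push the particle further along Z.
    Position.z += SampledR * DepthRange;
    return Position;
}
```

So with, say, a DepthRange of 200, a mid-gray pixel (0.5) lifts its particle 100 units.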
So now, our particles are offset by an amount relative to their UV-sampled color value.
Hope this helps! Let me know if you have any other questions!
They’re moving! Awesome, thank you so much @sarahlenker! I need to tidy up a few things, but if I have any other questions I’ll ask you.
Thank you!