cubeRenderTarget to 2D texture?

I can’t seem to figure out how to pipe a SceneCaptureCube’s render target into a normal 2D texture… It only seems to work when sampled with something like a world-space reflection vector. I just want to use the resulting lat-long map it creates as a texture on another surface. Is this possible?

1 Like

Nothing? :frowning:

I’m bumping this post because I’m looking for a solution to this too.
You mean that you want the realtime sphere texture?

I’m pretty sure it is just a standard render texture and therefore usable like a normal texture. Do you have a screenshot of what you have tried?

I already posted it, but the answer didn’t work.

Thx

Erodann, sorry, I completely misunderstood your question in the other thread. I thought by “not reflection” you meant you didn’t want it flipped :slight_smile:

This is happening because the engine automatically remaps UV coordinates for cube sources, such as HDR textures or cube targets, to use the spherical format. You can counteract it using math, or you can bring in your texture in a non-HDR format to sample it regularly.

To help with this, Martin Mittring sent me the formula the engine uses under the hood to convert, which is this:

float2((1 + atan2(NormalizedDirection.x, - NormalizedDirection.y) / 3.14159265) / 2, acos(NormalizedDirection.z) / 3.14159265)
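
In other words, the engine takes a normalized direction and turns it into lat-long UVs. As a sketch (DirToLatLongUV is just my name for it, not an actual engine function):

float2 DirToLatLongUV(float3 D) // D = normalized direction
{
    // U: horizontal angle remapped from [-PI, PI] to [0, 1]
    // V: vertical angle remapped from [0, PI] to [0, 1]
    return float2((1 + atan2(D.x, -D.y) / 3.14159265) / 2,
                  acos(D.z) / 3.14159265);
}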

Reversing those operations gives this math:

float2 regUV = (UV.x-1)*3.14159265*2;
regUV = float2(-sin(regUV.x),cos(regUV.y));
return float3(regUV.x, regUV.y, cos(UV.y*3.14159265));

That goes into a Custom node with one input called “UV”. Then you plug a Texture Coordinate into it and it returns a square texture value. Enjoy!

Also, as I said in the other post, thank you so much for the answer, but there is a problem :S

http://img15.hostingpics.net/pics/859048render.jpg

Thank you in advance :slight_smile:

Weird, the Y coordinate still needs to be multiplied by 2. Use this instead:

float2 regUV = (UV.x-1)*3.14159265*2;
regUV = float2(-sin(regUV.x),cos(regUV.y));
return float3(regUV.x, regUV.y, cos(UV.y*3.14159265)*2);

This doesn’t make sense to me, though, since the original code never had that factor of 2 in it… hmmm, I will need to figure out why :slight_smile:

Actually, this is still not quite right. Hold on, don’t worry, I will figure it out in a minute :slight_smile: It’s very close.

Backwards math is sometimes tricky since the order of operations is all reversed.

Haha, indeed, I have a little problem with the top and the bottom.

I’ll wait for you :slight_smile:

Third time’s a charm :slight_smile: FWIW, I am making this a material function using regular nodes (not a Custom node) that will be available in a future UE4 version:

float2 Angles = float2(2 * PI * (UV.x + 0.5f), PI * UV.y);

float s = sin(Angles.y);
float3 Direction = float3(s * sin(Angles.x), cos(Angles.y), -s * cos(Angles.x));

return Direction.xzy;
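
If you want to convince yourself this is the exact inverse, wrap both directions in functions and compose them (a sketch only; LatLongUVToDir is my name, and DirToLatLongUV is the helper form of Mittring's formula quoted earlier):

float3 LatLongUVToDir(float2 UV) // the custom node body above
{
    float2 Angles = float2(2 * 3.14159265 * (UV.x + 0.5f), 3.14159265 * UV.y);
    float s = sin(Angles.y); // always >= 0 for UV.y in [0, 1]
    float3 Direction = float3(s * sin(Angles.x), cos(Angles.y), -s * cos(Angles.x));
    return Direction.xzy;
}

// DirToLatLongUV(LatLongUVToDir(UV)) returns the original UV:
// acos of the swizzled z component recovers UV.y, and atan2 of the
// x and -y components recovers the angle that 2 * PI * (UV.x + 0.5f) encodes.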

2 Likes

Haha, thank you so much RyanB :DD
It’s working! You rock!!

Thank you for posting this!!! I’ll try this tonight!

Can you paste the Blueprint version of the code? Me and C are not friends ;)

That’s HLSL code; you can create a ‘Custom’ node in the Material Editor and copy & paste that in there. That’s all you gotta do!

You need to give it a UV Input (Vector 2) though.

1 Like

Thanks! Unfortunately, it is not obvious to me how to hook this up. (Sorry I’m an idiot)

I put the code into a custom node, made a vector 2 (Constant), but don’t know how to wire it up.

Does the Vector2 go into the Custom node?

Here’s what I have (the texture sample is the renderCubeTarget)

[screenshot of the material graph]

This version needs to be set to “Float 3” output, and the source you hook it up to needs to be a cubemap texture.

Never mind, I got it to work! Here’s a screengrab of the hookups for any dumbarses like myself. I didn’t quite get what Ryan was saying when he said you needed an input called UV, but it finally dawned on me to add that input to the Custom node instead of using the default input.

Many, many thanks!!!

1 Like

Step-by-Step Tutorial

This is for anyone new to 360 rendering from VFX, film or games who would like to produce 360 captures for YouTube 360, Google Cardboard, Gear VR etc. If rendered at high quality (4K, with a lossless image format such as PNG / BMP / TGA), this can be a viable, fast and FREE alternative to 360 panoramic rendering from standard 3D packages such as Max, Maya or C4D.

Pros of using this technique:

  1. Produces 1080p renders for YouTube 360
  2. Quick to set up and get testing
  3. It’s easy for beginners
  4. Can be produced in a Blueprint only project
  5. Renders quickly

Cons of using this technique:

  1. Lossy image quality (look at the other thread for a 4K solution)
  2. Capped at your screen resolution (unconfirmed)
  3. Reflections and Fog particles can cause Seams

TUTORIAL

  1. MODES - Create a Scene Capture Cube
  2. CONTENT BROWSER - Add New - Materials & Textures - Cube Render Target, call it “T_CubeRenderTarget”
  3. Edit T_CubeRenderTarget - Size X to 2048
  4. Return to Scene Capture Cube - Set the Render to Texture Target as T_CubeRenderTarget
  5. Uncheck & Check tick box “Capture every frame” to refresh the texture
  6. Content Browser - Create new Material called “M_360CaptureMaterial”
  7. Set Material Shading Model to UNLIT
  8. Drag in your 360 T_CubeRenderTarget Texture as a Texture Sample, Hook into ‘Emissive Color’
  9. Create a new “Custom” node
  10. Add the following code into the Code slot:
    float2 Angles = float2(2 * PI * (UV.x + 0.5f), PI * UV.y);
    float s = sin(Angles.y);
    float3 Direction = float3(s * sin(Angles.x), cos(Angles.y), -s * cos(Angles.x));
    return Direction.xzy;
  11. Add a 2nd Input, name it ‘UV’
  12. Create a Texture Coordinate, and plug it into input 2
  13. Search for a Plane in the Content Browser - if there is no Plane, create one in a 3D modelling package. Polycount is irrelevant
  14. Apply M_360CaptureMaterial to the Plane
  15. Set Scale to X=2, Y=1, Z=1 which gives us a 2:1 aspect ratio surface to capture from
  16. Create a standard Camera in the world, facing the plane front on
  17. Set the Location of the Camera to be the same in the X & Z, but offset in the Y to give some distance (the specific distance is irrelevant)
  18. Edit the Camera Settings to be:
  • Name: 360_Orthographic_Camera
  • Projection Mode: Orthographic
  • Ortho Width: (size to fit the Plane; see the note after this tutorial)
  • Constrain Aspect Ratio: On
  • Aspect Ratio: 2.0
  19. Create a Matinee called MAT_360
  • Select 360_Orthographic_Camera in your scene
  • Right-Click - Add New Camera Group
  • Name Your Group
  • Create a new Director Group
  • Add Key - Choose your Camera Group
  20. Animate your Scene Capture Cube through the world.
  21. Select your Matinee
  22. Open the Level Blueprint
  23. Right-Click, create a reference to your Matinee
  24. Drag out a wire from the reference, add a ‘Play’ function
  25. Hook the ‘Play’ function into Event Begin Play
  26. Add a Delay, set its duration to the same as your Matinee length (in seconds), e.g. ‘30’
  27. Add an Execute Console Command: ‘quit’
  28. SAVE YOUR WORK
  29. Open Matinee (maximised) and click on MOVIE (Create a Movie, at the far right of the toolbar; you may have to expand the toolbar)
  30. Choose your settings:
    Capture Type: AVI / JPEG Image Sequence
    Capture Resolution: 1920 x 960 (height = width / 2 gives a 2:1 aspect ratio; capped at your monitor resolution). FPS: 30
  31. Click OK, wait for the render to complete (or hit ‘tilde’ to bring up the console and type in quit)
  32. Navigate to your Unreal Project Folder:
    …/Saved/Screenshots/Windows
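
One note on the Ortho Width in step 18, since “size to fit Plane” is vague: it just needs to match the world-space width of your plane. As a worked example (assuming the engine’s default Plane mesh, which is 100 units on a side), a Scale of X=2 makes the plane 200 units wide, so an Ortho Width of 200 frames it exactly; adjust to whatever plane you actually use.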

If you’re getting any rendering errors like I did after tying your Scene Capture Cubes into an existing Matinee, don’t panic! Just delete your sequence and try again.
Errors discovered include: incorrect arrangement of cube map faces, black bars around frames, and different scaling of faces.

If you’re seeing seams on your renders like I do, you may have to re-think how you’re approaching screen-space reflections, screen-facing particles and vignettes.

[Will post video results & screenshots soon]

Thanks!

1 Like

Do you mind if I post a Wiki article with your instructions in it? I figure that’ll come in handy in the future.

I’ll link back to the thread too :slight_smile: