So, I’m trying to recreate the effect of a sun corona. My initial idea was to use a camera-facing plane with a simple texture map, and a ‘Flow Map’ to push the texture outwards towards the edges. As it turns out, the flow map works great, but the camera-facing sprite-like plane only works when you’re directly facing the sun’s sphere mesh. Instead, I’m now using a sphere slightly larger than the base sun mesh, with (basically) a Fresnel mask to restrict the effect to the edges.
My problem is that as I pan around the viewport and stop facing directly towards the center of the sun, my corona starts banding towards the camera’s focal point. Right now that’s because I’m using a reflection vector to build the UV coordinates, but by the looks of it that’s not the right way to go about it. I need to create texture coordinates based on the size of the object on screen. I think I’ve managed to pull that off, but only by using the pixel world position. That’s a problem, because I’m basically trying to make 2D coordinates in 3D space, and as the sun moves towards the edges of the screen the coordinates shift slightly. You can see this in the video as I preview the ‘Constant Bias Scale’ node and pan around the Material viewport.
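For reference, the coordinate generation is roughly along these lines (just a sketch, not the exact nodes; every name here is a placeholder for a material input such as Absolute World Position, Object Position and a radius parameter):

```hlsl
// Center the pixel on the object and normalize by its radius, giving roughly -1..1 across the sphere,
// then remap to 0..1 the same way a ConstantBiasScale with bias 1 and scale 0.5 would.
float3 fromCenter = (AbsoluteWorldPosition - ObjectPositionWS) / ObjectRadius;
float2 uv = fromCenter.xy * 0.5 + 0.5;
// Because .xy is taken in fixed world axes, the mapping drifts as the sun moves towards the screen edges.
```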
EDIT: Just noticed Open Broadcaster bloody cut off; all I’m showing is the coordinate generation. I can try to re-capture if it would help fix the problem.
So I’ve drawn out on paper what I need to achieve, in the hope it makes things easier to understand. I’ve managed to get object-size-based UV coordinates, and also managed to get the directional vector from the object to the camera’s position; now I need to somehow rotate the UV coordinates to face the direction of that Object -> Camera vector. Can one of the shader wizards out there help me out? I thought the Flow Map would be the toughest part, but apparently it’s not… I’m assuming I could use a ‘Rotate Around Axis’ function, but the one that comes with UE4 is somewhat unexplained, and I’m not even sure if UV coordinates are really vectors?
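Roughly what I’m imagining (a sketch, not working material code; ‘angle’ is a placeholder I’d somehow derive from the Object -> Camera vector, e.g. atan2 of its components in the view plane):

```hlsl
// Treat the UVs as plain 2D vectors and rotate them about the center of the gradient, not the corner.
float2 p = uv - 0.5;
float s = sin(angle);
float c = cos(angle);
float2 rotatedUV = float2(c * p.x - s * p.y,
                          s * p.x + c * p.y) + 0.5;
```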
I’ve had another potential idea, but again I need feedback. I need to somehow get the X and Y min and max screen locations of the object’s pixels, and lerp between 0 and 1 to create a gradient based on that. I can get the screen position of a pixel, but not relative to anything else… thoughts?
I didn’t get a good look at the UV math from your earlier post, but I believe if you take Absolute World Position, subtract Object Position WS, and plug the result into your UV math in place of your Reflection Vector, you’ll get the results you’re looking for.
EDIT - Nope, never mind, thought I had it; found my error.
Haha, thanks for looking anyway Eric. I need the likes of Ryan Brucks to help me out on this one…
Basically, thinking about it another way, I want to render a UV gradient on an object that is always aligned to the screen and also respects how large the object is on-screen. So regardless of size, the left-most pixel of the object would always be black (in the green channel), the right-most pixel would always be white, and there’d be a gradient mapped in between them. The reflection vector gets fairly close, and I tried feeding it a camera-aligned flat normal, but that didn’t quite work either.
The extra part of the problem is that the gradient needs to map the object the same way regardless of how much of it is on-screen, so I can’t really use screen-space math to do it (I don’t think), because as soon as part of it goes off-screen, the pixel shader will just take the left-most pixel that’s on screen, not the left-most pixel of the object, if that makes sense.
I just posted it on your Facebook thread as well, but here’s a question: why don’t you simply use a sprite? Just put an emitter at the center of the sun, spawn a camera-facing particle every few seconds, and you should have exactly what you’re looking for.
Hey Kashaar, I replied on Facebook too but I’ll also post here just so everyone’s on the same page
I actually tried a sprite originally, but the problem is that as the distance between you and the object increases or decreases, the relative sizes of the two change completely. I suspect that’s just down to the way the camera’s FOV works, but as you get closer, the sphere takes up more of the screen space and completely hides the sprite. At more glancing angles, or when the sun is near the edge of the screen, some of it is hidden too. It’s just the way the camera-facing math works.
This is why I turned to a sphere. You then see the effect at an equal distance from the sun at all angles, but obviously there’s this issue of distortion as you move around. What I need to do is literally map a gradient onto the sphere that always lines up with the screen-space X and Y positions, and always ranges from 0 to 1. I thought normalizing the screen-space UVs on the object would work, but of course it just normalizes each pixel individually, not the object as a whole. That’s where I’m struggling.
I came up with a solution that seems to work pretty well; it’s a sprite, Jim, but not as we know it. Specifically, it’s a circular triangle fan that is aligned to the camera’s position, then perturbed in the material to intersect the visible horizon, rising normal to the surface. The net effect is that the corona is always visible and is correctly projected no matter which direction you’re looking or how close you are to the surface (within reason). Here’s a zip that contains the material, an example corona mesh, and an example blueprint; just extract it into your Content folder (I hope that’s how it works).
You can UV map the corona mesh however you want, and how it’s aligned to the camera is up to you. In my example I use the tick event of the blueprint, but it should be possible to do it in the material; I just didn’t bother to figure it out. Note that it has to face the position the camera is at, not the direction the camera is looking. Also, if you use non-uniform scaling, you’re going to start getting weird stretching. There are two parameters: the radius of the planet at 1,1,1 scale, and a minimum altitude to stop crazy things happening when you get really close to the surface. The minimum altitude will depend on the thickness of your corona, I think; you’ll just have to play around and see what works.
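If you did want to skip the blueprint, the material-side alignment would probably look something like this as a World Position Offset (untested sketch, placeholder input names; it assumes the fan’s vertices lie in the mesh’s local XY plane around its center and that the mesh isn’t rotated in the level):

```hlsl
float3 toCam  = normalize(CameraPositionWS - ObjectCenterWS);   // towards the camera's position, not its view direction
float3 right  = normalize(cross(float3(0, 0, 1), toCam));       // degenerate if toCam points straight up or down
float3 up     = cross(toCam, right);
float3 local  = AbsoluteWorldPosition - ObjectCenterWS;         // vertex position relative to the mesh center
float3 facing = ObjectCenterWS + right * local.x + up * local.y + toCam * local.z;
float3 worldPositionOffset = facing - AbsoluteWorldPosition;    // plug into World Position Offset
```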
Also, I had an idea that might improve the effect; you could instantiate multiple corona meshes that are fixed in place in world space, and are faded in or out when the angle between where it’s facing and where the camera is gets too large. This would make them seem to be more attached to the sun than the camera. For performance, don’t create a bunch of meshes, just make a few and reuse ones you’ve faded out.
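The fade itself would just be something like this (sketch only, placeholder names; the two parameters are the cosines of the angles where the fade starts and finishes):

```hlsl
float facing = dot(MeshFacingDirWS, normalize(CameraPositionWS - MeshCenterWS));
float fade   = smoothstep(FadeEndCos, FadeStartCos, facing);   // 1 while well aligned, fades to 0 as the angle grows
```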
Tried out your .zip Beej, but it doesn’t work I’m afraid; the sprite doesn’t seem to align itself unless I’m missing a step. Everything done in the Blueprint can be done in the Material anyway (which is more ideal, because this is going into a multiplayer environment).
Also, if anybody is interested, the reason it’s important to have it camera-facing is that I’m using a Flow Map to ensure the corona always flows ‘outwards’ from the sun’s surface.
Hey man that’s awesome! I was waiting for your input on this haha… I can’t take any credit for the solution though. Luckily I’ve been at EGX and have had Josh and Nick (Shader & Rendering Programmers) around to help me out, and we all sat down at the booth for an hour or so and worked out the solution (by we I mean they ;)). That’s why I love Epic… they’re so community/people driven.
What we basically needed to do is cancel out the distortion of the view projection matrix. It works insanely well; so well that even they were happy with it.
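The gist is something along these lines (a sketch of the idea, not the actual node graph; ViewProjectionMatrix, Proj00/Proj11 and ObjectRadius are all stand-ins for whatever you feed in):

```hlsl
// Project both the pixel and the sun's center with the view-projection matrix,
// then express the pixel relative to the projected center and the projected radius.
float4 pixelClip  = mul(float4(AbsoluteWorldPosition, 1), ViewProjectionMatrix);
float4 centerClip = mul(float4(ObjectPositionWS, 1),      ViewProjectionMatrix);
float2 pixelNDC   = pixelClip.xy  / pixelClip.w;
float2 centerNDC  = centerClip.xy / centerClip.w;
float2 radiusNDC  = ObjectRadius * float2(Proj00, Proj11) / centerClip.w;   // approximate on-screen half-size of the sphere
float2 uv = (pixelNDC - centerNDC) / (2 * radiusNDC) + 0.5;                 // screen-aligned, 0-1 across the object
```

The result is UVs that stay screen-aligned and scale with the object’s on-screen size, which is exactly what the reflection-vector version couldn’t do.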
When this project is over I have an insane amount of “tutorialization” to do. I’ve learned so much during this process, and it’ll be cool to share it with everyone.