Map an image with coordinates to access parts of it

Hi!

I want to show this image in a sphere, like a sky dome:

I want to show the part of the Milky Way that is visible at this moment. As you know, you can see different parts of the Milky Way depending on the month of the year.

Is there a way to map the image, applied to a static mesh, to rotate it?

If I could have a reference to the whole image, or to a part of it, I could rotate it and show the visible part depending on the month the player is in.

Sorry, I don’t know how to explain it better. Basically, I need a reference inside the image to know how much I have to rotate it. My problem is that I don’t know how to get that reference.

Thanks!

This is polar coordinate mapping, and this is an HDRI texture, so make sure you import it as HDRI.
You can also google for free HDRI textures; there is a nice website with tons of them.

Now solutions:

  • Get (I think) any template from architectural visualization;
  • they all have an HDRI blueprint for the skybox.
  • Migrate that skybox blueprint into your project and change the texture to the one you want.

Or:
  • get that project from the template and just recreate the skybox material.

However, I am not sure if that material uses polar coordinates.

This is how you use the reflection vector on a skybox image; with this you do not need UVs on the skybox sphere. And this reflection vector method uses polar coordinates, just like the texture you have:

To rotate this image, you need to rotate that camera vector before using Transform Vector (or maybe after) :smiley:

And to know how much to rotate: hmm, you kind of need to calculate the up vector on Earth at the place where your camera/player is at that time, then rotate the camera vector in the material to match that Earth vector. You can calculate the Earth up vector in blueprints, then pass it as a parameter into the material.

Follow up:

I am lazy, so I asked my yellow rubber duck (ChatGPT). Here is its answer, and because it is deep AI, check everything below (sometimes it makes silly mistakes that look like THE truth):

  1. Compute Up Vector (Zenith)

Given latitude (φ) and longitude (λ), the up vector in ECEF (Earth-Centered, Earth-Fixed) coordinates is:

UpVector = (cos(φ)cos(λ), cos(φ)sin(λ), sin(φ))

Where:

φ = Latitude in radians.
λ = Longitude in radians.
  2. Compute Local Sidereal Time (LST) for Correct Skybox Rotation

To align the skybox properly with the stars, we need Local Sidereal Time (LST).
Step 1: Compute Julian Date (JD)

First, convert UTC date and time to Julian Date (JD) using:
JD = 367Y − ⌊7(Y + ⌊(M + 9)/12⌋)/4⌋ + ⌊275M/9⌋ + D + 1721013.5 + UT/24

Where:

Y, M, D = Year, month, and day.
UT = Universal Time (hours, fraction included).
Example: If it's February 3, 2025, at 12:00 UTC, calculate JD for that timestamp.
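As a sanity check, the JD formula can be evaluated for that example date. A minimal plain-C++ sketch of the same formula (function name is mine, not UE API):

```cpp
#include <cmath>

// Julian Date from a calendar date and UT hours, per the formula above.
double JulianDate(int Y, int M, int D, double UT) {
    return 367.0 * Y
        - std::floor((7.0 * (Y + std::floor((M + 9) / 12.0))) / 4.0)
        + std::floor((275.0 * M) / 9.0)
        + D + 1721013.5
        + UT / 24.0;
}

// February 3, 2025, 12:00 UTC:
// 367*2025 = 743175; -3543; +61; +3; +1721013.5; +0.5  ->  2460710.0
```

That matches the standard value for 2025-02-03 12:00 UTC (JD 2460709.5 at midnight, plus half a day).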

Step 2: Compute Greenwich Mean Sidereal Time (GMST)
GMST = 280.46061837 + 360.98564736629 (JD − 2451545)

This gives the GMST in degrees.

Step 3: Convert GMST to Local Sidereal Time (LST)
LST = GMST + λ

λ is longitude in degrees (positive east, negative west).

  3. Compute Skybox Rotation

    The HDRI skybox rotation (Yaw) should match LST in degrees.
    Convert to radians if needed:

Yaw = LST mod 360

The up vector gives the correct tilt, and the LST gives the proper skybox rotation.

Final Steps in UE5

Up Vector: Controls the HDRI tilt.
Yaw (LST): Controls the HDRI rotation.
Use SetActorRotation on the skybox.

This should correctly align your HDRI skybox to time and location on Earth. :rocket:

ps.
Use this reply from our future overlords as a guide; you have all the keywords and the algorithm to calculate it. So use Google for more reliable articles.

And pps:
I already found a derp mistake in the last step: do not use SetActorRotation on the skybox; instead rotate the camera vector in the material, just like I posted in my previous reply.

And C++ (again, a deep AI reply):

  1. Compute Julian Date (JD)
double GetJulianDate(int Year, int Month, int Day, double UT_Hours) {
    return 367 * Year 
        - floor((7 * (Year + floor((Month + 9) / 12.0))) / 4.0) 
        + floor((275 * Month) / 9.0) 
        + Day + 1721013.5 
        + (UT_Hours / 24.0);
}
  2. Compute Greenwich Mean Sidereal Time (GMST)
double GetGMST(double JulianDate) {
    return fmod(280.46061837 + 360.98564736629 * (JulianDate - 2451545.0), 360.0);
}
  3. Compute Local Sidereal Time (LST)
double GetLST(double GMST, double LongitudeDegrees) {
    // fmod can return a negative value for western longitudes; normalize to [0, 360).
    return fmod(fmod(GMST + LongitudeDegrees, 360.0) + 360.0, 360.0);
}
  4. Compute Up Vector (ECEF Zenith)
FVector GetUpVector(double LatitudeDegrees, double LongitudeDegrees) {
    double LatRad = FMath::DegreesToRadians(LatitudeDegrees);
    double LonRad = FMath::DegreesToRadians(LongitudeDegrees);

    return FVector(
        cos(LatRad) * cos(LonRad),  // X
        cos(LatRad) * sin(LonRad),  // Y
        sin(LatRad)                 // Z
    );
}
  5. Compute HDRI Rotation (Final Function)

This function calculates the up vector at a specific location and time.

FVector GetUpVectorAtTimeAndLocation(int Year, int Month, int Day, double UT_Hours, double Latitude, double Longitude) {
    // Step 1: Compute Julian Date
    double JD = GetJulianDate(Year, Month, Day, UT_Hours);

    // Step 2: Compute GMST
    double GMST = GetGMST(JD);

    // Step 3: Compute Local Sidereal Time
    double LST = GetLST(GMST, Longitude);

    // Step 4: Get Up Vector
    FVector UpVector = GetUpVector(Latitude, Longitude);

    // Note: LST is computed above but never used here; it should drive the
    // skybox yaw, while the up vector gives the tilt.
    return UpVector;
}

Usage Example

FVector SkyUpVector = GetUpVectorAtTimeAndLocation(2025, 2, 3, 12.0, 40.0, -74.0); // New York, UTC 12:00

This gives you the correct up vector, which you can then use to rotate the HDRI skybox in Unreal Engine 5.

ps.
Again, I have not tested this code yet, or even looked at it. It is too early and I need my coffee now. This is the product of me waiting for it to brew. :wink:

pps.
Waking up, and I just had an idea.
Instead of writing this code in C++ or blueprints, put it in a custom material shader. These are simple float calculations, so you can code them in HLSL (shader language) and have the whole math as a material node.

@Nawrot

Wow!

Thank you very much for all of your answers!

I have to read them carefully. By the way, the images are in OpenEXR format. I don’t know if this format could store information interesting to use it here. You can find the images here: NASA SVS | Deep Star Maps 2020.

Don’t worry about all the astronomical calculations. I’ve already done them using a book and C++.

I’ll come back later when I find how to do it.

Thanks!

Great, so all you need to do is find how to rotate the CAMERA vector to match the local up vector on the surface of the Earth, and add it in that reflection vector part of the material.

As to extracting info from the OpenEXR format, I think you need to code it in C++. But creating your own importer is probably quite complicated, and anything less is basically adding that info manually.

Edit:
I just found a great tutorial about some nice material tricks (like rotating a mesh inside the material shader, not in blueprints), which is great for this sky sphere problem.


I have a lot of questions:

  1. How do you know that the texture I have has polar coordinates?

  2. Where did you get that Blueprint code from, is it from a template?

  3. I have tried several templates and I can’t find one with a blueprint for a skybox. I have only found a material:

  4. How can I know if a material uses polar coordinates? Polar coordinate system - Wikipedia

Thanks!

1: A texture has polar coordinates if the horizon line forms something that looks like a sine function (a wave). Not all HDRI textures use polar coordinates (so you need different reflection calculations to project them onto the sky). On that NASA page with sky textures, the “milky way” texture uses polar coordinates.

2: That blueprint is in most architectural visualization templates; it’s called Light Studio, I think. However, just now (while looking for a tutorial I had seen) I found that there is an HDRI Backdrop plugin for Unreal now.

3/4: I think the first material pic I posted here requires a polar coordinates texture (that is how the reflection vector works). And you need a skybox/sphere mesh to apply the material to; the material does not use the UVs of that mesh, the reflection vector calculation maps it instead.

My first attempt doesn’t work. These are the steps that I have followed:

  1. Downloaded an OpenEXR image from NASA.
  2. Used Gimp to convert it to the HDR image format.
  3. Imported it into Unreal by dragging and dropping it inside the editor.
  4. Changed the HDR image’s properties: Mip Gen Settings to NoMipmaps, Texture Group to Skybox, and Compression Settings to HDR Compressed (RGB, BC6H, DX11).
  5. After that, created an Unlit material with this expression:

But, when I change the value of the Cube map rotation parameter, nothing happens:

Is there a problem with the HDR image?

By the way, these are the details of the Texture that I’m using in the material:

The problem was that the values I was using were greater than 1. :sweat_smile:

LOL.

I am trying to make that material, with rotation inside the material; then I’ll try to do all the calculations from location and time. Just for fun, because your questions pumped me up.

However, I have a problem with correctly importing the NASA EXR textures. Something with the UVs does not work, unlike other stock HDR textures (that use polar coordinates). (Yes, I may install Gimp.)

Btw, you can just manipulate those UVs inside the material; there is no need to transform the whole mesh, as the material does not use the mesh UVs anyway.

Edit:
I found the difference (or what is wrong with the NASA texture): the textures I am using with my material (they are from some space skybox pack) are cubemap textures that use polar coordinates, while the NASA texture just uses polar coordinates. There is some step missing on import, so the result is not what it should be.

Edit 2:

Here is a material that properly projects the NASA Milky Way onto a sky sphere.

It uses rather costly calculations:

  • U = 0.5 + (atan2(Y, X) / 2π)
  • V = (acos(Z) / π)
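Those two formulas can be sketched in plain C++ (assuming a unit-length direction vector; the function name is mine, not a node from the material):

```cpp
#include <cmath>

const double PI = 3.14159265358979323846;

// Map a unit direction vector to equirectangular (polar-coordinate) UVs.
// U wraps around the horizon; V runs from the zenith (0) to the nadir (1).
void DirectionToEquirectUV(double X, double Y, double Z, double& U, double& V) {
    U = 0.5 + std::atan2(Y, X) / (2.0 * PI);
    V = std::acos(Z) / PI;
}
```

For example, a direction along +X (toward the horizon) lands at U = 0.5, V = 0.5, i.e. the center of the texture.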

And converting that NASA skybox into a proper Unreal cubemap would make everything simpler.

I remember seeing a much faster way somewhere, but I cannot find it again.
Those CUSTOM nodes like TEX_Skybox_Color are “Named Reroutes”; they help with material readability and make it easier to plug in certain parts of the shader.

PS.
I see the difference: your texture was converted into a cubemap in Blender, while I used Affinity Photo, which just changed the texture format. I need to learn Gimp. :smiley:


The only thing I did with Gimp was open the EXR file and export it as an HDR file, changing the file extension.

I’ve created a test project with the HDR image and uploaded it to GitHub if you want to check it. I did that because I’m not sure my texture is correct, and I don’t know how to verify it. It blows my mind what you’ve said:

The Milky Way (that dotted wavy line on the NASA texture) should be a straight line on the sky, because the galaxy is a disc and we are in the middle of it. So if the Milky Way is not a straight line around the sky, your UV mapping is bad.

Currently I am trying to make a material that has a static switch between a cubemap and a polar coordinates texture (that is not a cubemap). But there is a problem: the UVs are a vector2 for one and a vector3 for the other.

I just cannot find what exactly the difference is between polar and cubemap; why one wants a vector2 as UVs and the other a vector3.

ps.

Found out why (it’s more like I knew it all along): a cubemap uses a vector to map UVs, which is why you should calculate UVs from the camera vector in the first place. :wink:

Now time to properly rotate it.


If you try the project I have uploaded to GitHub, I think it looks fine, like what we can see in real life. But I’m not sure, because I don’t understand what you mean by a straight line on the surface of a sphere.

I can also see the Texture in 3D if I open it and select the 3D View checkbox:

And it looks like what I see in the Unreal Editor if I click Play.

Yes, it looks fine. And you use texture UVs (that are applied to the mesh later), so rotating around an axis works (it is rotating the vertices of the mesh). However, you rotate it around a single axis, and for a long + lat location you need to rotate around two. Which brings up the good old gimbal lock problem (I also have no idea how to chain two of those rotations).

I just made a material with standard material nodes that rotates the reflection vector from the camera. But again, the gimbal lock thingy.

I also created a custom node to rotate a vector (in HLSL), but for some reason it does not work. I am a bit rusty with HLSL.

Now some explanations. Why I do not want to use mesh UVs for the cubemap:

I remember that using the camera vector as the reflection was more precise and produced better results.

Which means that rotating the mesh (or changing its shape) has no effect on the skybox, because the UVs are calculated without using the mesh UVs. The reflection method not being affected by the size or shape of the mesh was another reason to use it: you could have a smaller mesh, and the skybox was not affected by where the player is relative to the skybox mesh.

And the third reason (a bit outdated now) is that when using a low-poly sky sphere you get a deformed skybox texture at triangle edges, i.e. it is composed from flat triangle UVs, while with the reflection method there are no deformations from texture stretching; each pixel is where it should be. However, with today’s high-poly meshes everywhere, it does not matter much.

ps.
Do not worry about me digging into this. I just got pumped by this problem, and I also need a skybox material for an upcoming arch-vis project, where it would be nice to have a material that can rotate the skybox and that uses the camera vector to map UVs.


What I need to do is tilt the Z axis of rotation and then rotate the sphere around that Z axis. Explained with examples:

  1. A player living at the North Pole: we don’t need to tilt the Z axis; we only need to rotate the sky around the Z axis.
  2. A player living at the Equator: we have to tilt the Z axis 90 degrees before starting to rotate the sky around the Z axis.
  3. A player living at the South Pole: we have to tilt the Z axis 180 degrees.
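The three cases above boil down to a single relation: the tilt of the rotation axis is 90° minus the observer's latitude. A trivial sketch (function name is mine; sign conventions may need flipping for your setup):

```cpp
// Tilt of the celestial pole away from the local zenith, in degrees.
// North Pole (lat 90) -> 0, Equator (lat 0) -> 90, South Pole (lat -90) -> 180.
double AxisTiltDegrees(double LatitudeDegrees) {
    return 90.0 - LatitudeDegrees;
}
```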

It’s something like this:

Tilt the celestial sphere (or skybox) and rotate around Z. If it is possible.

There is a difference: rotating around the Z axis vs. rotating around a tilted Z axis.

I have both rotations, and the combined rotation is wrong. I either need to make a custom HLSL node that uses quaternions (which refuses to work atm), or build the combined rotation out of quaternion nodes in the material (and I think there are no quaternion nodes in non-custom materials).

Best would be getting the up vector at (long, lat), matching the skybox to it, then rotating around that new Z (aka the new up vector at long, lat).
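One generic way to rotate around a tilted axis without quaternions is Rodrigues' rotation formula. A minimal C++ sketch (types and names are mine, not UE or material nodes; in a material you would write the same math in HLSL):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 Cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static double Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Rodrigues' formula: rotate v around the unit axis k by angle (radians):
// v' = v*cos(a) + (k x v)*sin(a) + k*(k.v)*(1 - cos(a))
Vec3 RotateAroundAxis(Vec3 v, Vec3 k, double angle) {
    double c = std::cos(angle), s = std::sin(angle);
    Vec3 kv = Cross(k, v);
    double kd = Dot(k, v) * (1.0 - c);
    return { v.x * c + kv.x * s + k.x * kd,
             v.y * c + kv.y * s + k.y * kd,
             v.z * c + kv.z * s + k.z * kd };
}
```

With the axis set to the tilted up vector, this gives the "rotate around the new Z" step in one call, with no gimbal lock. Rotating (1, 0, 0) around the Z axis by 90° yields (0, 1, 0).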

Update: after refreshing my HLSL skills, I got it working:

This material uses CUSTOM node to calculate rotations.

This is the setup for the custom node:

struct ComputeVector {

    float3 RotateVector(float3 R, float Longitude, float Latitude)
    {
        // Convert degrees to radians
        float lambda = Longitude; // * (3.14159265359 / 180.0);
        float phi = Latitude; // * (3.14159265359 / 180.0);
      
        // Compute observer's local basis vectors
        float3 U = float3(cos(lambda) * cos(phi), sin(lambda) * cos(phi), sin(phi));
        float3 E = float3(-sin(lambda), cos(lambda), 0);
        float3 N = float3(-cos(lambda) * sin(phi), -sin(lambda) * sin(phi), cos(phi));
      
        // Construct rotation matrix (column-major)
        float3x3 M = float3x3(E, N, U);
      
        // Transform reflection vector to skybox space
        return mul(M, R);
    }
};

ComputeVector f;

return f.RotateVector(CameraVector,Long,Lat);

And this is the shader code (just copy-paste it into the code part of the Custom node).
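To sanity-check the basis math outside the editor, here is a plain-C++ port of the same calculation (my naming; it assumes longitude and latitude in radians, doubles instead of float3). The East/North/Up vectors should come out mutually orthogonal and unit-length, and transforming the up vector by the (E, N, U) matrix should land on +Z:

```cpp
#include <cmath>

struct V3 { double x, y, z; };

// East / North / Up basis at (longitude, latitude), mirroring the HLSL above.
void ObserverBasis(double lambda, double phi, V3& E, V3& N, V3& U) {
    U = {  std::cos(lambda) * std::cos(phi),  std::sin(lambda) * std::cos(phi), std::sin(phi) };
    E = { -std::sin(lambda),                  std::cos(lambda),                 0.0 };
    N = { -std::cos(lambda) * std::sin(phi), -std::sin(lambda) * std::sin(phi), std::cos(phi) };
}

double Dot3(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
```

If the dot products between E, N, and U are not all zero (and their lengths not 1) for some test angles, the matrix construction in the custom node is where to look.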

Looks like it works, however:

  • I did not check whether positive lat and long really come out positive (the multiply of the camera vector by -1 may be in the wrong place).
  • I did not check whether it correctly rotates to the (long, lat) location (again, they may be flipped, because you move the Z axis (north pole) in the opposite direction; it’s a matter of visualizing where Up goes when you move on the surface of the Earth).
  • Long/lat are in degrees, but I am not absolutely certain. :wink:

ps.
There is no way (that I could find) to calculate all that with quats or matrices in non-custom nodes; that stuff is not exposed.

pps.
Long and lat are not in degrees; I butchered the calculations somewhere. :wink:
Oh, and the vectors are probably swapped, so it rotates around the X axis instead of the Z axis.