Good afternoon everyone, I’m new to the forum and I’m also a bit of a noob with Unreal Engine, so be patient if I don’t use the correct technical terminology.

I’m an archaeologist working on a navigable 3D reconstruction of the site I’m excavating, and this is my question: is it possible to assign the captured photogrammetric 3D model the same geographical coordinates it has in the real world? I would like to print these points on screen so I know the exact location of each structure/object.

I don’t know if I have explained myself well; if not, let me know.

Thanks for the help.

I see a bit of a problem with directly using real-world positions in the editor, because:

- Unreal uses centimeters
- UTM coordinates are usually really large numbers.

Those two factors combined would force you to use huge numbers for object positions, which would be unwieldy. A better approach (I think) is to place all your structures on a virtual plane with known real-world positions; these can be the actual editor coordinates. From those known coordinates you can apply a simple transform equation to get the real-world position and then display it on screen.

Something like:

Real Easting = Constant + Unreal X position/100

Real Northing = Constant + Unreal Y position/100

^That’s basically what you’d have to do: since things in 3D software are positioned about the 0,0,0 origin, you would have to calculate the relative position from that.

Sounds interesting. I’m trying to do something similar.

Could you please show an example using the Simple Transform Equation?

My goal is to use .png heightmaps to generate terrain. Each .png image has georeferenced coordinates.

Thanks

I’m doing the same for real world location data - mapping UTM coordinates to a terrain. It’s pretty straightforward if the following assumptions are true.

#1. Your terrain is centered at x=0 and y=0

#2. You know the real world minimum and maximum eastings and northings for your png / terrain

#3. All of your data is in the same coordinate reference system (CRS). If not then you’ll need to reproject some or all of your data to a single CRS so that everything lines up. I had to reproject my data but ended up doing this outside of UE4.

Given these assumptions, here’s the approach I use to map real world UTMs to UE4 terrains:

XInUU = (((Easting - MinimumEasting) / (MaximumEasting - MinimumEasting)) * TerrainSizeInUU) - (TerrainSizeInUU * 0.5f)

YInUU = -1.f * ((((Northing - MinimumNorthing) / (MaximumNorthing - MinimumNorthing)) * TerrainSizeInUU) - (TerrainSizeInUU * 0.5f))

Easting = real world position in x

Northing = real world position in y

MinimumEasting = minimum real world position in x for the png / terrain you are mapping points onto

MaximumEasting = maximum real world position in x for the png / terrain you are mapping points onto

MinimumNorthing = minimum real world position in y for the png / terrain you are mapping points onto

MaximumNorthing = maximum real world position in y for the png / terrain you are mapping points onto

TerrainSizeInUU = terrain size * 100 (e.g. 403,300 for a terrain that is 4033 x 4033)

Also, my Y data was inverted, so I had to multiply it by -1.