I’m working on a game with spherical gravity, and placing assets on the planet with the default editor camera has been really painful.
In the game itself, my character’s up-vector is constantly aligned to the sphere’s surface normal, while the camera is a child component with its own rotation. That way, no matter where I am on the planet, the character stands upright on the surface and the camera can still look around freely.
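For context, the in-game alignment boils down to something like the sketch below (simplified; the helper name is just for illustration):

```cpp
#include "CoreMinimal.h"
#include "Math/RotationMatrix.h"
#include "GameFramework/Pawn.h"

// Simplified sketch: keep the pawn's up-vector on the sphere's surface normal
// while preserving its current facing direction as much as possible.
static void AlignPawnToPlanet(APawn* Pawn, const FVector& PlanetCenter)
{
	// Surface normal at the pawn's position (points from the planet centre outwards).
	const FVector Up = (Pawn->GetActorLocation() - PlanetCenter).GetSafeNormal();
	const FVector Forward = Pawn->GetActorForwardVector();

	// Rotation whose Z axis is the surface normal and whose X axis stays as close
	// as possible to the current forward vector. Only the pawn is rotated; the
	// camera is a child component, so it keeps its own local pitch/yaw on top.
	Pawn->SetActorRotation(FRotationMatrix::MakeFromZX(Up, Forward).Rotator());
}
```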
I’ve tried using the Level Editor Subsystem and the Unreal Editor Subsystem, but I’ve hit a dead end.
- The Level Editor Subsystem has a callback for when the editor camera moves (`OnEditorCameraMoved`), and I’ve added a delegate there.
- The Unreal Editor Subsystem has functions for getting and setting the camera position and rotation (`GetLevelViewportCameraInfo` and `SetLevelViewportCameraInfo`).
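Roughly how I’ve wired these up (C++ in an editor module; the class and handler names are placeholders, and the exact `OnEditorCameraMoved` delegate signature may differ between engine versions, so treat this as a sketch):

```cpp
#pragma once

#include "CoreMinimal.h"
#include "Editor.h"
#include "LevelEditorSubsystem.h"
#include "Subsystems/UnrealEditorSubsystem.h"
#include "PlanetEditorCamera.generated.h"

UCLASS()
class UPlanetEditorCamera : public UObject
{
	GENERATED_BODY()

public:
	void Register()
	{
		if (ULevelEditorSubsystem* LevelEditor = GEditor->GetEditorSubsystem<ULevelEditorSubsystem>())
		{
			// Fires every time the level viewport camera moves.
			LevelEditor->OnEditorCameraMoved.AddDynamic(this, &UPlanetEditorCamera::HandleCameraMoved);
		}
	}

	UFUNCTION()
	void HandleCameraMoved(const FVector& Position, const FRotator& Rotation,
	                       ELevelViewportType ViewportType, int32 ViewIndex)
	{
		if (UUnrealEditorSubsystem* UnrealEditor = GEditor->GetEditorSubsystem<UUnrealEditorSubsystem>())
		{
			FVector  CamLocation;
			FRotator CamRotation;
			if (UnrealEditor->GetLevelViewportCameraInfo(CamLocation, CamRotation))
			{
				// ...adjust CamRotation so its up-vector points away from the planet centre...
				UnrealEditor->SetLevelViewportCameraInfo(CamLocation, CamRotation);
			}
		}
	}
};
```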
Having access to these sounded like everything I needed, but after registering the delegate I discovered that the editor camera isn’t an actor or a component, so I can’t parent it to an actor. That leads to two issues:
- I can’t apply the up-vector alignment to a parent actor only. If I apply it directly to the camera, I lose the ability to pitch the camera.
- Because the transforms are in world space, the mouse controls for pitching get inverted when I’m on the bottom side of the sphere.
To deal with the inverted pitch controls while upside down, I tried using the Unreal Editor Subsystem functions to override the movement from inside the Level Editor Subsystem callback (i.e. on every camera move, set the position and rotation back to the previously saved values). It doesn’t work: the camera just starts jittering wildly.
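Concretely, the failed attempt looked something like this inside the handler above (`SavedLocation` / `SavedRotation` are just members I cache between callbacks):

```cpp
// Inside HandleCameraMoved: clamp the viewport back to the last saved transform.
if (UUnrealEditorSubsystem* UnrealEditor = GEditor->GetEditorSubsystem<UUnrealEditorSubsystem>())
{
	// Force the camera back to the previously saved location/rotation...
	UnrealEditor->SetLevelViewportCameraInfo(SavedLocation, SavedRotation);

	// ...then cache whatever the viewport now reports, for the next callback.
	UnrealEditor->GetLevelViewportCameraInfo(SavedLocation, SavedRotation);
}
```

My guess is that `SetLevelViewportCameraInfo` itself moves the camera and re-triggers `OnEditorCameraMoved`, so my callback and the viewport’s own input handling end up fighting each other every frame.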
Does anyone know of any other approach that actually works?