I have a vehicle traveling several hundred kilometers, with the main camera following the vehicle. I want to create a trace that I can see from an overhead SceneCapture2D blueprint projected onto the HUD. I didn't know how to create a trace over such a large distance, so every second I'm spawning a static mesh sphere. Basically, the client wants to see a trace of where the vehicle has traveled on the HUD while the vehicle is moving, but the trace clutters up the main camera's view.
Is there a way to keep the generated spheres that mark where I've traveled out of the main camera's view, but still have them show up in the SceneCapture2D image on the HUD? Because my SceneCapture2D camera has to be very far overhead to see the entire landscape, the spheres are huge, and when I'm looking through the main camera I don't want to see giant blue spheres scattered across the landscape.
Also, any other suggestions for a better way to see a trail of where the vehicle has been are welcome. I thought that because of the distance, particles may not be the best way to go. I also considered sprites, or procedurally generating a pipe that would follow behind. Either way, I'd still want these to be hidden from the main camera but visible to the overhead SceneCapture2D.
So, from what I know, with a scene capture the scene is actually rendered twice anyway, so I might try setting up a HUD proxy instead. That is: apply a pre-rendered texture (image) of your entire map at whatever resolution you want (if your HUD shows the entire map, I would use a lower-res image the size of the HUD; if your HUD shows a closer, cropped view of the overall map, I would create a texture just high-res enough for the amount of zoom). Apply this texture (possibly with a normal map of the terrain applied?) onto a plane. Put your scene capture camera above this plane (the plane can be very small; you will have to determine the plane size based on how accurate the camera movement needs to be when scaled). You can then spawn your track marks on the small plane where you need them. You can handle the movement of the scene capture camera and the placement of track marks by converting world coordinates to the plane's local coordinate scale. This way your scene capture is not rendering your entire landscape twice (hopefully just the small plane gets rendered, plus the main scene once, not the main scene twice).
So, to hopefully be clearer: a 100 km square world and a HUD mini-map of 250x250 px works out to 0.4 km (400 m) per pixel, if your entire map shows on the HUD. Now, if your mini-map plane (with the 250x250 px image) is scaled to 10000x10000 cm, the world-to-plane ratio is 1000:1, so your scene capture camera moves 1 cm for every 10 m (0.01 km) your vehicle moves in the real world. You can then use your render target as an image for your HUD widget. You can move the scene capture camera closer to or farther from the textured mini-map plane to zoom in/out. Hopefully this makes sense?
I do believe you could create this scene component (the plane with the image) as an actor by itself, so all you have to do is place it in the world somewhere. It could even have its own lighting and ignore the world lighting, with the world lighting ignoring the mini-map actor's lighting in turn. You would place this actor in some far-off spot where it will never be seen in the level. You could then simply drop it into any map to get a mini-map (the mini-map scene capture actor would create its own widget for the map), with a variable for the map texture, the plane scaled proportionally to the pixel dimensions of the map, and a variable to set the scale of 1 px = X cm.