Diegetic HUD/UI?

As part of the space game I’m working on, I’m trying to create a diegetic HUD for the player while they’re in the cockpit of the ship. Rather than have the HUD information render to the screen, I’d like to render it to the cockpit model, so it looks like the MFDs (multi-function displays) on the control console are displaying the information on their screens. I also want the information to update in real-time, to reflect things like the ship’s current speed, heading, fuel, hull integrity, etc. So far, I’ve gotten a very basic mock-up put together where I added a widget component to my player pawn (which also contains the static mesh component for the cockpit) and made the widget display some text, and positioned said widget over one of the MFDs of the cockpit model.

That isn’t a very elegant solution, in my opinion, but I haven’t found a better way to do it, yet. Ideally, I’d like to be able to create a material or texture that displays HUD info, and apply it directly to the cockpit model so it covers an MFD, but I’m not sure what steps to take to do that, or if it would be able to produce the result I want.

If anyone could share some insight on how to best set up a diegetic HUD like the one I’m describing, that’d be great.

You could try something like this, albeit the resolution isn’t the best unless you play around with material scaling within the material editor. Anyway, this is what you’ll achieve:
PREVIEW:

MATERIAL:

BLUEPRINT:


As far as I know, the Widget Component works by taking the render-target output of a UMG widget (drawn from its root Slate canvas) and applying it to a world-space plane, providing the utilities needed to map world-space (3D) clicks to widget-space (2D) user input.

You could keep doing what you’re already doing and add a Widget Component to each of your cockpit’s MFDs, positioned over the display, and that would simply work. Or you could go digging through the wiki and the forums for topics on render targets and write your own system to map the render output of a UMG widget onto a mesh of your choosing.
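If you do stick with the Widget Component route, the per-MFD setup can also be done in C++ instead of by hand-positioning in the editor. Here’s a rough sketch of what the pawn constructor might look like; the class name `ACockpitPawn`, the member names, the widget class path, and the `MFD_Left` socket are all made-up examples, not anything from your project:

```cpp
// Hypothetical sketch: one Widget Component per MFD, snapped to a socket
// on the cockpit mesh so it sits flush over the display. Engine-dependent
// fragment -- this only compiles inside an Unreal project.
#include "Components/StaticMeshComponent.h"
#include "Components/WidgetComponent.h"
#include "GameFramework/Pawn.h"

// In ACockpitPawn's constructor (CockpitMesh and MFDWidget are assumed
// UPROPERTY members declared in the header):
CockpitMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("CockpitMesh"));
RootComponent = CockpitMesh;

MFDWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("MFDWidget"));
// Attach to a socket you've added on the MFD face of the cockpit mesh.
MFDWidget->SetupAttachment(CockpitMesh, TEXT("MFD_Left"));
// World space means it renders on a quad in the scene, not on the screen.
MFDWidget->SetWidgetSpace(EWidgetSpace::World);
// Draw size is the render-target resolution -- raise it if text looks soft.
MFDWidget->SetDrawSize(FVector2D(512.f, 256.f));
```

You’d still assign the actual UMG widget class (via `SetWidgetClass` or the details panel) and scale the component to match the physical size of the MFD bezel.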

As far as having the information update in real time, simply have the actor containing the widget pipe the necessary information to the widget via event calls or setting variables on the UMG instance.
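As a sketch of that "pipe the information in" step: the owning actor can grab the widget instance from the component each frame and call a function on it. `UMFDWidget`, `UpdateReadout`, and the state variables below are illustrative names, not real API:

```cpp
// Hypothetical sketch: push current ship state into the MFD widget every
// tick. Assumes a UUserWidget subclass (UMFDWidget) exposing an
// UpdateReadout function -- e.g. a BlueprintImplementableEvent that the
// UMG graph uses to set its text blocks. Engine-dependent fragment.
void ACockpitPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // GetUserWidgetObject() returns the live UMG instance the component renders.
    if (UMFDWidget* MFD = Cast<UMFDWidget>(MFDWidget->GetUserWidgetObject()))
    {
        // The widget redraws itself; we just hand it the numbers.
        MFD->UpdateReadout(GetVelocity().Size(), CurrentFuel, HullIntegrity);
    }
}
```

The equivalent in Blueprint is the same idea: Get User Widget Object off the component, cast to your widget class, and call an event or set variables on it.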

That, I’m pretty sure, I don’t want to get into unless I absolutely have to (and preferably only later on down the line). I know virtually no C++ at this point, so at best I’d just be fumbling around in the dark with it. Any idea if Epic or any of the source contributors plan on adding the functionality to render widget data to a UV-mapped texture/material? I’m also hoping someone with C++ experience will eventually add support for a spherical coordinate system, and that the orbital mechanics plugin becomes available in the marketplace.