In both of these games the UI lives on a physical object: the VALORANT Phantom has a mini-map on the 'heartbeat' sensor on the left side of the gun, while the MW2 beta puts its UI on a tablet.
I think they both use render textures to draw the UI onto the asset's texture, but for MW2, how did they make the buttons clickable / let you choose where to place the bombs?
Hey there @ChezyTheMessy! Welcome to the community! So this is a bit of a complex question, and the answer generally depends on how your diegetic menu works. Games like Metro literally have physical actors for their UI. In your examples it is, as you say, basically a render texture that gets edited by game logic and then fed into the object's material (or applied directly to the viewport with some magic).

I recommend using 3D widgets, however. There aren't many resources on the whole process, but some users have posted their handling of creating a texture to apply to the material. The tricky part is making it hit-testable and working out where you clicked. In theory this can be done by using world position in the material, passing in the location your mouse trace hits, and registering clicks there.
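As a very rough sketch of that idea (not the solution linked below), you can trace under the cursor and read back the surface UV of the hit, which lines up with the render texture if your mesh UVs match it. The `GetUIClickPixel` helper and `RenderTargetSize` parameter here are placeholders, and `FindCollisionUV` only works with "Support UV From Hit Results" enabled in Project Settings > Physics:

```cpp
#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"

// Returns true and fills OutPixel with the click position in render-target
// space when the cursor is over the device mesh.
static bool GetUIClickPixel(APlayerController* PC, const FIntPoint& RenderTargetSize, FVector2D& OutPixel)
{
    FHitResult Hit;
    // Trace for complex collision so FindCollisionUV can look up face UVs.
    if (!PC || !PC->GetHitResultUnderCursor(ECC_Visibility, /*bTraceComplex=*/true, Hit))
    {
        return false;
    }

    FVector2D UV;
    if (!UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, UV))
    {
        return false;
    }

    // The mesh UVs line up with the render texture, so UV * size gives the
    // pixel the player clicked; forward this to whatever logic draws the UI.
    OutPixel = FVector2D(UV.X * RenderTargetSize.X, UV.Y * RenderTargetSize.Y);
    return true;
}
```

From there it is up to your game logic to decide which button that pixel falls on and redraw the render target accordingly.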
Another user on the forums has one solution for the UI Material:
However, my recommendation is to try the 3D widgets and use them as you would in a VR project, just applied flat to your object. Then you can trace from the mouse and use their native hit-testing features:
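A minimal sketch of that setup (my own illustration, not taken from any linked resource): the tablet actor carries a UWidgetComponent set to World space holding the UMG widget, and a pawn drives a UWidgetInteractionComponent from the mouse. `AMyPawn` and its input handlers are placeholder names:

```cpp
#include "Components/WidgetInteractionComponent.h"
#include "GameFramework/Pawn.h"

AMyPawn::AMyPawn()
{
    // Placeholder root so the interaction component has something to attach to.
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

    // The interaction component performs the widget hit test for us.
    WidgetInteraction = CreateDefaultSubobject<UWidgetInteractionComponent>(TEXT("WidgetInteraction"));
    WidgetInteraction->SetupAttachment(RootComponent);

    // "Mouse" traces from the cursor, so the flat-on-mesh widget reacts to
    // hovers and clicks like ordinary screen-space UMG.
    WidgetInteraction->InteractionSource = EWidgetInteractionSource::Mouse;
    WidgetInteraction->InteractionDistance = 500.f;
}

// Bind these to your click input so presses and releases reach the 3D widget.
void AMyPawn::OnClickPressed()
{
    WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);
}

void AMyPawn::OnClickReleased()
{
    WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton);
}
```

You will also need the UMG module in your Build.cs for these components to link.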
Disclaimer: Epic Games is not liable for anything that may occur outside of this Unreal Engine domain. Please exercise your best judgment when following links outside of the forums.