What is the appropriate relationship between HUD and Widgets?

In the official documentation, Widget components are created and used by adding them to a Level/Character Blueprint, and they basically act like a HUD component.

I did some tests and I’m quite confused right now. Which of the following is true?

  1. Widgets replace the HUD entirely, or at least they can.
  2. The HUD is a container of Widgets.
  3. The HUD is more appropriate for some things, while Widgets are more appropriate for others; they are complementary. (If this is the case, then what is each more appropriate for?)


HUD is the most basic class for a heads-up display. For very simple games, or during development, it is better than widgets (i.e. UMG).
But the moment you want more than a simple HP bar display, or you need the GUI to scale with different resolutions, you should use UMG.
Generally, unless you want simplicity, you should not use HUD; UMG is much better at almost everything.

Thanks, I’m just going to use UMG then.

As soon as you call “Add to Viewport”, your Widget is part of the HUD your Player Controller is associated with.
That’s the relationship between HUD and Widget: the Widget then becomes a child component of your HUD.

And since your HUD’s main purpose is drawing, you are able to see those children on your screen.

So what you are saying is that I’m not really adding the widget to a character blueprint, but to the character’s HUD.
Therefore the correct way to handle a widget component should be to access the HUD instead of the Widget directly.
Am I supposed to expose some methods on the HUD to be called from the player blueprints, then? Or should I just consider the HUD an abstract interface that widgets “implement”?

First: don’t say “character”, it’s confusing. You always control the PlayerController, either the default or a custom one. Your Character is possessed by the PlayerController, and that is
why your Character reacts to your input.

A HUD is part of the Controller; it is the representation on your monitor. A widget is a piece of data which is also owned by the PlayerController.

You can see that when you call Get HUD (the input pin asks for a controller) and when you create a widget (the input pin also asks for a controller).
The best place for the container of your widget references depends on your game architecture, I think.
Any logic which affects a widget, I implement inside the widget itself, and then call that function from the creator class via the widget reference stored there.

Again, the HUD draws and draws and draws… and so on.

If you make a change inside a widget (e.g. toggle its visibility), the effect will be presented on your screen immediately, because the HUD draws every frame, and if a widget says “don’t draw me”, the HUD will comply :slight_smile:
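The ownership chain described above (PlayerController → HUD → widgets, with the HUD redrawing every frame and skipping hidden widgets) can be sketched in plain C++. These are illustrative stand-in classes, not the real Unreal `APlayerController`/`AHUD`/`UUserWidget` API; all names and members here are assumptions for the sake of the sketch.

```cpp
#include <memory>
#include <string>
#include <vector>

// Illustrative stand-in for UUserWidget: just a name and a visibility flag.
struct Widget {
    std::string name;
    bool visible = true;
};

// Illustrative stand-in for AHUD: it owns the registered widgets and
// "draws" them every frame, skipping any widget that says "don't draw me".
struct Hud {
    std::vector<std::shared_ptr<Widget>> widgets;

    // Called once per frame; returns the names of the widgets actually drawn.
    std::vector<std::string> drawFrame() const {
        std::vector<std::string> drawn;
        for (const auto& w : widgets)
            if (w->visible)
                drawn.push_back(w->name);
        return drawn;
    }
};

// Illustrative stand-in for APlayerController: it owns the HUD, and
// "adding to the viewport" registers the widget with that HUD.
struct PlayerController {
    Hud hud;

    std::shared_ptr<Widget> addToViewport(const std::string& name) {
        auto w = std::make_shared<Widget>();
        w->name = name;
        hud.widgets.push_back(w);
        return w;
    }
};
```

The point of the sketch is the draw loop: toggling `visible` on a widget needs no extra call, because the next `drawFrame()` simply skips it, which mirrors the "the HUD draws and draws and draws" behaviour described above.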

I’m a beginner, but what I ended up doing is just using the HUD as a container for my UMG widgets. When I need to access the widgets, I use Get HUD. If I need to draw something (not widget stuff, since widgets do their own drawing), I do it in the HUD’s Event Receive Draw HUD.

This may not be the “right” way or “best” way since it does entirely depend on how your game is organized, but so far it’s working OK for me.

That was how I planned to go, but I think it still requires some testing to figure out the best strategy.

thanks a lot everyone

Please remember that as soon as a widget is part of the HUD, it will automatically be drawn within Event Receive Draw HUD every frame.

If your HUD draws other primitive stuff, that’s exactly the way you should go (e.g. drawing a crosshair: the crosshair doesn’t need to be a widget, so it can be drawn from the HUD’s side).

The best and the right way is hard to tell, but I want to give you something to think about on your way.

Consider this:

  • You have a Widget which represents your Character’s inventory.
  • You will need to set up a connection between the Widget and the Character, right?
  • To access the Widget you need the HUD, but does a HUD have any relationship with a Character?
  • To build connections between independent classes (Character and Widget are independent from one another in the beginning), you can use interfaces.
  • In most tutorials only a few people use interfaces or dispatchers; you will most likely see them typing Get Character or Get Controller inside a widget. This works sometimes, but even Epic made a video about circular dependencies and the unholy REINST error (which I came across too :)), referring to this behaviour.
  • I thought it would be a good idea to create widgets inside the PlayerController. Why? Because it knows all the needed classes around it: the HUD, the Widgets, the possessed Character with an inventory.
    However, this works well only as long as you don’t load any levels. I thought a Controller would be persistent, and now I am finding out it isn’t. This doesn’t mean I will break the structure I have built so far, but I need to adapt to these circumstances.
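The interface suggestion in the list above can be sketched in plain C++: the widget depends only on an abstract contract, and the character implements it, so neither class references the other directly. The interface and class names here are hypothetical, invented for this sketch; in Unreal you would use a Blueprint Interface or a `UINTERFACE` instead.

```cpp
#include <string>

// Hypothetical contract the widget needs; it knows nothing about Character.
struct IInventoryProvider {
    virtual ~IInventoryProvider() = default;
    virtual int itemCount() const = 0;
};

// The character implements the interface...
struct Character : IInventoryProvider {
    int items = 0;
    int itemCount() const override { return items; }
};

// ...and the widget depends only on the interface, so there is no
// circular Character <-> Widget reference to trigger REINST-style problems.
struct InventoryWidget {
    const IInventoryProvider* provider = nullptr;

    std::string render() const {
        return "Items: " + std::to_string(provider ? provider->itemCount() : 0);
    }
};
```

The design point is the direction of the dependency: the widget can be compiled, tested, and reused against any `IInventoryProvider`, instead of hard-wiring a Get Character call inside it.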

I highly recommend that you draw your game layout (at least in part) on a sheet of paper and think about the best way forward.
I also recommend that you try to keep that structure until you really, really, really know it just can’t work the way you thought :slight_smile:

HUD is a holdover from the old days, but it’s also still used by a lot of people because there’s still some stuff that’s only possible with the primitive Canvas drawing API, which will take us some time to introduce in Slate.

What I’ve been considering doing mirrors what many people have done organically: their HUD becomes another widget host. So, like adding widgets to the viewport, you’ll have some way to add them to a player’s HUD. I would probably rename HUD to UI Host, UI Panel, UI…something, I dunno, still up in the air. Then we add some stuff at the project level, like what HUD class you want to use, and what widget should be created and hosted in the same subframe as the HUD.

This is the direction I wanted to go to solve splitscreen: by hosting widgets in a player’s HUD, you implicitly handle scoping player input, placing it in the right subrect for splitscreen, etc. These are the things I think we’ll start looking at solving after 4.7 for UMG.

So what you are saying is that the HUD is the container of widgets.

So what if I create the HUD with some methods on it to switch between, say, PauseMode, PlayMode, RadarMode, etc.,
and then from the HUD itself I access the PlayerController to find things like current Health,
while the PlayerController only holds a reference to the HUD and only uses the HUD’s methods instead of talking to widgets directly (e.g. by switching HUD modes)?

Would this be a good approach?
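The "HUD as viewcontroller" idea being proposed can be sketched in plain C++: the controller only calls mode-switching methods on the HUD, and the HUD decides which widget set is active. Everything here (the mode names, the `ModeHud` class, representing a widget set as a string) is an illustrative assumption, not Unreal API.

```cpp
#include <map>
#include <string>

// Hypothetical HUD modes, matching the PauseMode/PlayMode/RadarMode idea above.
enum class HudMode { Play, Pause, Radar };

// Sketch of a HUD acting as a viewcontroller: callers switch modes,
// and the HUD maps each mode to the widget set that should be visible.
struct ModeHud {
    HudMode mode = HudMode::Play;

    // One widget set per mode (represented as a string for brevity).
    std::map<HudMode, std::string> widgetSets = {
        {HudMode::Play,  "HealthBar+Crosshair"},
        {HudMode::Pause, "PauseMenu"},
        {HudMode::Radar, "RadarOverlay"},
    };

    // The PlayerController calls this; it never touches widgets directly.
    void setMode(HudMode m) { mode = m; }

    std::string activeWidgets() const { return widgetSets.at(mode); }
};
```

The benefit of this shape is exactly what the question is after: the PlayerController’s knowledge shrinks to "which mode are we in", and every decision about which widgets exist for that mode lives in one place.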

The HUD would become a container of widgets, or would have access to something that manages a set of widgets per player’s screen rect. It currently isn’t, but that’s my idea for it in the future.

I would expect the HUD to be more or less an invisible thing; people using UMG would just know they can add widgets to a PlayerScreen (maybe a good name for it), and make calls like remove widget from player’s screen, add widget to player’s screen, get widget by name on player’s screen, etc. The HUD need not access things like the player controller’s health. That’s something the widgets themselves can do.

Having the HUD be a viewcontroller, managing swapping out the right widgets for a given state, seems good for a game implementation. I think if we’re going to solve it at the engine level it’s larger than that, as a PAUSE state for the game may need to cover the whole screen, even in a split-screen scenario, which would mean adding widgets to the viewport and not the player’s HUD.

seems great, looking forward to future updates then :slight_smile:

Sounds good. How do you imagine things like a Canvas Panel and nodes like ‘Slot as Canvas Slot’ would play into this? Would Canvas Panel be removed from UMG altogether and be something inherent to this new ‘PlayerScreen’?

Also, have you looked further into this since you made this post? Is there any news on progress for the implementation of this PlayerScreen?

You misunderstand. There’s Canvas, and then there’s UMG’s Canvas widget. AHUD uses a UCanvas, which is the old immediate-mode UI rendering technology provided with UE 2, 3, and 4. The UMG Canvas isn’t going anywhere. And yes, Player Screen is a node now: instead of the viewport, you add a widget to a player’s screen.

Nick Darnell,

Back in early 2015 you said the following:

“What I’ve been considering doing mirrors with many people have done organically, their HUD will become another widget host. So like adding widgets to the viewport, you’ll have some way to add them to a player’s HUD.”

This seems like an amazing idea and I am just wondering if any progress was ever made on this. I am trying to use a mixture of HUD Canvas Drawing (old inventory system I wrote before UMG existed) and new UMG windows (because UMG is awesome!). However, it appears that UMG always renders on top of everything else, so there is no way to mix Z-Order with items drawn using the old Canvas system. If there was some way I could manually make the draw calls for each UMG widget from my AInventoryHUD::DrawHUD() method, then I could control the Z order. Anyway, your idea seemed like a good one so I was just wondering if there is an update on it or if there is some other direction you can point me to mix Z orders of UMG and Canvas.


This doesn’t help, but I thought I’d add that UMG widgets do NOT render over post-process effects, or at least they don’t by default.

Post process effects are affecting UMG widgets in my project. I don’t want them to, so I have to disable them.

@Nick Darnell Four years later, the UE4 version is now 4.22. Is this refactor going to happen?