Hello, I am looking for a way to make my game's main layout look something like these old games:
So I want the main window to be smaller than fullscreen, let's say fixed to 65% of the screen, and the rest of the area covered by a UI/game subsystem that is a big part of the game too but doesn't require expensive rendering.
If possible, I would also like to render one skeletal mesh on top of the UI, a bit like the character equipment avatar in the Ultima Underworld screenshot.
If that complicates things a lot or requires rendering twice, I could skip it and just make prerendered video files of the avatar character for the UI. A realtime mesh would be ideal though.
Is this achievable via Blueprints and the UI tools, or does it require complex programming?
Does anyone have useful starting tips or tutorials I should look at?
Look into rendertargets. Say you have the camera render into a rendertarget that you then display on a quad in the center of the screen. That might help. As for getting the main camera view to not display on-screen, I have no idea.
What you are talking about seems rather contrary to the way the vast majority of current games (and game engines) do it, so it may require some messing around in the engine code.
I'll try the render targets… it feels like an awkward way though.
The capability itself is in the engine: the way you can move and scale the viewport inside the editor is exactly what I want, if only I could set it the same way in UMG.
The main camera's aspect ratio can also be easily adjusted, scaling the image vertically. Doing the same horizontally isn't adjustable in the camera settings though.
It still sounds kind of weird. Say I wanted to make a simulator game with 1/3 of the screen covered by a large menu. I would put a large UMG menu on top of the view, but the engine would still have to render/calculate everything under it? Seems kind of wasteful.
I know it's the modern trend for FPS/TPS games to have as little UI as possible, like in Dead Space, but it's not the only way games are made.
Hmmm… I tried setting up a Scene Capture Component 2D instead of a normal camera. Then I set up the UI so that the render target is the size and position I want it to be.
The positioning works wonderfully, but other problems occur:
1) The render target ignores the post processing I have.
2) The normal rendering still occurs underneath my UI, meaning a huge FPS loss due to rendering twice. It's more of a picture on a picture than a repositioned picture.
3) The render target display ignores my vertical look input for some reason! My character can't look up and down, only turn left and right and move as usual.
So I made a temporary test UI. It still lacks most of the fundamental components that make such a UI important for me,
but at least it demonstrates the style I am aiming for a bit.
It works now too, but I would LOVE to stop the game from rendering everything under the UI, as it won't be visible anyway.
The framerate is better when I have a window of the same size open than in fullscreen, where the UI blocks everything.
For what you want to achieve, you are going to need to dive into the engine source. To read the source, just open a filesystem browser to your UE4 install directory and open the [YourVersionHere]/Source/ folder. If you want to modify the engine, you will need to compile it yourself. UE's GitHub page has instructions for compiling the engine from source.
As far as usage goes, I would say 99.99% of games in Unreal's primary market (AAA games, etc.) will be rendering to the entire window, so it may have been seen as a waste (or not even considered at all) to create a system that allows the user to pick and choose what part of the window they want to render to. Such efforts might be better spent on UMG, for example. And the games you are showing are using custom engines (I think) anyway. So that's my rationale for why the engine doesn't support rendering to a specific portion of the game window out of the box.
Yes, I understand what you are saying. Yet, at the same time, it would still be a simple task to have the option.
At the moment you can adjust the aspect ratio of the camera, and constraining it will stop the rendering in the area outside the camera view.
Set the aspect ratio from 1.777 to 2.35 and you will see a framerate increase. Having the same kind of setting to reduce the horizontal length would also give a framerate increase.
You can currently also put UI layouts on the black areas outside the viewport rendering, doing exactly what I want without the expensive cost.
Adding an offset to the above settings, to position the view anywhere on the screen, would not be complicated… except for me, as it requires source coding :P.
Edit: as for the 99.99% part you said, I very strongly disagree. That's an attitude that only makes 99.99% of all games into a 99.99% similar generic mass! And by offering UE4 first for $20 and then for free, Epic shows it is not only interested in AAA markets, but wants its excellent engine to be used for all sorts of creations and creators.
I agree; I was simply speaking about the way current (and semi-past) market trends may or may not have influenced the decision not to implement such a feature. Time is a limited resource after all, and the time spent implementing a feature that currently only a handful of people will use is time that could be used to fix (or implement) features that nearly everyone needs and will use. No argument, just my $0.02.
A cursory look at the source reveals that you may want to start by taking a look at the Source/Editor/LevelEditor folder. The level editor is the window that handles displaying the world in whatever configuration you choose.
Ah yes of course. I get what you mean by the priorities companies and engines must have.
Thank you very much for the help! I will take a look at that - or will once I get some additional help in a few months.
I made a rather silly temporary solution. First I set the camera aspect ratio as narrow as I can, so it renders the fewest pixels possible within the margins of my UI. Then I created a simple unlit static mesh, a box with a window cut out, that I constrained to my first-person character's camera so that the box mesh is hidden beneath the UI but occludes the areas underneath it from rendering. It's quite a bubblegum solution, which I assume will cause problems at some point, but hey, it already gave me +8-12 frames ^^!
For the avatar I wanted to implement in the UI as a mesh, I made a small area separated from the main area, where I use a scene capture component and then display it on the UI via a render target. The lack of post processing doesn't hurt in this case, as it's just one mesh I can design to fit the look by hand. Setting the light on it to match the light at your character's position is a bit problematic though… I wonder if I can somehow capture the light at the player's location and cast it onto a lighting volume/lightmass where the "avatar" is… hmmm…
You are quite welcome. As for displaying the character, there was some talk bandied about around 4.5 or so on the subject of implementing a Viewport widget class that essentially held a custom UWorld, for things like inventory or character preview screens. As far as I know, this was never implemented, but it does provide some ideas.
Again, more complicated engine stuff, but the Static Mesh Editor (and/or Persona, the skeletal mesh editor) might provide some insight. Each creates a separate UWorld to hold the preview asset. Both should be found in the Source/Editor folder.
Do you actually have any kind of "data" that tells you it is really that wasteful?
I would not worry about such a thing in such an early stage of a project.
Unless you are doing full-blown AAA-quality graphics, you should be fine even if you render "behind the UI".
That said - I love that you want to go back to old-school UI features, they were so awesome!
No other data than keeping "show FPS" in the editor turned on. I have set up various testing spots in my maps where I can test performance under different settings, under the same conditions.
In those spots I could see an increase of 8 to 12 frames, depending on the spot, with the bubblegum solutions I did.
I am aiming at mid-spec computers, so I want to keep the costs down even if I don't have AAA graphics :).
It's not actually thaaat early a stage. Polishing is needed and plenty of stuff is still missing, but I've mostly been creating content and left basic stuff like this to the later stages; now I realize I have to figure these things out soon, before I set everything in stone…
I am not going completely old school though; I want a mix of both worlds. The paper book looks static in pictures, but it moves like a normal first-person RPG. When you hit an enemy with a critical hit, blood splashes "out" of the screen onto the book pages; all the texts are "written" on the pages with a slide-fade effect and sounds; moving to a new chapter makes the pages of the book turn; and the avatar I am still adding is a semi-interactive mesh too, which displays status instead of HP bars etc. It's a big UI, but it still has dynamically changing elements in it.
Argh. And now I have a problem: I cannot communicate between a UI Widget Blueprint and the Level Blueprint.
The reason I need to send variables from the UI Widget BP to the Level BP is that my levels are loaded with Matinee actors, and only the Level BP can play Matinees… but Widget Blueprints are not readable in the Level BP the way class blueprints are :/. So I cannot make an object variable targeting the Level BP, nor can I open the Widget BP's Event Dispatchers in the Level BP, nor can I find help from Blueprint instances either.
I would like to keep the communication as simple and as easy to edit and extend as possible, as my UI has plenty of variables and my levels are full of Matinees I want to play via UI commands.
As a current solution I was able to create custom events in the Level BP, triggered by the UI Widget BP via an "execute command" node. I am not sure this is the best way to go at all though.
Another very direct way was to create "button meshes" on my player character BP that are not visible but trigger the actions and send the variables to the Level BP when pressed, but that feels kind of silly and prone to bugs. Or hmm, maybe I could pass the variables to the Level BP via my character BP or a dummy BP in between… that sounds like an awful lot of stops to pass a single variable through though.
The way I set it up in my project was to have the Level BP spawn the widget, and then bind the necessary event dispatchers.
If you want to pass variables, I would suggest subclassing the HUD class in Blueprints, and spawning the widget from there. At that point, you should be able to set variables on the HUD from the Widget BP and call Level Script events from the HUD BP.