Oh this is great stuff - do keep blogging!
Information on this is thin on the ground as far as I know. What I do know is that it depends on whether you’re running in PIE or simulation mode. In PIE mode, when you enable the debugger it’ll pick whichever controller is closest to you (I’m not sure if that’s cursor position or world position). In simulation mode, it’ll use whichever one you click on, which is why it’s preferable to be in simulation.
I haven’t yet had a chance to look at expanding it for your own purposes, but it’s certainly something I’d like to get to in the future. Please do let me know if you get anywhere with it!
Reminds me of Prince Of Persia, UE4 style
The game looks amazing - love your pixel art.
Such a good looking game man, keep it up!!
Ding ding! 10 points. PoP is an inspiration and such a great series, glad it comes across in some form.
Aww gosh, thanks a lot. Nice words like this are fuel for creative types. Us pixel pushers thank you for your kind sentiments.
UI elements and never ending polish
With us now looking to build a nice tutorial area for preliminary outside play testing, I decided to have a go at blueprinting out dynamic control GUI elements that fade up and down based on player proximity, and that can switch out sprites depending on the platform being used (such as Xbox, PS4 or PC). Now, I’m a pixel pusher by trade and logic always makes me cry, but I was quite surprised how easy it was for me to thump this out, and I was quite pleased with the results.
I’m only so far along with it, but the platform switching can be done within the blueprint via a simple variable, so I’m assuming the code types in our team will be able to call that from a menu/options setup when the player decides what control system they are using, and the game will adapt automatically. It will also eventually need to handle redefining the controls, so it displays the appropriate button/key for the set action, but that’s beyond the scope of what we need for the initial demo (and perhaps my abilities)!
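For the code types in the audience, the switching logic boils down to something like this minimal C++ sketch. The enum, function and sprite names here are all made up for illustration, not our actual assets or blueprint variables:

```cpp
#include <algorithm>
#include <string>

// Hypothetical platform enum; in practice this would be set from a menu option.
enum class EControlPlatform { PC, Xbox, PS4 };

// Pick the prompt sprite for a given action on the current platform.
// Names are illustrative placeholders.
std::string GlyphForJump(EControlPlatform Platform) {
    switch (Platform) {
        case EControlPlatform::Xbox: return "Glyph_Xbox_A";
        case EControlPlatform::PS4:  return "Glyph_PS4_Cross";
        default:                     return "Glyph_Key_Space";
    }
}

// Fade the prompt up as the player approaches: fully opaque inside
// InnerRadius, invisible beyond OuterRadius, linear falloff in between.
float PromptAlpha(float DistanceToPlayer, float InnerRadius, float OuterRadius) {
    if (DistanceToPlayer <= InnerRadius) return 1.0f;
    if (DistanceToPlayer >= OuterRadius) return 0.0f;
    return (OuterRadius - DistanceToPlayer) / (OuterRadius - InnerRadius);
}
```

In blueprint terms that’s just a Select on an enum plus a mapped distance driving the widget’s opacity each tick.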
In other developments, I have always been a bit miffed that the dawn and sunset images worked so well and were far more interesting to look at than the night scene (even though they are background visuals and shouldn’t really be distracting the player); the night came off as far more flat and dull. With this in mind, I went back to it and decided to give it a new lick of paint. I searched around the internet for ideas on how to make night scenes a bit more interesting and dynamic, and looked to the sunset/dawn scenes already done as well. The strong light source is what really let me make parts of those scenes pop, so I figured: it’s a clear enough night, why not have the full moon bounce a lot more light down to the mountainside below and pick up more detail in the rivers and city in the background.
It didn’t take too long to switch it up, and I’m fairly happy with the results; it’s another step in the right direction. For the most part the changes from the old version are only subtle, and it may be touched up again in the future, but for now we have plenty more to be getting on with. As usual, any criticism, questions or feedback are welcome.
Looking good! Keep up the amazing work!
AI Update #2
Creating a workable perception system for a 2D game in Unreal has proved a much bigger challenge than we had expected. However, despite the initial problems, we’ve managed to come up with something perfect for our needs. One of our coders, Rex, did a great job creating a bespoke sense configuration for 2D.
What we now have is a collection of several 2D cones, which give a much better representation of the AI’s sight. We’ve also offset the position of the visibility raycast to go from eye to eye, rather than the centre of each character.
These two changes have made a massive difference to how it functions, and give us much more room to add interesting gameplay features. There’s plenty of tweaking ahead but by and large the AI sees you when you think it should, and hiding from it is also much more intuitive than before.
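For anyone curious, the basic maths of a single 2D view-cone check can be sketched like this in plain C++. This is a simplified illustration with made-up names, not Rex’s actual sense configuration:

```cpp
#include <cmath>

// A 2D view cone: facing direction, half-angle and range (values illustrative).
struct ViewCone2D {
    float FacingRadians;
    float HalfAngleRadians;
    float Range;
};

const float kPi = 3.14159265f;

// True if the offset (dx, dy) from the AI's eye to the target's eye falls
// inside the cone. Eye-to-eye, not centre-to-centre, as described above.
bool IsInCone(const ViewCone2D& Cone, float dx, float dy) {
    const float dist = std::sqrt(dx * dx + dy * dy);
    if (dist == 0.0f || dist > Cone.Range) return false;
    float delta = std::atan2(dy, dx) - Cone.FacingRadians;
    while (delta > kPi)  delta -= 2.0f * kPi;   // wrap to [-pi, pi]
    while (delta < -kPi) delta += 2.0f * kPi;
    return std::fabs(delta) <= Cone.HalfAngleRadians;
}
```

Stacking several of these cones, each with its own angle and range, gives the layered sight representation shown in the screenshots.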
Speaking of tweaking these values, Rex also implemented a new debug display to replace Epic’s own gameplay debugger for our purposes. It easily lets us know which cone(s) we are in, and whether the AI is able to successfully raycast to the player.
Unreal Engine has a really powerful tool known as the ‘Environment Query System’. This system allows you to ask questions about the environment around you and filter the answers down into usable bits of data. Using this hugely powerful tool for a 2D game is obviously overkill, but the key advantage is that queries are very easy to create and test using just blueprint and the in-game editor. My aim wherever possible is for everything to remain completely readable by anyone on the team, so they can make their own improvements and suggestions.
At the moment I’m only implementing a few simple tests, for example: ‘find the nearest waypoint I can successfully plot a path to’ and ‘pick a random spot near to where I last saw the player’. Here’s an example of picking a nearby valid point (the blue points are discarded as the AI can’t reach them):
I’m looking forward to expanding these in the future using a few simple tricks to help the AI make smarter decisions without cheating too much. For example, when I create a query to pick a random spot near where the AI last saw the player, I can weight the results towards where the player actually is (even if they are hidden!).
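A rough sketch of that kind of query, written outside Unreal’s actual EQS API: discard unreachable candidates, then score the rest by proximity to where the player really is. All names are illustrative, and for brevity this takes the top-scoring point rather than a weighted random pick as a real query would:

```cpp
#include <cmath>
#include <functional>
#include <vector>

struct Point2D { float x, y; };

// EQS-like query sketch: filter unreachable candidates (the 'blue points'
// get discarded), then score survivors by closeness to the player's true
// position so searches drift the right way without beelining.
int PickSearchPoint(const std::vector<Point2D>& Candidates,
                    const std::function<bool(const Point2D&)>& IsReachable,
                    const Point2D& TruePlayerPos) {
    int best = -1;
    float bestScore = -1.0f;
    for (int i = 0; i < static_cast<int>(Candidates.size()); ++i) {
        if (!IsReachable(Candidates[i])) continue;
        const float dx = Candidates[i].x - TruePlayerPos.x;
        const float dy = Candidates[i].y - TruePlayerPos.y;
        const float score = 1.0f / (1.0f + std::sqrt(dx * dx + dy * dy));
        if (score > bestScore) { bestScore = score; best = i; }
    }
    return best; // index of the winner, or -1 if nothing was reachable
}
```

In the real system the reachability test is a pathfinding query and the scoring is an EQS test, but the filter-then-score shape is the same.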
Last Known Position
Another important aspect of our perception system is the concept of ‘last known position’, which I’ll refer to as LKP from now on to preserve my sanity!
Unreal’s perception system has its own concept of LKP, but we aren’t using it just yet. My simplified version positions an object whenever the player leaves the view cones. This position is then used as a start point for a search, should the AI reach the point and still not get a visual on the player.
Having this LKP object also allows me to deploy another classic stealth AI cheat which I like to refer to as ‘6th sense’.
Imagine a situation where I pass through an AI’s view cones heading in a direction it can’t see. How do I make the AI seem smart and know which way the player has gone? Sure, I could make it super complicated by using things like dead reckoning combined with multiple EQS queries to decide which cover the player is likely to be in. That’s the sort of thing you’d find in Crysis or Halo, and as such it’s somewhat beyond our scope as a 2D game.
Instead, the LKP is updated for a short time (~0.5 seconds) after the AI has technically lost sight of them. From the player’s perspective, this usually just looks like intuition, and would only look like cheating if the time is too long. As with most things in life, this is best explained with a gif.
The green tick in the cross hair is my LKP. See how it continues to update even after the player character leaves the view cones.
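In code the grace-window trick is tiny. Here’s a sketch; the 0.5 second value and all names are placeholders rather than our tuned setup:

```cpp
// Sketch of the '6th sense' trick: keep updating the last known position
// (LKP) for a short grace window after sight is actually lost.
struct LastKnownPosition {
    float x = 0.0f, y = 0.0f;
    float timeSinceSightLost = 0.0f;
    // ~0.5s reads as intuition; much longer starts to look like cheating.
    static constexpr float GraceSeconds = 0.5f;

    // Call every frame with the player's true position.
    void Tick(float deltaSeconds, bool hasLineOfSight, float playerX, float playerY) {
        if (hasLineOfSight) {
            timeSinceSightLost = 0.0f;
            x = playerX; y = playerY;
        } else {
            timeSinceSightLost += deltaSeconds;
            if (timeSinceSightLost <= GraceSeconds) {
                x = playerX; y = playerY;  // still tracking: the '6th sense'
            }
            // Past the grace window the LKP freezes and becomes the
            // start point for a search.
        }
    }
};
```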
Suspicion

Suspicion is what we use to determine what the AI thinks is worth investigating, and later chasing after. The rate at which suspicion increases is determined by whichever cone the player character is currently in. If they are in more than one cone, the cone with the higher rate of suspicion is used.
Currently we have 3 cones:
Close Cone - Instant max suspicion. If the AI sees you here, it will chase immediately.
Mid Cone - Average level increase.
Far Cone - Small increase.
Note that this suspicion system doesn’t currently take into account how well lit the player is. I’ll try adding a modifier based on that during the next pass, as right now I feel it’d just muddy the waters while we test out the basics.
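Boiled down, the cone/rate logic looks something like this sketch. The rates here are made-up numbers, not our tuned values, and suspicion runs from 0 to a max of 1:

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// Illustrative rates per cone (suspicion gained per second).
const float CloseRate = std::numeric_limits<float>::infinity(); // instant max: chase on sight
const float MidRate   = 0.5f;
const float FarRate   = 0.1f;

// If the player is in several cones at once, the highest rate wins.
float SuspicionRate(const std::vector<float>& RatesOfOccupiedCones) {
    if (RatesOfOccupiedCones.empty()) return 0.0f;
    return *std::max_element(RatesOfOccupiedCones.begin(),
                             RatesOfOccupiedCones.end());
}

// One frame of suspicion build-up, clamped at the maximum of 1.
float TickSuspicion(float current, float rate, float deltaSeconds) {
    return std::min(1.0f, current + rate * deltaSeconds);
}
```

Using infinity for the close cone means a single tick saturates suspicion, which matches the “instant chase” behaviour without a special case.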
I also added some code to cover colliding with the AI. Spoilers: they don’t like it very much.
All these new features mean there’s a lot more going on in the Behavior Tree now. I think it’s time to start splitting these into separate behaviors…
There are a few other issues that have started to appear now that things are a bit more complicated. I have quite a few areas where I quite harshly abort sequences when things change (for example, the player leaving the AI’s sight), and these now cause visible hitching as the AI flip-flops between two branches. I’ll need to take a step back, rethink some of these longer sequences and find some more graceful break points.
So, that’s all for now! What’s next?
- Hooking up all the animations
- Moving the view cone around (looking up and down, moving up and down slopes etc)
- Adding some placeholder sounds
- Lots of tweaking of the ‘magic numbers’ until it feels good
Thanks for reading! I’d be happy to hear any feedback if people have any advice or thoughts to offer here.
I would never have known it was UE4, it looks like it was made using something that specializes in this sort of thing. Great work!
All the yes. So reminiscent of Prince of Persia! I hold that series close to my heart. I am so grateful you’re taking a leap into creating a high-quality, stealth “Metroid/PoP” experience! All in Unreal Engine, too. This is very commendable.
Deep, acrobatic control system - Wall running, grappling, swinging and more.
Everything will distort, everything will be unquantifiable
We had been using refractive elements in the game from pretty much the start of the project, for the easy “+10 points” that refraction/distortion gives any game. The only problem was that, whilst the effects rendered great in the editor (using a perspective-based camera), when we moved to the orthographic camera in gameplay any effects based on refraction were simply not rendered.
Refractive Particles In Editor | Refractive Particles In Game
We carried on this way for a while, knowing of the problem and assuming that the orthographic camera might get an update at some point to support more of the effects Unreal is capable of rendering, but after a while it became clear a different approach might be needed. We don’t actually NEED physically correct refraction; we just want the cool effect of heat haze or water visually appearing to distort the world behind it, so refraction is overkill for our needs anyway. We got to thinking that all we really need is to distort the final scene texture and apply it to the scene within bounds of our choosing, either via a particle effect or sprites. I started looking through post process effects and how they are applied to the screen (the Unreal Wiki came in very useful here), looking at the custom depth pass to mask out areas of the scene. This quickly felt like too much for the effect we wanted, a bit overkill, and I felt there must be a simpler way to do it. There was!
By using the ScreenPosition node and distorting the scene texture via noise maps that are panned and rotated on top of each other at mixed scales, we get a distorted haze effect across the screen. We now only use this material on sprites or particles where we need the effect to happen, and we quickly and easily get the intended effect with an orthographic camera in game, bringing our fires and watery effects to life and making the world feel more dynamic. The material I made up looks something like this, if anyone wants to achieve something similar in their 2D games; the premise can be carried across to any material type: particle, sprite or post process.
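For the node-averse, here’s a rough CPU-side sketch of the same idea in C++: offset the screen UV with layered, panned noise before sampling the scene. The noise function is a cheap stand-in for the material’s noise textures (not Unreal’s actual nodes), and every number here is illustrative:

```cpp
#include <cmath>

struct UV { float u, v; };

float Fract(float x) { return x - std::floor(x); }

// Cheap hash-style stand-in for a tiling noise texture lookup.
float FakeNoise(float u, float v) {
    return Fract(std::sin(u * 12.9898f + v * 78.233f) * 43758.5453f);
}

// Two noise layers panned in different directions at different scales
// (like panner nodes), recentred so the image wobbles rather than drifts.
UV DistortScreenUV(UV screenUV, float time, float strength) {
    const float n1 = FakeNoise(screenUV.u * 4.0f + time * 0.10f, screenUV.v * 4.0f);
    const float n2 = FakeNoise(screenUV.u * 9.0f, screenUV.v * 9.0f - time * 0.07f);
    const float offset = ((n1 + n2) * 0.5f - 0.5f) * strength;
    return UV{screenUV.u + offset, screenUV.v + offset};
}
```

In the material the distorted UV then feeds a scene texture sample, and `strength` is what you’d mask by the sprite or particle alpha so the haze only appears where you want it.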
With this hurdle overcome, we can now have the fire haze we wanted, along with distortion when walking behind waterfalls or pools of water within the game. These effects are all ongoing works in progress (and some are quite subtle in these small potato GIFs), but I figured the post might be useful to budding 2D Unreal developers. As usual, any criticism or feedback is appreciated.
This is looking phenomenal! Can’t wait to see more! Any idea of a release window?
Thanks for the compliments Brian (and everyone else!). Unfortunately we cannot comment on a release date yet, as it is still quite early in development, and there are a lot of unknowns.
…of course, you can follow our social media channels and sign up to our newsletter to stay notified on developments and news updates
(in my signature - newsletter subscription at the top of our website home page)
Thank you for sharing how you set up your refraction shader! It works so well in your environment, giving that nice finishing touch. (I’m especially in love with the last .gif of the underwater section).
Yeah, once the full set of effects is sorted and we have got the surface to react and ripple on entry/exit, it should be very cool, we hope. I’m sure I’ll be throwing up a GIF of that when it’s done.
I’m going to hold you to that!
I can’t believe you made this with Unreal. Really beautiful pixel art, but I’m impressed on the tech side, the way you managed to use the engine. Nice!
Great work guys! I’m super excited for this. It takes me back to when I used to look at screen for Super Nintendo games when I was a kid.