Any info on the future of Blueprints and AI integration? Blueprint Visual Scripting in Unreal is an amazing tool but hasn’t been updated much since the UE5 era. Anybody know if there is a future for UE Blueprints, or is it going to be left behind in the AI arms race?
I’m moving to Unity if it does…
Why? I use ChatGPT alongside Unreal to develop, and it’s astounding how much it understands my development goals. Unreal, on the other hand (my primary tool), has no understanding of what I want it to do. I have to painstakingly drudge through every tiny implementation. If ChatGPT were used as a controller/agent within Unreal (or Unreal had its own tool like this), that would be a 10x development speed-up, don’t you think?
So, I just asked ChatGPT how to make a turn-based RPG with UE 5.1, and it just outlined the steps for making one without doing any of the actual work.
I can’t see how people can defend this stuff
Right, thanks for highlighting the problem. GPT can understand your game development goals but can’t carry out any of the work required to materialize them. Unreal is a tool for humans to work towards their development goals, but it has no understanding of the goals it’s processing, which makes everything more time-consuming than it needs to be. So one way or another we need an AI agent that understands the goals I am seeking to achieve and can deploy actions within the engine to speed up (and reduce the costs of) game development.
Well, when I asked it to provide an example, I got this, which is somewhat accurate:
How to make an RPG
Sure! Here’s a more detailed guide to creating a turn-based RPG in Unreal Engine 5.1:
Step 1: Set Up Your Project
- Install Unreal Engine 5.1: Download from the Epic Games Launcher.
- Create a New Project:
  - Open Unreal Engine and select Games.
  - Choose Blank or Third-Person template.
  - Select Blueprint as the project type.
  - Set the project location and create the project.
Step 2: Design Your Game Mechanics
- Outline Game Concept:
  - Define the story, characters, and gameplay mechanics.
  - Create a design document to keep track of ideas.
- Turn-Based System:
  - Decide how turns will work: initiative rolls, player and enemy turns.
  - Plan the sequence of actions during combat.
Step 3: Create Character Blueprints
- Player Character Blueprint:
  - Navigate to the Content Browser.
  - Right-click, go to Blueprint Class, and select Character.
  - Name it (e.g., BP_PlayerCharacter).
  - Add variables for health, attack power, defense, etc.
- Enemy Character Blueprint:
  - Repeat the steps to create an enemy blueprint (e.g., BP_EnemyCharacter).
  - Customize stats for different enemy types.
Step 4: Implement Turn Order Logic
- Create Turn Manager:
  - Create a new Blueprint (e.g., BP_TurnManager).
  - Add an array variable to hold active characters (players and enemies).
  - Implement functions to manage turns:
    - NextTurn: Move to the next character in the array.
    - EndTurn: Reset the turn for the current character.
- Handle Player Input:
  - In the Player Character blueprint, set up input mappings in Project Settings.
  - Use the Input Action nodes in the Event Graph to handle actions.
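The turn-manager logic described above can be sketched in plain, engine-agnostic C++. This is only a model of the idea, not Unreal API: in a real project this would be an Actor holding a `TArray` of character pointers, while here `std::vector` and a simple struct stand in so the cycling logic is easy to follow.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal sketch of the BP_TurnManager idea: an array of active
// combatants plus NextTurn/EndTurn functions, as in the outline.
struct Combatant {
    std::string Name;
    bool bTurnDone = false;
};

class TurnManager {
public:
    void AddCombatant(const std::string& Name) { Active.push_back({Name}); }

    // NextTurn: advance to the next character in the array,
    // wrapping back to the start when the end is reached.
    const Combatant& NextTurn() {
        Index = (Index + 1) % Active.size();
        return Active[Index];
    }

    // EndTurn: reset the turn flag for the current character.
    void EndTurn() { Active[Index].bTurnDone = false; }

    const Combatant& Current() const { return Active[Index]; }

private:
    std::vector<Combatant> Active;
    std::size_t Index = 0;
};
```

In Blueprint terms, `Index` would be an integer variable on the turn manager and the modulo wrap would be a Branch node checking against the array length.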
Step 5: Combat System
- Action Selection:
  - Create a UI widget (e.g., WBP_ActionMenu) for selecting actions.
  - Use buttons for Attack, Defend, etc. Link button clicks to specific actions.
- Damage Calculation:
  - In character blueprints, create a function (e.g., CalculateDamage) to compute damage based on stats.
  - Subtract damage from the target’s health and update the health bar.
- Animations:
  - Import animations for attacks and abilities.
  - Create an Animation Blueprint for your characters.
  - Use Montage for action sequences, triggering them during combat.
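The damage step above leaves the actual formula open, so here is a minimal sketch assuming a simple "attack minus defense, never below zero" rule. The `Stats` struct and the formula are illustrative assumptions, not anything prescribed by the outline or by Unreal.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical stat block matching the variables suggested in Step 3.
struct Stats {
    int Health;
    int AttackPower;
    int Defense;
};

// CalculateDamage: a simple placeholder formula; real games usually
// layer on randomness, crits, elemental modifiers, etc.
int CalculateDamage(const Stats& Attacker, const Stats& Target) {
    return std::max(0, Attacker.AttackPower - Target.Defense);
}

// Apply the damage and clamp health at zero, since a health-bar
// binding expects a non-negative value.
void ApplyDamage(const Stats& Attacker, Stats& Target) {
    Target.Health = std::max(0, Target.Health - CalculateDamage(Attacker, Target));
}
```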
Step 6: User Interface (UI)
- Health Bars:
  - Use a Progress Bar widget to display health.
  - Bind the progress bar’s value to the health variable in the character blueprint.
- Action Menu:
  - Design the action menu using a widget.
  - Use Event Dispatchers to communicate selected actions to the turn manager.
- Turn Indicator:
  - Create a simple text or graphic element to show whose turn it is.
  - Update this element within the NextTurn function.
Step 7: Level Design
- Create a Battle Map:
  - Use the Landscape tool or place static meshes for a battle arena.
  - Ensure the map is designed to accommodate turn-based mechanics.
- Set Up Spawn Points:
  - Create specific spawn points for players and enemies.
  - Use Box Trigger or Spawn Actor nodes to spawn characters at the start of combat.
Step 8: Testing and Iteration
- Playtesting:
  - Regularly test gameplay to ensure mechanics work as intended.
  - Look for bugs and issues in the combat system.
- Feedback Loop:
  - Gather input from testers on gameplay balance and mechanics.
  - Iterate based on feedback to improve the game experience.
Step 9: Polishing
- Sound Effects and Music:
  - Import audio files and use Sound Cue for various effects (attack sounds, ambient music).
  - Trigger sounds during actions in the combat system.
- Visual Effects:
  - Use Particle Systems for attack effects (e.g., magic spells, physical attacks).
  - Implement screen shake or other effects to enhance combat feedback.
Step 10: Packaging and Deployment
- Optimize:
  - Use Profiling tools to check for performance issues.
  - Optimize assets, reduce draw calls, and improve overall performance.
- Build Your Game:
  - Go to File > Package Project and select your target platform.
  - Follow the prompts to create a standalone build of your game.
Additional Resources
- Unreal Engine Documentation: Explore detailed guides on blueprints, UI, and gameplay mechanics.
- YouTube Tutorials: Search for step-by-step tutorials specific to turn-based RPGs in Unreal.
- Forums and Community: Engage with the Unreal Engine community for advice and troubleshooting.
By following these detailed steps, you can create a solid foundation for your turn-based RPG in Unreal Engine 5.1. Good luck with your development!
This was in response to me asking about handling the initiative roll. It literally wants to rip off a board game:
Implementing an initiative roll in a turn-based RPG using Unreal Engine 5.1 involves a few steps within a Blueprint. Here’s a detailed guide on how to set this up:
Step 1: Create Initiative Variables
- Open Your Character Blueprint (e.g., BP_PlayerCharacter or BP_EnemyCharacter).
- Add Variables:
  - Create an integer variable named InitiativeRoll.
  - Create an integer variable named InitiativeModifier (if you have any modifiers that affect the roll).
Step 2: Set Up the Initiative Roll Logic
- Create a Function for Rolling Initiative:
  - In the Blueprint, create a new function named RollInitiative.
- Implement the Roll:
  - Use the Random Integer in Range node to simulate rolling a die (e.g., a d20):
    - Set the range to 1-20.
    - Add the InitiativeModifier to the random roll.
    - Set the result to the InitiativeRoll variable.
  - Here’s how the nodes should look:
    - Random Integer in Range (1 to 20) → Add (with InitiativeModifier) → Set InitiativeRoll.
- Call the Roll Function:
  - You might call RollInitiative at the start of combat, or whenever you need to determine initiative.
  - You can trigger this function in the Begin Play event or through a specific event (like a button press).
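The RollInitiative node chain above maps directly onto a few lines of C++. This is a hedged, engine-agnostic sketch: `std::mt19937` stands in for Unreal's random stream, and the `Character` struct only holds the two integer variables the guide asks for.

```cpp
#include <cassert>
#include <random>

// Sketch of RollInitiative: a d20 roll plus InitiativeModifier,
// mirroring "Random Integer in Range (1-20) → Add → Set InitiativeRoll".
struct Character {
    int InitiativeModifier = 0;
    int InitiativeRoll = 0;

    void RollInitiative(std::mt19937& Rng) {
        std::uniform_int_distribution<int> D20(1, 20);  // inclusive 1..20
        InitiativeRoll = D20(Rng) + InitiativeModifier;
    }
};
```

With a +3 modifier the result always lands in 4..23, which is the invariant worth asserting in a test.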
Step 3: Store Initiative Rolls in the Turn Manager
- Modify Your Turn Manager Blueprint (e.g., BP_TurnManager):
  - Add an array variable named InitiativeOrder to store characters sorted by their initiative rolls.
- Create a Function to Sort Initiative:
  - Create a new function named SortInitiative.
  - Loop through all characters (players and enemies) and call their RollInitiative function.
  - Store each character’s InitiativeRoll in the InitiativeOrder array.
  - Example steps:
    - Use a ForEach Loop to iterate through your active character array.
    - Call RollInitiative for each character.
    - After rolling, add them to the InitiativeOrder array along with their reference.
- Sort the Array:
  - Use the Sort node to sort the InitiativeOrder array based on the InitiativeRoll values.
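A sketch of SortInitiative in engine-agnostic C++: roll for every active character, then sort the order array so the highest roll acts first. (Worth noting: stock Blueprints don't ship a generic Sort node for arrays of object references, so in practice this step is often done in C++ or with a small hand-rolled sorting loop; `std::sort` models it here, and all type names are illustrative.)

```cpp
#include <algorithm>
#include <cassert>
#include <random>
#include <vector>

// Illustrative character with the two variables from Step 1.
struct Character {
    int InitiativeModifier = 0;
    int InitiativeRoll = 0;
};

// SortInitiative: roll a d20 + modifier for each character, collect
// references into InitiativeOrder, then sort highest roll first.
std::vector<Character*> SortInitiative(std::vector<Character>& Active,
                                       std::mt19937& Rng) {
    std::uniform_int_distribution<int> D20(1, 20);
    std::vector<Character*> InitiativeOrder;
    for (Character& C : Active) {
        C.InitiativeRoll = D20(Rng) + C.InitiativeModifier;  // RollInitiative
        InitiativeOrder.push_back(&C);  // keep the character's reference
    }
    std::sort(InitiativeOrder.begin(), InitiativeOrder.end(),
              [](const Character* A, const Character* B) {
                  return A->InitiativeRoll > B->InitiativeRoll;  // highest first
              });
    return InitiativeOrder;
}
```

NextTurn then just walks this returned array by index, as described in Step 4 below.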
Step 4: Execute Turns Based on Initiative
- Turn Management:
  - In your NextTurn function, iterate through the InitiativeOrder array to determine whose turn it is.
  - Use an index to track the current character’s turn.
- Handle Turn Transitions:
  - At the end of each turn, increment the index and loop back if it exceeds the array length.
Example Blueprint Nodes Structure
- RollInitiative:
  - Random Integer in Range (1-20) → Add (InitiativeModifier) → Set InitiativeRoll.
- SortInitiative:
  - ForEach Loop (over active characters) → Call RollInitiative → Add to InitiativeOrder → Sort InitiativeOrder.
- NextTurn:
  - Use the sorted InitiativeOrder to determine the current character for their turn.
Conclusion
With this setup, you can effectively manage initiative rolls within your turn-based RPG. This allows for a dynamic combat experience where character order is determined by their initiative scores. Adjust the rolling logic as needed to fit your game’s mechanics, such as adding bonuses or implementing different die types.
But I genuinely disagree with this part, because one shouldn’t rely on an AI to draft out the outline of a game. It’s the same as pointing out what’s essentially common knowledge nowadays.
I am not suggesting Unreal add AI documentation that can be consulted, or something that auto-generates a template. I mean an AI agent that can scan and modify Blueprints, actor files, and UMG widgets.
Example: the other day I spent over an hour trying to figure out why a widget stopped interacting in a VR game. It turned out that, nested in the tree of widgets, a crucial layer had Not Hit-Testable (Self & All Children) instead of Not Hit-Testable (Self Only). When will I be able to write into Unreal “Homepage Widget is not interacting, please scan”, and 3 seconds later Unreal says “X Layer is set to Not Hit-Testable (Self & All Children), stupid. Should I fix it?”
So much of our time is wasted pinning down stupid little issues like that, because the engine has no ability to scan itself in relation to my goals. I can’t ask it anything in plain language. I have to use my human eyeballs to go through every possible little reason why X widget isn’t interacting (as an example).
Whilst I don’t necessarily agree with OP that this would be a particularly helpful development, relative to other development work by Unreal that could be done, I do want to mention that I don’t think the example you gave demonstrates anything other than a poor use of ChatGPT.
At the end of the day these are tools, like any others, they have their strengths and weaknesses, and it’s not about them solving everything perfectly. Good, and helpful, use of them is about applying them to the correct situation.
If you ask ChatGPT how to make a game and get a reply with the steps to make one, surely that’s it doing a relatively good job, given what was requested?
Mind you I’m not saying if you’d requested it to “build a game” it would have done it, but again that’s probably not a use case it’s well (or at all) suited to!
Edited to include quote being referenced
Whilst I don’t necessarily agree with OP that this would be a particularly helpful development, relative to other development work by Unreal that could be done
Name one thing more valuable than having an AI Agent in Unreal that can interpret natural language to scan and modify blueprint code and settings. I am genuinely curious what you could possibly have in mind that is more valuable than that.
Mind you I’m not saying if you’d requested it to “build a game” it would have done it, but again that’s probably not a use case it’s well (or at all) suited to!
The only reason ChatGPT can’t build a game with Unreal is because it can’t touch anything within Unreal. It is stuck OUTSIDE of Unreal, we need an AI Agent INSIDE of Unreal, be it GPT or something else.
I wish I could say “Unreal GPT, bring up all functions associated with punching”. But Unreal has no idea what a punch or a jump or a door opening is. It has no understanding of anything beyond how it’s tagged. I don’t think people are acknowledging the immense speed-up of having an AI agent in-engine.
That could be useful, but it kind of raises some liability concerns if it doesn’t provide you with a complete log of what it’s changed.
Granted, it’s great for giving advice. But the first answer is just a really basic overview; the next one dives a little bit deeper into it. Still, at the rate things are going, I’d rather be watching a guide on YouTube. Around a fifth of what I’m getting is kind of off-topic.
Other than that, it’s not exactly as in-depth as I’d like it to be. But that might just be me using the free version of it.
Personally, I think the ‘AI bubble’ is close to bursting.
LLM AI is auto-complete on steroids. It can only give you a combination of what’s gone before, which sounds like a recipe for monotony to me.
There is no creative element ( not to mention the coding equivalent of hands with 7 fingers etc ).
I sure hope the guys at Epic don’t think the AI bubble is bursting, because they could soon find themselves in competition with engines that create game elements from natural language, or that at minimum can be debugged and navigated with natural language (which is what I wish UE had).
Did you watch the Meta Connect that went live a few weeks ago? Meta showed off mesh generation from natural language within Meta Horizon. They are effectively making a game engine with llama integration.
It was only a couple years ago that we all thought real-time rendering was the future of linear media. Then Runway and Sora came out. Now why would studios who are focused on speed and their bottom line spend hundreds of thousands of dollars and years of time on an FBX pipeline when Runway can create the same or better in minutes? Especially for backgrounds (which is what UE was used for on The Mandalorian, for example).
The semantics of what we call AI, or your personal opinion of it, is irrelevant. It’s hard to imagine Unreal keeping up with newly emerging competition if it doesn’t incorporate AI (an algorithm that effectively understands the user’s goal from natural language input) in some form soon.
I get that you’re a big believer, I am not
Prediction: there will not be a natural language alternative to Unreal in the foreseeable future.
Yes, you will get things that make meshes or backgrounds or basic games. But you won’t have enough control to make anything decent.
Already, even with just meshes, how are you going to make a game (or background) that looks different to all the run-of-the-mill stuff around now? You can’t, because GPT (or any LLM) can’t give you things you don’t already have. It can give you a trippy mishmash of everything that’s gone before, but for real creativity you need a talented artist, thankfully.
I am gonna sidestep the philosophical discussion about what counts as “real creativity” because I don’t care. What I want is an AI agent that can scan and modify Blueprints, actor files, and UMG widgets from natural language. You either have the foresight to understand the vast workflow speed-up this would provide us all or you don’t.
Well, for one, you’d need a way for it to navigate the directory, along with accurately modifying nodes the way you told it to. But it doesn’t exactly feel worth it, in my opinion. Mainly because it’s not exactly engaging to repeatedly define a specific set of instructions and hope that it’ll work without breaking anything, as opposed to learning something new about the process.
I’m not sure if you’re aware of this, but they’ve recently managed to make an “AI version” of Doom that’s mainly based upon preexisting gameplay. As you’d expect, it’s kind of horrible at rendering enemies that are a fair distance away. In fact, it actually looks worse than the original Doom.
Yes, I did see that
I’m not against ‘AI’, but I think we have to clarify a few things
The current LLM AI is going to be great at finding cancer in scans, cross-referencing star maps, writing (some) documents, etc.
Would I like to be able to design all my widgets and user interface by just describing how it should look and work to a ‘robot’? Absolutely, yes, great labour saving device.
But do I want Epic to fall into the same trap that is currently affecting Microsoft, Google, Apple, Nvidia, and many more? No. Because these firms are just worried that they don’t have the latest toy, and so have shoehorned LLMs into their products in a way that is just going to piss off everyone and decrease productivity.
Meanwhile, I’ll be happy to have the cheese falling off my pizza, rather than the alternative
May not be 100% related but we can start adding game companies to that list; the new BF will, apparently, have some whimsical features. Here’s EA’s youngest brainchild:
We’ve had 32 engine versions FULL of new amazing features! How about a modest engine release with NO new features? Only bug fixes, workflow improvements, performance improvements, documentation fix ups? Anyone?
X: I’m an engineer!
Y: Oh cool, what type?
X: Prompt engineer
Y:
You can check what’s brewing for yourself:
https://portal.productboard.com/epicgames/1-unreal-engine-public-roadmap/tabs/94-forward-looking
With sophisticated enough tools? Sure, why not - I can see it becoming a respected vocation. Sounds better than “Cheese Enthusiast” on my resume. Node scripting (vs traditional coding) was frowned upon (still is, somewhat), but Blueprinter is an actual skill now. It just needs a fancier name:
- Node Specialist
- Reroute Expert
- Noodle Artist
Oh yeah, AI is going to be all over games… but perhaps not the in-depth dev work.
I just can’t imagine what sort of prompt you could possibly give the system to develop blueprint ( or C++ ) code for subsystems and connecting them all together, where you have to get subtle concepts across. You’ll have to spend so long correcting things and shouting at the monitor, it would have been quicker to just code it.