Jeez Rama… you’re on fire! And for sure, I’ll remind ya!
Speaking of tracing: one of the things that’s been bugging me lately is that if you don’t want to use physics but still want interaction between actors, you have to use overlaps (since hit events don’t fire without physics). A shortcoming of that is you end up having to trace between the overlapping actors, and as such you end up tracing between the pivots of the actors/components, giving you inaccurate overlap-point normals (not to mention no impact point).
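To illustrate the workaround I mean: since a pivot-to-pivot trace gives no real impact point, you can at least approximate one by intersecting that pivot-to-pivot ray with a bounding sphere around the other actor. This is just a standalone sketch of the idea (the `Vec3` type and function names are mine, not engine or plugin API):

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for FVector so the sketch is self-contained.
struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    double Dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    double Length() const { return std::sqrt(Dot(*this)); }
    Vec3 Normalized() const { double l = Length(); return {x / l, y / l, z / l}; }
};

// Approximate the impact point/normal of an overlap by intersecting the
// pivot-to-pivot ray with a bounding sphere around the other actor.
// Returns false if the ray misses the sphere entirely.
bool ApproxImpact(const Vec3& myPivot, const Vec3& otherPivot, double otherRadius,
                  Vec3& outPoint, Vec3& outNormal) {
    Vec3 dir = (otherPivot - myPivot).Normalized();
    Vec3 toCenter = otherPivot - myPivot;
    double proj = toCenter.Dot(dir);                       // distance along ray to closest approach
    double distSq = toCenter.Dot(toCenter) - proj * proj;  // squared miss distance
    double rSq = otherRadius * otherRadius;
    if (distSq > rSq) return false;
    double t = proj - std::sqrt(rSq - distSq);             // first intersection
    outPoint = myPivot + dir * t;
    outNormal = (outPoint - otherPivot).Normalized();      // sphere surface normal
    return true;
}
```

It’s still “hacky and restrictive” in exactly the way described above (a sphere is a crude proxy for the real surface), but it does give you a surface point and a normal instead of a pivot.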
If you ever come across that problem feel free to write a better overlapper that returns more data :). I’m ok-ish with my current hacky solution, it’s nasty and restrictive but works well enough. Sort of.
There’s gold in dem here parts of the woods anyway. Good luck with Solus!
As per your request, I have now made a node to get the rotated, scaled, and translated positions of all vertices of a static mesh!
This node is live in the most recent plugin version!
Please refer carefully to my picture of how to set up this node correctly!
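For anyone curious what such a node has to do under the hood, here is a standalone sketch (my own simplified types, not the plugin’s code): each local-space vertex gets the component’s scale, then rotation, then translation applied. Rotation is limited to yaw about Z here to keep it short; the real component transform uses a full quaternion.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

// Apply scale, then a yaw rotation about Z, then translation to one vertex.
Vec3 TransformVertex(const Vec3& local, const Vec3& scale, double yawRadians,
                     const Vec3& translation) {
    // 1) Scale in local space.
    Vec3 v{local.x * scale.x, local.y * scale.y, local.z * scale.z};
    // 2) Rotate about the Z axis.
    double c = std::cos(yawRadians), s = std::sin(yawRadians);
    Vec3 r{v.x * c - v.y * s, v.x * s + v.y * c, v.z};
    // 3) Translate into world space.
    return {r.x + translation.x, r.y + translation.y, r.z + translation.z};
}

std::vector<Vec3> TransformAllVertices(const std::vector<Vec3>& verts,
                                       const Vec3& scale, double yaw,
                                       const Vec3& translation) {
    std::vector<Vec3> out;
    out.reserve(verts.size());
    for (const Vec3& v : verts)
        out.push_back(TransformVertex(v, scale, yaw, translation));
    return out;
}
```

The order matters: scale → rotate → translate matches how a component transform composes, so applying them in a different order would give different (wrong) world positions.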
**Congrats on your first post, welcome to the forums!**
Thanks for the updated nodes. You mentioned earlier the challenges in returning the vertex color information of vertices. Please bear with my ignorance :) … I’m still new to a lot of things here. I have made a Master Material with lots of procedural controls to allow material transitioning, but it isn’t giving me the aesthetic appeal I am trying to achieve.
If you are able to gather vertex positions and transformation information, why do you say it is difficult to set and get vertex color information at runtime? You also mentioned dynamic vs. static meshes. By making a static mesh “Movable”, doesn’t it become dynamic? Also, in the Blueprint Communication content example, there is an ice sphere that melts when the player fires fireballs at it. Using World Position Offset, the ice actually morphs away into little droplets. Isn’t that modifying a static mesh dynamically at runtime?
I need to tie Dynamic Material Instancing into dynamic vertex color info. I am trying to transition the material outward from where I click on the mesh. So, for example, if I click on a spot, I will set that vertex’s color to red, and each nearby vertex will turn red in turn as the fire propagates. This will reveal the underlying material.
If you can clarify my doubts further, I would deeply appreciate it. I am not able to fully understand the technical difficulties here.
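The propagation logic itself is the easy half, and a quick standalone sketch may help separate it from the hard half. Everything below is hypothetical illustration, not engine API: given a clicked point and a growing “fire radius”, mark every vertex within the radius as burning. The genuinely difficult step, which this sketch deliberately omits, is pushing those per-vertex colors back into the mesh’s color buffer at runtime, which is the limitation being discussed above.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vertex {
    double x, y, z;
    bool burning;  // stands in for "vertex color set to red"
};

// Mark vertices within fireRadius of the click point as burning.
// Returns how many vertices newly caught fire this call.
int PropagateFire(std::vector<Vertex>& verts, double cx, double cy, double cz,
                  double fireRadius) {
    int newlyBurning = 0;
    for (Vertex& v : verts) {
        double dx = v.x - cx, dy = v.y - cy, dz = v.z - cz;
        if (!v.burning && std::sqrt(dx * dx + dy * dy + dz * dz) <= fireRadius) {
            v.burning = true;  // would become: write red into the vertex color buffer
            ++newlyBurning;
        }
    }
    return newlyBurning;
}
```

Calling this each tick with a radius that grows over time gives the spreading-fire effect; the material would then read the red channel to blend in the underlying material.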
I’m a bit new to this part of UE4; I’ve been focusing on class Blueprints rather than Level Blueprints, so I’d like to get a better understanding of what’s going on with spawning actors in levels. I should also note that I’ll mainly be doing this with AI characters.
There are two main ways I can see spawning actors in Sub Levels being useful, based on 6+ years of UE3 experience.
Firstly, if you have several people working on a map, it’s good to have a _Design level as well as _Static and _Audio levels. This was common working practice among some UE3 devs; the persistent level would only hold minimal data.
On a slightly rarer note, it was something I used extensively in UE3. I was working on an open-world game which was split into a number of levels, each a cube in size. Sub levels for each could be named so that they would only appear during particular missions or parts of the story. This gave me greater control over resources across the course of the game (bloody handy on console), but it also let me do things like load certain enemies earlier so that they could be seen at greater distances, usually because their position in the world was impossible to hide.
HOWEVER, I would never script the spawning of actors in one level via an action/node in another. I would use an event in the sub level instead (usually Level Loaded & Visible / Begin Play), because the sub level would also hold the behaviour logic and other data for that actor.
So, does the ‘can/should only spawn actors in the persistent level’ restriction affect all kinds of actors? Can I already get away with doing this with AI?
And if not, how much work would a coder have to do to let me solve this problem?
I am giving you a full sample project as a download (8 MB) where I’ve created a fully functional UMG key rebinding system!
You can click on the name of any key and then simply enter a new one on the keyboard/gamepad (just press the button that you want to be the new binding)!
And I do track Ctrl, Alt, Shift, and Command!
And the list is scrollable too!
**Rebind Actions During Runtime**
And best of all, because of my Victory BP Library nodes, the changes you make in the Key Rebinding Menu instantly update the ingame character input component!
So if you rebind Jump from spacebar to page up, it takes effect instantly!
**Why Am I Giving This For Free?**
Because it’s a really important feature for any game, and I just finished figuring out how to do it for Solus.
I figured you would enjoy it as well, since it was honestly not that easy to do; I had to really think about how to do both the UMG and the actual C++ code to dynamically update action mappings during runtime.
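The core bookkeeping idea can be sketched standalone. This is not the plugin’s actual code (which works against UE4’s input action mappings); the class and method names below are illustrative. The point it demonstrates: if every input lookup goes through one mapping table, then rewriting that table is all a rebind has to do, and the change takes effect on the very next key press.

```cpp
#include <cassert>
#include <map>
#include <string>

// Simplified sketch of runtime key rebinding: actions map to keys, and
// because input checks read the same table that Rebind writes, a rebind
// is "live" immediately, with no restart or re-registration step.
class ActionMappings {
public:
    void Bind(const std::string& action, const std::string& key) {
        keyForAction[action] = key;
    }

    // Returns false if the action was never registered.
    bool Rebind(const std::string& action, const std::string& newKey) {
        auto it = keyForAction.find(action);
        if (it == keyForAction.end()) return false;
        it->second = newKey;
        return true;
    }

    // Called whenever input arrives: is this key currently bound to the action?
    bool Matches(const std::string& action, const std::string& pressedKey) const {
        auto it = keyForAction.find(action);
        return it != keyForAction.end() && it->second == pressedKey;
    }

private:
    std::map<std::string, std::string> keyForAction;
};
```

In the engine itself the equivalent data lives in the project’s input action mappings rather than a plain map, and the extra work Rama describes is getting the running character’s input component to pick up the edited mappings, which is what the Victory BP Library nodes handle.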
**How To Use My Menu**
Go in game and press the K key!
Click on the black and red buttons to rebind the jump button!
Add new actions any time using **Project Settings->Input**
This was already incredible. Now having access to rebindable keys at runtime AND a free sample project demonstrating how to achieve that in UMG is just all kinds of awesome :D.
Key rebinding is something many players seem to want, but there wasn’t any indication it was even possible in UE4 (at least via Blueprints), so thank you very much for another incredible addition to your plugin. Thank you as well for all the Blueprint nodes you’ve already made available; I’m incredibly happy that your plugin exists and grateful for your continued expansion of it :)!