I am creating a game which has elements that react to sound.
I would like to have a central “engine” which does the sound-analysis, which I can then use in different actors to make them sound-reactive.
What is the best way to do this?
Right now I have an Actor BP doing the sound analysis and writing the output to a variable inside that Blueprint, which I then read from elsewhere. But this approach seems buggy and keeps crashing my project.
Add a “Pawn Sensing” component to any pawn and you will have access to hearing, i.e. basic sound detection. You can detect sound from any source (thrown physics objects, etc.) or just from pawns: search for the “Make Noise” function, or add a “Pawn Noise Emitter” component. From my experience, it's best to work with what the engine already gives you instead of reinventing the wheel.
As to your title: GameInstance is meant for variables that persist across the whole game, including level changes. If you only need the interaction within one level, you can inherit from UObject, add your sound-analysis functionality to it, spawn one instance in the world, and pass sound data to it. Or you can do the analysis directly in the GameInstance.