We’re a small team of students (~12 graphics designers) working on our first VR game, and I’m currently trying to figure out a way to drive a MetaSound (I think this would be the best approach, though I could be wrong) via Control Rig.
The player is onboard a mech and, with the press of a button, controls the mech’s arm: the motion controller’s transform is scaled up to the mech’s proportions and fed to a Control Rig IK for each hand, so the mech reproduces the player’s hand movements at a much larger scale.
What I’d like to achieve is to play some sort of piston-like sound whenever the arms move, so the trigger isn’t really event-based, since the player is free to move their hands however they wish.
I thought about taking the difference between the IK transforms at two points in time to detect whether the arm is moving, but I figure this could get quite expensive to run every frame, so it’s not really a viable option.
If any of you have an idea, I’d very much like to hear it.
Thanks in advance
MetaSounds can be driven by Blueprints / code / Sequencer through the Audio Component playing them. You’ll probably need a few events to trigger and stop the sound, with some kind of start-loop-end mechanism, but while it’s playing you can use Control Rig data or curves to manipulate the sound on the fly.
This is a field I’m currently exploring myself, as my WIP game relies on audio events and data to control character animations, so it’s the same route, just in the other direction. Please feel free to PM me and chat about it!
I’d suggest checking out the Audio Modulation system, which lets you map real-time parameters to modulators that control audio behavior like volume / pitch / filtering, driven by any game-world parameter such as time of day or arm movement velocity. Modulators can also be used directly inside the MetaSound graph, and they’re part of UE’s next-gen audio mixing toolset.