[Question] How to trigger a fire action using the Oculus Touch accelerometer

Good day

I have a question about a blueprint action that I would like to add to my project, if anyone knows how to do it: how can I trigger a fire action based on the raw sensor data and motion state input from an Oculus Touch controller? My goal is to use the recoil of an airsoft rifle to trigger the fire action through the Touch accelerometer. The rifle is equipped with a recoil system that I built myself around an electric actuator. It is quite strong, roughly equivalent to a 5.56 mm (M16) round. So far I have the Touch mounted on the top Picatinny rail of the rifle, and the motion translation in Unreal is quite good.

My guess is that if we add a linear acceleration listener with a specific threshold, we could capture the recoil motion and translate it into a trigger action. At the moment I'm using a USB mouse cable wired to the rifle trigger to get the fire input, but that is cumbersome and unclean. I would like to get rid of the mouse cable and go completely wireless, but I have no idea how to program the blueprint to capture that acceleration and convert it into an action. I attached a snapshot of the TouchController blueprint in which the new action has to be implemented.
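To make the idea concrete, here is roughly what I have in mind, written as UE4 C++ since I can't paste a blueprint graph inline. This is only a sketch: I'm assuming the accelerometer is exposed through the OculusHMD plugin's Get Raw Sensor Data node (UOculusFunctionLibrary::GetRawSensorData), and the threshold value, units, and class/function names are all placeholders of mine:

```cpp
// Sketch only: poll the Touch accelerometer every frame and fire when the
// recoil spike exceeds a threshold. Assumes the OculusHMD plugin's
// UOculusFunctionLibrary::GetRawSensorData (the "Get Raw Sensor Data"
// blueprint node); threshold values and units are placeholders.
#include "OculusFunctionLibrary.h"

void AMyRifle::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    FVector AngularAccel, LinearAccel, AngularVel, LinearVel;
    float TimeInSeconds = 0.0f;
    UOculusFunctionLibrary::GetRawSensorData(
        AngularAccel, LinearAccel, AngularVel, LinearVel,
        TimeInSeconds, ETrackedDeviceType::RTouch); // Touch on the rail

    const float FireThreshold = 4000.0f; // recoil spike threshold (placeholder)

    if (LinearAccel.Size() > FireThreshold && !bFireLatched)
    {
        bFireLatched = true; // latch so one recoil spike = one shot
        Fire();              // same event the mouse click triggers today
    }
    else if (LinearAccel.Size() < FireThreshold * 0.25f)
    {
        bFireLatched = false; // re-arm once the spike has decayed
    }
}
```

The latch is there because I expect a single recoil impulse to span several frames, so without it one shot would register multiple times.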

Thanks in advance to anyone who would have a solution.

Ok vr_marco, I made several attempts at assembling the nodes, but my knowledge of blueprint programming is very limited. I'm not sure which nodes are required to calculate the frame length (delta time), or how to link the Dot Product result to the Index action that leads to the firing action in the weapon blueprint. Do I need to link the Event Tick node to the new nodes to make it work? Could you provide a visual example of how you would build the blueprint? I attached two snapshots: one with the new node scheme for the sensor motion data, and one with the current nodes I'm using for the Touch trigger and mouse click. Thanks
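For anyone following along later, this is how I currently understand the suggested math, written out in C++ form since I can't share the graph inline. All names and the threshold are my own placeholders, and I may well have details wrong:

```cpp
// My reading of the suggestion: call this from Event Tick, using Delta
// Seconds as the "frame length". Differentiate the controller position
// twice to get acceleration, then use a Dot Product to project it onto
// the barrel axis so only backward (recoil) motion counts.
void AMyRifle::DetectRecoil(float DeltaSeconds)
{
    const FVector Position = MotionController->GetComponentLocation();
    const FVector Velocity = (Position - PrevPosition) / DeltaSeconds;
    const FVector Accel    = (Velocity - PrevVelocity) / DeltaSeconds;

    PrevPosition = Position;
    PrevVelocity = Velocity;

    // Recoil pushes the rifle backward along the barrel.
    const FVector BackwardAxis = -MotionController->GetForwardVector();
    const float RecoilAccel = FVector::DotProduct(Accel, BackwardAxis);

    if (RecoilAccel > RecoilThreshold) // placeholder value, tuned by testing
    {
        Fire(); // hand off to the same firing action in the weapon blueprint
    }
}
```

If I read it correctly, that would also answer my own question: yes, the whole chain hangs off Event Tick, with Delta Seconds supplying the frame length for the divisions.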

Excellent, thanks a ton! My Touch motion controller is actually mounted upside down and inclined in order to fit the scope ring on the Picatinny rail. Because of that I will have to fiddle with the vector and math values, but we're close to the end result. I attached a picture of the rifle configuration, and I uploaded a video demo of the Oculus Touch tracking capability to OneDrive.
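In case it helps anyone with a similar mount, my plan for the upside-down, inclined orientation is to apply a fixed rotation offset to the measured vector before the dot product, rather than retuning all the thresholds. Sketch only; the angles below are guesses for my particular mount:

```cpp
// Sketch: compensate for the Touch being mounted upside down and tilted.
// A fixed rotation re-expresses controller-space vectors in rifle space
// before the recoil projection. The angles are placeholders to be
// measured on the actual rig.
const FRotator MountOffset(
    -30.0f,  // pitch: incline of the scope-ring mount (guess)
    0.0f,    // yaw
    180.0f); // roll: controller is upside down

FVector ToRifleSpace(const FVector& ControllerSpaceVector)
{
    return MountOffset.RotateVector(ControllerSpaceVector);
}
```

That way only the offset needs to change if I remount the controller, not the threshold or the dot-product logic.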

With the recoil recognition, this simulation solution will be complete. A nice-to-have for people like me who want to maintain their target shooting skills in the off-season :D. Once the rifle configuration is done, the next step is to build a pistol simulator, also based on recoil input, using a CO2 blowback system. I'm currently working on a target range made in Unreal that will implement these controller-based assets.