Haptic feedback

Hi Guys,

I'm trying to trigger vibration from the Haptic system to get a more subtle feel than with the regular Force Feedback.

However, I haven't managed to get it working on iOS. I've used the Play Haptic Effect node with the Player Controller as the target, but nothing happens.

Does anyone have clues on how to solve this problem?

Any help? I promise a cute picture of cats :wink:

Hi everyone,
Still no news on triggering haptic vibration on iOS from Blueprint? I haven't found a solution on my side.

Hey Hathis,

Newb here, but I did see your post and decided I'd go look and see what I could find. It appears the haptic feedback system can be adjusted using frequency and amplitude curves. This would give you the ability to adjust the strength of the vibration up or down based on the curve values. I would look at creating a Haptic Feedback Effect Curve blueprint; it's found under the Misc category in the right-click menu. I suspect the frequency curve adjusts how fast the effect pulses and the amplitude adjusts the force.

I would try hooking this up to your blueprint setup and set the curve value for amplitude at Begin Play. Once you get this working, I would also set up the buffer to accept multiple haptic events and keep track of their order; the haptic buffer is also under the right-click menu. I would think in most cases you'd want a sound to play when your haptic event fires, so there is also a haptic sound blueprint.
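For anyone trying to reason about what those frequency/amplitude curves actually do, here is a rough sketch of the evaluation as plain piecewise-linear interpolation. The function and key layout are illustrative only, not the actual engine API:

```python
# Illustrative sketch of evaluating a haptic effect curve over time,
# in the spirit of UE's Haptic Feedback Effect Curve. The names and
# key format here are made up for illustration, not engine types.

def eval_linear_curve(keys, t):
    """Piecewise-linear interpolation between sorted (time, value) keys."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)

# A 1-second effect: constant 120 Hz frequency, amplitude ramping 0 -> 1.
frequency_keys = [(0.0, 120.0), (1.0, 120.0)]
amplitude_keys = [(0.0, 0.0), (1.0, 1.0)]

print(eval_linear_curve(amplitude_keys, 0.25))  # 0.25
print(eval_linear_curve(frequency_keys, 0.5))   # 120.0
```

The engine would sample both curves each frame and feed the resulting frequency/amplitude pair to the device; the sketch just shows what "set points on a linear curve" amounts to numerically.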

Thank you very much xpgained. I'll investigate your proposal, thanks again.

Hey dude, I’m going to take a crack at this sometime in the next week. I will be happy to share with you what I come up with. I bet we could get this working together no problem. How far have you gotten? I’m fixing my build to get it deploying again. After that I am going to focus on adding haptic feedback for a simple click event.

Hi partner,

I've created the Haptic Feedback Effect Curve and triggered it with the Play Haptic Effect node (linked to Get Player Controller).

Unfortunately, nothing happens. I'll keep tweaking the frequency and amplitude and let you know.

Sweet, dude. I'm gonna try to work on it today. I got my app updated and building again. I'll post some screens of my blueprints and settings if I get anywhere.

I have a feeling I'm exactly where you are. I've tried implementing this in several places in the standard AR template, in the player pawn that has all the object placement logic. The standard AR template has logic in the pawn that detects whether you hit world geometry or hit an actor. I'm trying to play the same haptic curve for both scenarios, but switching the sounds to show success or failure when placing an object in the world. I know we are overlooking something; I just don't see it yet.

I think it involves setting up a haptic feedback buffer and storing a feedback event every time we play the haptic feedback. As for my haptic curve, I set the frequency at 120 for the whole 1 s duration. For amplitude I did a span from 0-1 s and set points at 0, .25, .5, .75, and 1. I also made it linear, a straight line with no curve, so it would be more poppy and not slowly build up.

I do think the answer is in the buffer. We need a buffer to track all the plays so the system knows if an event is already active or not. I created the haptic buffer under the right-click menu and added about 10 nodes to the list, which should keep track of 10 events at a time. I'm not sure if that will be enough, since the haptic curve and haptic sound are both considered play events. Progress is ongoing. Continuing research.

As the great Sherlock Holmes would say… “The game is afoot!”

@Hathis Ok dude, this is where I'm at. After doing some additional research in the Unreal documentation, I noticed there was nothing listed on haptic feedback. There is, however, a section on force feedback that says it can be implemented on iOS. I also remembered playing around with the Match 3 Unreal sample project and having at least some sort of feedback, so I went back into that sample project and started poking around.

I believe I now have a force feedback effect playing when I click on screen, and it plays a sound wave I have selected. When I click anywhere in the environment that has no detected surface, I get a feedback tick and a sound file is played. When I click on an area that has been detected, a mesh is placed down, and the feedback tick and the selected sound play. I know the feedback effect is working because if I mute the phone and initiate the touch event, I can still feel the tick. The problem is the force feedback curve I made isn't very good; I don't have much control yet until I figure out the channels and curve editing. Here is my blueprint.

Ok, so now that I have force feedback working, it's time to try to switch back to haptic feedback. I think we have two paths to better vibration results: we can dive into the force feedback channels and curves and see what kind of patterns and effects we can make, or we can dive deeper into the haptic system and see if we can get it working. I say we do both.

I had a setup for haptics implemented like the force feedback above, but I wasn't getting anything to function.

I'm sure Nicolas Cage would just say "the secret lies with Charlotte" and figure this out in an instant. I'll keep chugging on this.

After doing a bit of technical reading about the Taptic Engine, I'm wondering if our frequency is wrong and that's why we're getting no effect. I found a Taptic Engine manufacturer, and they listed the resonant frequency their engines operate at.

Here is the paragraph with the info…

“Our Linear Resonant Actuator (LRA) vibration motors are an excellent choice for haptic applications requiring a device with high reliability and exceptionally long life. LRAs have an internal mass that oscillates back and forth along the X-axis at its resonant frequency. They are an excellent alternative to brushless vibration motors, as the only internal part subjected to wear/failure is the spring. Unlike conventional brushed DC vibration motors, linear resonant actuators must be driven by an AC signal at the device's resonant frequency. There are companies that make IC drivers for linear vibration motors that supply the correct drive signals and contain a library of haptic effects you can choose from. TI makes LRA driver ICs; we can supply you with an evaluation board that incorporates the TI DRV8601 or DRV2603 haptic driver IC. Contact us for info. It should be noted that, unlike brushed ERM vibration motors, varying the amplitude of the applied voltage will only change the amplitude of the vibration force, not the frequency of vibration. The frequency of vibration is fixed at the LRA's resonant frequency, which in the case of our LRAs is either 180 or 200 Hz. Due to the LRA's high Q, applying a frequency above or below the resonant frequency of the LRA will result in the LRA producing a lower vibration amplitude, or none at all if far from the resonant frequency.”

So basically, to get the haptics to work, we have to vibrate the spring at the resonant frequency of the Taptic Engine. I'm going to try the 180 or 200 Hz mentioned above. I would think this means the frequency curve is actually a straight line for the duration of our effect, and the amplitude adjusts the force and creates the pattern of the taps.

Here is another article explaining haptics and how the frequency and amplitudes function.

Going by that description above, sending a curve will just slowly shift the mass into the positive along its axis, when what you really want to do is thrash it back and forth.

So it sounds like you might have to do something like pulse width modulation. Instead of sending a variable curve that attempts to change the voltage, you might need to send an oscillating wave that varies in strength, such as a sound wave. It'll need to be really low-frequency data, in the 180-200 Hz range mentioned. Rather than vary the frequency, you'd vary the amplitude of the data sent in order to produce, say, a tapping effect.
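A minimal sketch of that idea: the carrier stays locked at the resonant frequency, and only an amplitude envelope changes to make the "taps". The sample rate and envelope here are arbitrary choices for illustration, not anything iOS- or UE-specific:

```python
import math

# Fixed 200 Hz carrier (the LRA's resonant frequency) whose amplitude
# envelope makes the tap pattern. All constants are illustrative.
SAMPLE_RATE = 8000   # samples per second (assumed)
CARRIER_HZ = 200     # fixed at the resonant frequency
DURATION = 1.0       # seconds

def tap_envelope(t):
    """Two short bursts: full strength for 0.1 s at t=0.0 and t=0.5."""
    return 1.0 if (0.0 <= t < 0.1) or (0.5 <= t < 0.6) else 0.0

samples = []
for n in range(int(SAMPLE_RATE * DURATION)):
    t = n / SAMPLE_RATE
    carrier = math.sin(2 * math.pi * CARRIER_HZ * t)
    samples.append(tap_envelope(t) * carrier)

# The frequency never changes; only the amplitude does.
print(max(abs(s) for s in samples))
```

That `samples` array is exactly the kind of data a 200 Hz-capable channel (like audio) could carry, which is why playing a low-frequency sound wave is the easy way to test this.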

Because of the frequencies involved, you'll probably need to hook into the audio engine. From there you could straight-up play a WAV file. Is it easy enough to pull a raw stream from a sound file in UE?

Edit: the second link pretty much confirms it by calling it a voice coil. It's like a tiny low-frequency speaker.

@Antidamage So am I correct in saying that the frequency should be a straight line at the required frequency, like 180 Hz, and then the amplitude graph should look like a sound wave?

I've been looking through the plugins and settings for things we might need to turn on. Under the iOS settings there is some audio stuff to flag.

I'm also still confused by the buffer, since it's the same type as the haptic effect and the sound. Why would you just play a buffer?

Yes, that’s what I think it needs. I wouldn’t assume that it’s currently supported unless you’ve seen that written somewhere specifically.

The reason you'd play a sound is because it can already send 200 Hz data. You could also write a module that ticks at 200 Hz and sends alternating pulses of a curve, but that seems like more work just to test the idea out.
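A quick sketch of the tick-and-alternate approach, with one caveat: flipping the sign once per tick produces a square wave at half the tick rate (one full cycle takes two ticks), so to drive at 200 Hz you'd tick at roughly 400 Hz. The tick rate and amplitude here are assumptions for illustration:

```python
# Sketch of "tick and send alternating pulses". Flipping sign each tick
# gives a square wave at HALF the tick rate, so a 200 Hz drive needs
# about 400 ticks per second. All values are illustrative.
TICK_RATE = 400      # ticks per second (assumed)
DURATION = 0.05      # seconds

def pulses(tick_rate, duration, amplitude=1.0):
    """Alternate +amplitude / -amplitude once per tick."""
    values = []
    sign = 1.0
    for _ in range(int(tick_rate * duration)):
        values.append(sign * amplitude)
        sign = -sign
    return values

wave = pulses(TICK_RATE, DURATION)
# Count falling edges (+ to -) to estimate the resulting frequency.
cycles = sum(1 for a, b in zip(wave, wave[1:]) if a > 0 > b)
print(cycles / DURATION)  # ~200 Hz
```

Which is another reason the audio path looks easier: a sound file already carries a properly sampled waveform, without hand-rolling a high-rate tick.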

Keep in mind, it sounds like if you straight-up send the maximum voltage at the right frequency, all you'll probably do is pull the driver to one side and have it sit there.

It also might be that you don't need any of this, and on some level the device handles it for you. It's probably time to reach out to an Apple developer on the Apple forums.

I really should have scrolled back earlier. I saw you discussing the taptic engine and I thought we were talking about that.

Play Haptic Effect is for Vive controllers, not iOS devices. This discussion would still be valid for an Apple Watch app, though.

Ok, so Play Haptic Effect, Play Haptic Sound, and Play Haptic Buffer are used for dedicated AR and VR controllers and not mobile? For mobile haptics we should stick to force feedback?

If so, this makes life easier because I now know what I need to do and test.

Out of curiosity, how did you determine what the Play Haptic nodes were supposed to be used for?