EyeX Plugin for Unreal Engine 4 [eye tracking]

Hello fellow Unreal users,
I work at Tobii, the global leader in eye tracking, and we have teamed up with SteelSeries to bring eye tracking to consumers worldwide. Our vision is that, in time, this technology will be inside every computer.

Our core philosophy is non-intrusive eye tracking, so everyone can use it. We have accomplished a lot in the assistive technology field by replacing keyboard/mouse with eye tracking, but for the consumer market and games we will keep the keyboard/mouse and use the eye tracker as a third input for more natural and immersive experiences.

This means you, as a developer and gamer, will be given more options for gameplay. The center of interaction no longer has to be the center of the screen; it could instead be wherever you look. Check out Son Of Nor, by one of Tobii's third-party game developers, for an example of this.
For a more applied example in gaming, check out the following link.

http://fablesinmotion.com/videos.html

We love games here at Tobii, and there are so many ways eye tracking can make games more exciting and immersive.
That being said, we do need more external developers using our SDKs. With our dev kit you can explore how eye tracking can make your game more intense and intriguing.

With Epic's new payment plan, the Unreal Engine user base is growing like never before, and we expect that growth to continue. We would like to be part of that growth alongside the developers. We are therefore working on an eye tracking plugin for UE4, and we will support it as soon as we possibly can through our dev zone at developer…com.

We have now released the UE4 plugin! Visit https://github.com/TobiiTechnology/EyeXforUE4/ to clone the GitHub repo.

Post your feedback in this thread, as I will be monitoring it carefully.

Teaser:

Your feedback in this thread will give us the necessary info about what YOU, as a user of the plugin, want or need for developing your product.

If this seems interesting to you, you can buy a dev kit at http://www…com/eyex

In Q1, SteelSeries will launch the Sentry eye tracker using Tobii hardware. (http://steelseries.com/products/other/steelseries-sentry-eye-tracker)

That’s actually very cool

Funnily enough, I was thinking the other day how immersive it would be to use an eye tracker in multiplayer games, so your friends don't look like robots anymore, and how I would implement it. I may consider getting the dev kit just to make a proof of concept. Should be easy to do if you've already solved the input to the engine. :slight_smile:

Edit*
Ordered one. Was not as expensive as I first thought it would be. Can’t wait to play around with it.

So can I assume this plugin will not work if you are wearing the Oculus, since it can't see your eyes? Also, could you compare/contrast the differences between Tobii and Faceshift? BTW, Fablesinmotion just blew my mind. Are there any additional examples of Tobii in action? What would be really interesting would be to see the CouchKnights demo using Tobii!

I apologize for the additional questions; this is very new to me and fascinating. How far can you be from the eye tracking device? For example, could you put it in a conference room on top of a projector so you are 3-5 feet away? Can you set control parameters? For example, does it register blinks? Say, 2 blinks for yes, a long blink for no. Can it work with more than one person at a time on a single device, or would you need multiple trackers? What is the average tracking latency? Could I play a Pong-type game, moving the paddles quickly enough with my eyes? A free UE4 tutorial in the marketplace or a guest appearance on the live Twitch stream would also be very useful.

I would love to see UI (Slate/UMG) interaction using your device, so the gaze data generates "OnMouseLeave/Enter" events, for instance.

Any idea when the plugin is going to be ready?

If you can track the eyes, can you track the direction the face is pointing in the same way?

Very cool.
This technology would fit nicely into a VR setup of some kind :stuck_out_tongue:
In a non-VR environment, it would be great as an interface for people unable to use traditional mouse/keyboard/controller setups.

No need to apologize! I'll try to answer the questions one by one.

The optimal distance from the EyeX eye tracker is 60-80 cm, though you can be a little further away. You could probably be 3 feet away, but beyond that you'll notice the distance limit. This particular device is optimized for desktop/laptop use.

It does register blinks, since we track whether your eyes are open or not. You can then do 2 blinks for yes, a long blink for no. We do not endorse this use, since it's an unnatural behaviour and carries a risk of false positives. But if you want to do it - go right ahead. :slight_smile:
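As an aside, the "2 blinks for yes, long blink for no" idea could be sketched on top of the eyes-open signal like this. This is plain C++ with thresholds and names of my own invention, not part of the EyeX SDK:

```cpp
// Classify a blink by how long the eyes stayed closed, derived from the
// tracker's "eyes open / not open" signal. The thresholds are illustrative
// only; they would need tuning against real data.
enum class Blink { None, Short, Long };

Blink ClassifyBlink(double closedSeconds)
{
    if (closedSeconds < 0.05) return Blink::None;   // too brief: jitter/noise
    if (closedSeconds < 0.40) return Blink::Short;  // deliberate quick blink
    return Blink::Long;                             // held-closed "long blink"
}

// "2 blinks for yes": true when a second short blink starts within
// windowSeconds after the first one ended.
bool IsDoubleBlink(double firstBlinkEnd, double secondBlinkStart,
                   double windowSeconds = 0.6)
{
    double gap = secondBlinkStart - firstBlinkEnd;
    return gap >= 0.0 && gap <= windowSeconds;
}
```

As the post notes, this kind of input is prone to false positives, which is exactly why the thresholds matter so much.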

The tracker tracks the person in front of the computer, using smart algorithms to determine who the most probable user is. It doesn't support tracking several people at once; you would need several trackers at the moment.

We currently run at 60 Hz, but we're investigating other rates as well.

Yes, you could play Pong with just your eyes! A coworker made a single-player version of Pong where the paddle moves toward the side you are looking at (the paddle chases your gaze).
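The gaze-chasing paddle described above can be sketched in a few lines of plain C++ (no engine code; the function name and speed clamp are my own illustration):

```cpp
#include <algorithm>

// Move the paddle's vertical position toward the gaze point's Y at a
// fixed maximum speed, so the paddle "chases" where the player looks
// instead of snapping there instantly.
double StepPaddleTowardGaze(double paddleY, double gazeY,
                            double speed, double deltaSeconds)
{
    double maxStep = speed * deltaSeconds;        // farthest we may move this frame
    double diff = gazeY - paddleY;                // signed distance to the gaze
    return paddleY + std::clamp(diff, -maxStep, maxStep);
}
```

Called once per frame with the tracker's gaze Y, this gives smooth pursuit rather than a jittery 1:1 mapping of raw gaze data.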

I'll definitely look into the live Twitch stream and see if they would be interested in having us on there, thanks!

That is correct: the Oculus and the EyeX cannot be used at the same time.

I'll take a shot at comparing the two devices.
Faceshift is more focused on 3D programs such as Maya and 3ds Max. They are targeting motion capture developers, so the camera they use is probably very high resolution, and the price probably reflects this. From what I've seen it's mostly aimed at animating the face and speech, and as a Maya user, I know how tiresome that can be to do manually. We don't get the pitch of the head (up and down), but we can track the other head movements as long as the eye tracker is following you. Faceshift has some basic eye tracking, but I don't know if they track the eyes relative to the monitor or just relative to the rest of the face; I would guess the latter. The EyeX device tracks where you are looking on the screen as well as where you are in 3D space. Those are the main differences off the top of my head.

The OnMouseLeave/Enter events are doable :slight_smile:
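Conceptually it comes down to hit-testing the gaze point against each widget's geometry every frame and firing an event on state change. A minimal engine-free sketch, with struct and function names that are my own and not part of Slate/UMG or the plugin:

```cpp
#include <string>

// A screen-space rectangle standing in for a UI widget's geometry.
struct Rect { double x, y, w, h; };

bool Contains(const Rect& r, double gx, double gy)
{
    return gx >= r.x && gx < r.x + r.w && gy >= r.y && gy < r.y + r.h;
}

// Compare last frame's and this frame's gaze position against the widget
// and report which hover event (if any) it should receive.
std::string GazeHoverEvent(const Rect& widget,
                           double prevX, double prevY,
                           double curX, double curY)
{
    bool was = Contains(widget, prevX, prevY);
    bool now = Contains(widget, curX, curY);
    if (!was && now) return "OnMouseEnter";
    if (was && !now) return "OnMouseLeave";
    return "";  // no transition this frame
}
```

In practice you would likely want to debounce with a small dwell time so saccades skimming across a widget don't spam enter/leave pairs.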

The estimate is just a few weeks away. Our focus is on getting everything stable. I will definitely update this thread as it gets closer to ready!

Yes, the device uses facial recognition algorithms, so you can use its functions to find out the direction of the head.

Glad you like it!

You can get true eye contact and make AI that reacts a lot better to what the user is thinking/doing. Humans tend to look at what they are going to use next, so you could make some interesting AI reactions to this. You could even make different regions on the AIs trigger different reactions. It would be interesting to see this concept used in a tavern/bar. We have support for C/C++, .NET and Unity at the moment, but as indicated earlier, we are close to completing the UE4 plugin as well.

Something I would like to see, for example, is being able to see where your teammates in multiplayer are looking. I personally can't remember how many times I've had to explain where I want them to go. If I do this in real life, people know where I want them to go just by looking at where I'm looking. Subtle things that make a world of difference.

If you have any questions or problems don’t hesitate to ask on the dev zone!

developer…com

We have a lot of hardware for assistive uses, so you are spot on, sir.

The current push with the EyeX focuses on traditional mouse/keyboard/controller setups, using the eye tracker as an additional input alongside the ones we already use today.

[QUOTE={};165146]
No need to apologize! I'll try to answer the questions one by one. …
[/QUOTE]

Thanks for the quick reply. Is there a place I can leave my email to get notified when the UE4 plugin is released?

You could send it to me in a private message if you want to be notified by email. :slight_smile:


The plugin is now live!
There is a blog post about it on our dev zone, which also links to the GitHub repo with the plugin.

This new addition to the EyeX SDK provides developers with an exciting set of features to easily and seamlessly create gaming experiences that were never possible before.

"With the new EyeX plug-in for Unreal Engine 4, integrating eye tracking in games becomes very easy. The integration into the UE4 environment has already been done for you, and you simply connect with the plug-in," said Oscar Werner, vice president of Tobii Technology. "Our goal is to give developers tools that enable them to pioneer innovations with eye tracking in a wide range of consumer and enterprise software - from entertainment and gaming to office and productivity tools. This plug-in is the next step in this progression."
The EyeX plug-in for Unreal Engine 4 serves as an extension to Tobii's current library of development tools, which includes the EyeX SDK for Unity, the EyeX SDK for .NET, and the EyeX SDK for C/C++. It gives game developers a set of easy-to-use tools so they can unleash their creativity and design talent to create truly immersive gaming experiences only made possible with eye tracking.

Features of the EyeX plug-in for Unreal Engine 4:
• Gaze point: The point on the screen where the player’s eyes are looking.
• Fixations: Points on the screen where the player’s eyes linger to focus on something.
• Eye positions: The positions of the player’s eyeballs in 3D space.
• User presence: Whether there is a player in front of the screen or not.
• Automatic detection: Indication of which actor has the player’s gaze focus.
• Blueprint and C++ support: All of the EyeX plug-in features are readily available through both UE4’s visual scripting editor and from C++.
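For the curious, the "fixations" feature above can be approximated with a classic dispersion-threshold (I-DT) check over a short window of gaze samples. This sketch is plain C++ of my own and is independent of the plug-in's actual API:

```cpp
#include <vector>
#include <algorithm>

struct GazeSample { double x, y; };

// Dispersion-threshold test: a window of gaze samples counts as a
// fixation if the points stay within a small bounding box.
bool IsFixation(const std::vector<GazeSample>& window, double maxDispersion)
{
    if (window.empty()) return false;
    double minX = window[0].x, maxX = window[0].x;
    double minY = window[0].y, maxY = window[0].y;
    for (const GazeSample& s : window) {
        minX = std::min(minX, s.x); maxX = std::max(maxX, s.x);
        minY = std::min(minY, s.y); maxY = std::max(maxY, s.y);
    }
    // I-DT uses (width + height) of the bounding box as the dispersion.
    return (maxX - minX) + (maxY - minY) <= maxDispersion;
}
```

At 60 Hz, a window of roughly 6-12 samples (100-200 ms) is a common starting point; the plug-in's built-in fixation stream would save you from tuning this yourself.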

To learn more about the EyeX plug-in for Unreal Engine 4 in video format, please have a look at the video in the original post at the top.

To get your own EyeX dev kit, visit www…com/eyex

Hello,

Is the EyeX powerful enough to perform little amateur “neuromarketing” tests? I’d like to combine it with an Emotiv Epoc+ and record some brain+eye activity.

It is, but we have an entire business department for analysis. What do you want to use it for, more specifically? Only UX testing and analysis? The EyeX dev kit device is primarily for making games.

The license agreement can be found here: http://developer-files…com/wp-content/uploads/2014/06/EyeX-SDK-License-Agreement.pdf

If there is something outside the boundaries of the agreement you'd like to do, just contact me (.wallin[at].com) and I'll relay you to the correct person. We might be able to work something out, since Tobii is not an unreasonable company. :slight_smile:

Here are some things I would use eye tracking in games for.

  • Lens effects (DOF, motion blur, HDR…)
  • GUI (map navigation…)
  • AI
  • Triggering scripted events (this is a must-have. I hate having to press a button to focus on an in-game event/cutscene, and every designer will agree that way too much of their hard work isn't recognized because of bad timing on the player and engine side.)

Those are all very good examples.

If you want some more direct interactions, you could also have actions such as throwing a grenade in the direction you are looking, primary or secondary attacks, or picking up items you are looking at instead of pixel-hunting with the crosshair (items highlight when you look at them), etc.
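The "pick up what you're looking at" idea boils down to finding the item nearest the gaze point within some tolerance radius. A minimal sketch in plain C++ (names and the radius handling are my own illustration, not the plug-in API):

```cpp
#include <vector>
#include <cmath>

struct Item { double x, y; };  // screen-space position of a pickable item

// Return the index of the item nearest the gaze point, or -1 if none is
// within `radius` -- a simple replacement for pixel-hunting with a crosshair.
int PickItemAtGaze(const std::vector<Item>& items,
                   double gx, double gy, double radius)
{
    int best = -1;
    double bestDist = radius;  // anything farther than this is ignored
    for (int i = 0; i < (int)items.size(); ++i) {
        double d = std::hypot(items[i].x - gx, items[i].y - gy);
        if (d <= bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```

The returned index can drive both the highlight effect and the actual pickup action; a generous radius compensates for gaze-tracking accuracy limits.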

Bonus video:

That’s a very interesting game concept. Seems like a good foundation for greater intrigue in the future.

I’ve been watching the Sentry videos, that is exactly what I was talking about and what I’m looking for. Now my question would be, is there any difference between the Sentry and the EyeX or are both the same hardware?

I totally agree, it shows the foundation for giving AI some form of context to increase immersion. :slight_smile:

They are the same hardware, so anything developed with the EyeX devkit will work with the SteelSeries Sentry.

If you buy the Sentry, however, you will get the game analyzer software automatically. And if you buy a dev kit and want the game analyzer, you won't have to buy a new Sentry device, as there will be an upgrade option. :slight_smile:

Awesome, thank you.