Free Myo Alpha Development Kit

Are there any developers here who have the Myo from Thalmic Labs? https://www.thalmic.com/en/myo/
They have invited me to receive a free Myo Alpha Development Kit, so I was wondering if anyone else has been invited and what you plan to do with it. I also have the Oculus Rift dev kit.

I haven’t been invited to the alpha, nor am I planning to get one (no monies T_T, I would if I had the extra money), but it is an interesting idea. One drawback is that it really only does gestures, though. From what I’ve seen and heard (via videos and comments/replies to comments), it doesn’t really track position, and the finger tracking only distinguishes open hand and fist. It’s cool technology, don’t get me wrong, but I personally don’t think it has the fidelity needed for an immersive device alongside the Rift.

I am interested in how it turns out, though. Do you plan to upload videos of working with it? If so, leave a link to your YouTube channel and I’ll subscribe to see how it works for you. (I’m largely curious because I’m working on a VR glove myself and want to see how I compare. Currently I can do everything the Myo does (wrist orientation, open/close finger states), plus I also track the upper arm and the elbow’s open/close angle, and I track each finger independently with an analog open percentage. If you want to see more of my glove, go here: https://www.youtube.com/ I’m hoping to get it Kickstarted soon, so if you’re interested, keep an eye out. :D)

Yep Caliber, I think you are right, there is not much you can do with the Myo yet… but I got it for free, so I’ll play with it and see what I can do. I will set up a YouTube channel for it later.

I’m interested in your glove project; it looks great already, and I’ll be the first to back you on Kickstarter :slight_smile:

Myos are pretty accurate, and they definitely have potential. I’m working on a UE4 integration as we speak; for my project (Skycall) it makes perfect sense not to hold anything…

Cool! Do you have a YouTube channel? I would love to see some examples once you get it working with UE4.

Search for my username on YouTube and it should pop up.

Just wanted to add that I’ve recently completed a plugin for the Myo. So if you are a UE subscriber and in the Myo alpha program, send an e-mail to dev relations to get access to the code.

Great… thanks getnamo

I’m a UE subscriber and also have access to the alpha program, but I can’t find any UE plugin :frowning:

Send an e-mail to the Thalmic devs; they’ll give you access.

I couldn’t find your YouTube channel, getnamo; could you link us?
I’m purchasing a Myo device to use with a game I’m developing in UE4. Can you elaborate on how your plugin works? Does your plugin allow UE4’s Blueprints to recognize hand signs, by any chance? The game I’m developing uses hand signs to cast spells.
I’m going to PM you, if that’s cool.

STEMs still make more sense for my current project, but these look like they have a lot of potential for applications where empty hands are more logical. I’ll keep looking into them.

I noticed this thread reply quite late; just wanted to give a heads-up that since the device went beta, the Myo plugin is now publicly available:
Unreal Thread
Github

In terms of performance, the Myo reports 9-axis IMU data (accelerometer, magnetometer, gyroscope) at low latency, plus currently about six reliable EMG hand poses. You could definitely close your fist, make a motion, and emit a hadoken :slight_smile:
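To illustrate the idea, here is a minimal sketch of combining an EMG pose with IMU motion to trigger a gesture. None of these names come from the Myo SDK or the plugin; the pose label and the pre-filtered forward acceleration are assumptions for the example.

```python
def detect_hadoken(pose, accel_forward, threshold=2.0):
    """Return True when the user holds a fist and thrusts it forward.

    pose          -- current EMG pose label, e.g. "fist" or "rest"
    accel_forward -- forward acceleration in g (gravity already removed)
    threshold     -- how hard the thrust must be to count as a throw
    """
    return pose == "fist" and accel_forward > threshold

# A resting fist does nothing; a fist thrust forward fires the spell.
print(detect_hadoken("fist", 0.1))   # False
print(detect_hadoken("fist", 2.5))   # True
print(detect_hadoken("rest", 2.5))   # False
```

In a real game you would also debounce the trigger (e.g. require the pose to be held for a few frames) so EMG misclassifications don't fire spells accidentally.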

There are strengths and weaknesses to both. The Myo is a simple wireless device that allows non-joystick interaction, and it feels quite natural in use. The STEM will have more accurate position data thanks to its underlying technology (direct position tracking through magnetic fields), but I am testing constrained acceleration integration to see whether hand placement can be predicted more accurately on the Myo. That would allow IMU-class controllers to be usable as 1:1 hand controllers.
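As a rough sketch of what "constrained acceleration integration" could mean: naively double-integrating accelerometer data drifts within seconds, but the body itself is a constraint, since the hand can never be farther from the shoulder than the arm is long. The function below is an illustration under that assumption, not the poster's actual method.

```python
import math

def integrate_step(pos, vel, accel, dt, anchor, max_radius):
    """One dead-reckoning step with a simple reach constraint.

    pos, vel   -- current position/velocity estimates as [x, y, z]
    accel      -- gravity-compensated acceleration sample [x, y, z]
    dt         -- timestep in seconds
    anchor     -- a fixed reference point, e.g. the shoulder
    max_radius -- maximum reach (arm length) from the anchor
    """
    # Naive double integration of the acceleration sample.
    vel = [v + a * dt for v, a in zip(vel, accel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    # Project back onto the reachable sphere if drift pushed us outside.
    d = math.sqrt(sum((p - c) ** 2 for p, c in zip(pos, anchor)))
    if d > max_radius:
        scale = max_radius / d
        pos = [c + (p - c) * scale for p, c in zip(pos, anchor)]
        vel = [0.0, 0.0, 0.0]  # discard drift velocity at the boundary
    return pos, vel

# A huge spurious acceleration gets clamped to the 0.6 m reach sphere.
pos, vel = integrate_step([0, 0, 0], [0, 0, 0], [100, 0, 0],
                          0.1, [0, 0, 0], 0.6)
print(pos)  # [0.6, 0.0, 0.0]
```

Real implementations would add more constraints (zero-velocity updates when the EMG says the arm is at rest, joint-angle limits), but the principle is the same: use what the skeleton cannot do to bound what the integrator says it did.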

Nothing says you can’t use both, either: a Myo on your wrist would give accurate forearm angles. Fusing the two systems would then let you map your arm’s position fully, without relying on IK joint-based elbow prediction. Having that sort of hardware fusion happen automatically is one of the goals of the VRMotionInput plugin, which I hope to release someday.
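The fusion idea above can be sketched in a few lines: if a positional tracker gives you the hand and the Myo's IMU gives you the forearm direction, the elbow just falls out of walking back along that direction, no IK solver needed. The function and the default forearm length are illustrative assumptions, not part of any plugin.

```python
def elbow_from_fusion(hand_pos, forearm_dir, forearm_len=0.27):
    """Recover the elbow from tracked hand position + Myo forearm direction.

    hand_pos    -- (x, y, z) hand position from a positional tracker (m)
    forearm_dir -- unit vector pointing from elbow toward wrist (from IMU)
    forearm_len -- forearm length in metres (0.27 is a rough adult value)
    """
    return tuple(h - d * forearm_len for h, d in zip(hand_pos, forearm_dir))

# Hand at (1, 1, 1) with the forearm pointing along +x puts the elbow
# 0.27 m behind the hand on the x axis.
print(elbow_from_fusion((1.0, 1.0, 1.0), (1.0, 0.0, 0.0)))
```

By contrast, IK elbow prediction has to guess the elbow from the hand and shoulder alone, which is ambiguous (the elbow can swing around the shoulder-hand axis); the measured forearm direction removes that ambiguity.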