+ Perception Neuron = Dark Souls(ish) Test Demo = Full Body VR ( Downloadable project! )

Hi all,

I’ve been testing the GearVR lately to better understand how mobile development works, and I decided to share something pretty cool and useful for development.

In short I added full body control in VR to the GearVR by using a mocap suit ( Perception Neuron ), so that the user can freely walk around and interact with the environment.

As of now the scene I developed is very simple. All the assets are from Infinity Blade, and the characters are from Enemy Knights and can be downloaded from Gumroad ( please support them! ). In the near future I’m planning to add lots of features, which can be used both with the mocap suit and without ( by using the touchpad as a controller, and so on ).

In order to optimize the data streaming, here you can find the settings from Axis Neuron ( version used is )

Output Format
Be sure to set the Perception Neuron Manager in UE4 to port 7002.
Character Name
The APK you’ll find uses Char00 as the character name in Axis, so if you’re using a character with a different name the streaming won’t work.
Character Orientation
As soon as you start the APK on your phone, the character’s orientation will probably be wrong, so use the Yaw slider to adjust it.
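For reference, the receiving side of those settings can be sketched roughly like this in Python. Only the 7002 port comes from the settings above; the line-based text frame format and all function names are assumptions for illustration, not the actual Axis Neuron protocol:

```python
import math


def parse_frame(line):
    """Split one ASCII frame line from the stream into channel floats.

    Assumes a text output format: one frame per line, values
    separated by spaces (an assumption, check Axis Neuron's docs).
    """
    return [float(v) for v in line.split()]


def apply_yaw_offset(x, y, yaw_deg, offset_deg):
    """Rotate a root position and yaw by the Yaw-slider offset (degrees),
    i.e. a rotation about the vertical axis."""
    r = math.radians(offset_deg)
    nx = x * math.cos(r) - y * math.sin(r)
    ny = x * math.sin(r) + y * math.cos(r)
    return nx, ny, (yaw_deg + offset_deg) % 360.0


def read_stream(host="127.0.0.1", port=7002):
    """Connect to the data port (7002, matching the Perception Neuron
    Manager setting) and yield parsed frames as they arrive."""
    import socket
    with socket.create_connection((host, port)) as s:
        buf = b""
        while True:
            buf += s.recv(4096)
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                if line.strip():
                    yield parse_frame(line.decode())
```

The yaw offset is the same correction the Yaw slider applies: it re-orients the incoming root data without touching the suit calibration.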

The project ( and the already built APK ) can be found here

GearVR Mocap Demo

Here is a video showing the real-time movement and the view inside the VR environment


Full Body Tracking in VR - Lightroom by Koola

Dark Souls Inspired Scene Optimized

If you found this useful and want to support the development you can make a donation here:

Development Support!

Best Regards

Nicolas Esposito

Looks really promising!

Pretty cool, but in your video you walked across the room while in VR your avatar stayed in place. I was hoping to see actual room-scale VR with positional tracking, since I assume Perception Neuron would allow that. Is it achievable at all, or have you just not gotten there yet?

Cool stuff. Hope you manage to get it all working.

really cool! :slight_smile:

I think it would be easier to do the player tracking off of the HMD.

Perception Neuron tracks relative motion between each neuron, it has no idea where you are in roomspace (if that makes sense).

The lack of translation on the player ( well, the minimal translation ) is due to the setup I did. I later realized that the character was slightly moving forward, but I had already uploaded the video :smiley:

During this weekend I’ll move the whole setup into a larger environment and add some variation to the original scene, and I’ll also be testing another scene with some physics objects to interact with.

The tracking itself is done by using the relative position given by the legs’ orientation, which is then sent to the hips, and that’s where the position in space is set.
The delay you see between the real movement and the VR movement is surprisingly not that noticeable when you walk around, so this would be a good enough solution for games not based on aiming gameplay ( survival horror or walking simulators ).
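As a rough illustration of that idea ( hypothetical names and threshold, not the actual AnimBP setup ), the suit’s relative hip displacement can be integrated frame by frame into a world-space root position:

```python
MAX_STEP = 0.5  # metres per frame; rejects tracking jumps (assumed value)


class RootTracker:
    """Integrate per-frame hip displacement from the suit into a
    world-space root position. The suit only reports motion relative
    to itself, so world position has to be accumulated over time."""

    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self._last = None

    def update(self, hip_x, hip_y):
        if self._last is not None:
            dx = hip_x - self._last[0]
            dy = hip_y - self._last[1]
            step = (dx * dx + dy * dy) ** 0.5
            if step <= MAX_STEP:  # ignore implausible single-frame jumps
                self.x += dx
                self.y += dy
        self._last = (hip_x, hip_y)
        return self.x, self.y
```

Because the position is integrated rather than measured absolutely, drift accumulates over time, which matches the earlier observation that the suit has no idea where you are in room space.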

Tracking the HMD is something that can be done, but considering the price of doing that, I decided to go with the mocap suit integration instead, since you get full body control :wink:
I just wanted to share a different setup and also demonstrate that a workaround to the current Vive/Oculus setup can be achieved easily :wink:


I’ve tested the entire system using the S7 as a WiFi hotspot, so basically I’m streaming the mocap data from the Neuron Hub to the laptop over 4G, and holy cow, I’m blown away by the results!!

The latency is the same as using the USB cable, and I’m quite confident that an FPS setup can be done!

On Sunday I’ll post another video with the updated Dark Souls(ish) scene, and another one

Are you taking the head rotation from the headset or Neuron?

Head orientation is from the GearVR; position is from the suit.
I’m using a Transform Bone to lock the orientation of the head joint, otherwise I’d have double rotation ( not so fun to try :smiley: )

I’m trying to do this with an Oculus Rift and I really need to use the Rift to control the head orientation and position, because it’s way more accurate. So I’m guessing I can just use the same method? I’ll try it when I get home, thanks.

You can use the same method, but try to drive the root position with the HMD inside the AnimBP.
Alternatively, lock the hips movement inside Axis and link the root position to the HMD.

I think it’ll look a bit awkward with this setup, because if you look down and move your head the entire body will move… anyway, try different setups to see what works best :wink:
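With the hips locked in Axis, linking the root to the HMD reduces to a one-time calibration plus a per-frame offset. A tiny sketch with illustrative names ( not the UE4 API ):

```python
def root_from_hmd(hmd_pos, start_pos):
    """Root translation = HMD displacement since calibration.

    start_pos is the HMD position captured once at startup, while the
    hips are locked in Axis so the suit adds no translation of its own.
    Positions are (x, y, z) tuples in metres.
    """
    return tuple(h - s for h, s in zip(hmd_pos, start_pos))
```

You may also want to zero the vertical component of the result, so that head bobbing or crouching moves only the head and not the whole actor; that is the kind of tuning the “try different setups” advice above is about.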

This looks incredible! How did you get your game working with the GearVR though?

I built a project using the First Person Template and got it working on my phone, but when I load it, it isn’t split-screen for the 3D effect, and as soon as I put my phone in the GearVR it just loads up Oculus Home and I’m then unable to open my app.

Be sure to follow the GearVR Quick Start guide; if that isn’t set up, the phone won’t recognize the project as a VR project.

Test the First/Third Person template, and then, if you can see it using the GearVR, you can go on with everything else :wink:

Original post updated, added two more videos

Lightroom by Koola

Dark Souls Test Scene