
VR Controllers and Peripherals in UE4 (VRPN Integration Plugin)

I am keen to collaborate on a free-to-use VRPN plugin.

I have already coded an InputDevice plugin integrating Wii controllers (and peripherals) into UE4, but can’t publicly release it due to license restrictions. If no one else is working on a VRPN plugin already, this could make a good starting point.

Just checking in to see if anyone else is already working on building something like this (and would like a collaborator), and whether there is much demand for a plugin like this.

Matthew Spencer,
Software Engineer,
New Zealand

I’d be keen to see a plugin like this released. I’ve been working on getting Wii peripherals working cross-platform using third-party software such as OSCulator and ControllerMate on OS X and GlovePIE on Windows. It’s pretty tough re-implementing controls on different OS versions though, so something more ‘native’ to Unreal Engine would be nice.

The only problem I see with this is that the majority of devices will be too different for games made with one in mind to be played with another. For example, even a Wiimote with MotionPlus can’t compare to a Razer Hydra, which will itself soon be surpassed by the STEM. That said, I still think this is a cool idea. Getnamo released a Hydra plugin a while ago that you could take a look at. I use my own custom non-plugin integration. Keep in mind that if the game module is dependent on it, then it technically isn’t a plugin, and this can cause issues.

I’m interested in this, too!

I don’t have much experience with VRPN, but I do have experience with TrackD which is conceptually similar.

BlackRang666’s comment is worth consideration. In my experience, the “traditional” device inputs are usually classified into a collection of analog or digital inputs. For instance, a joystick is two analog devices. A button is a digital (on/off) device. So, a gamepad would have 4-6 analog inputs, and 8-12 digital ones.

Trackers are usually classified differently. They’re really just a collection of analog inputs (x,y,z / h,p,r), but as there is an implicit relationship amongst the values, they are reported together.

The most flexible systems (non gaming) I’ve seen allow users to configure their devices through an abstraction layer that maps device inputs to logical functions. So, for instance, you code against the “fire button” and then define which input that button is.

Mike

Hi, I have almost no knowledge of programming, but I have a lot of interest in VR and the controller inputs we can use with it. As I see it, right now we have to go cheap with input devices: there are loads of interesting (and not so cheap) projects, and the flagship of current VR, OculusVR, has confirmed they are working on an undetermined input device with an undetermined release date, which makes it so difficult to choose one standard. Maybe what I’m about to say is nonsense, or maybe it’s obvious and has huge flaws I cannot see and that’s why it hasn’t been done, but if there are plenty of devices that send positional data (smartphone gyroscopes, Wiimotes, Kinects, etc.) and there is software capable of pulling that data, doesn’t it make sense to build a middleware plugin that reads this pulled data and sends it to the UE4 engine? This way I could choose to map my smartphone data to my hip position, my Wiimote data to my right hand, the Nunchuk to the left, etc. I know it must be harder to do than to say. :)
EDIT: Format


Hi, I am also interested in VRPN under Unreal Engine. Did anyone make any progress implementing this?

Hi,

I’m looking for someone with interest in developing some kind of plugin to add VRPN support for UE4.
I’m keen to collaborate with anyone. I have used and compiled some custom VRPN builds for Unity in my lab, but we are thinking of using Unreal as well.
However, I have little knowledge of UE4, and being the most junior coder in my lab, I could use some support from someone more experienced. If you have any idea of what is necessary and are interested, I could work on it under your guidance. If you are already developing some version of it, I’m also interested in collaborating.

Just wanted to add that I am working on a VR Body Input plugin, which aims to abstract various controllers behind body-type input (skeletal; full body including hands, fingers, and feet), which I believe is a common type of input people look for.

For now there are 3 separate device plugins available that I’ve bound: the Razer Hydra, Leap Motion, and the Thalmic Myo. These are component-based and event-driven and allow for easy integration of specific controllers, all within blueprints.

In the future, the Body Input plugin would abstract specific bindings away, and you could, for example, use any of these controllers, or a combination of all three, to track various parts of your body, resulting in an abstract skeleton that developers would forward to their game logic. The useful aspect of this is that if a new input becomes available, a plugin binding it to the Body Input plugin would allow for its use without any change in game logic. It would also allow users to mix and match controllers, with some priority-based interpolation based on simple profiles to determine which controller should be used when you have simultaneous overlapping input.

The end use case is for developers to simply use default profiles for each of these controllers, or to change profile settings to specify the desired button/axis mapping to their own actions (input mapping), while letting the Body Input system forward the currently tracked skeleton to them. This would be, in my opinion, the most flexible and convenient VR input system.

I’m planning to write this in an open source manner, and some thought is still needed on how best to approach the cross-plugin binding. Expect an early release in a few weeks’ time.

Hey! I appreciate your effort. Could you please explain, in non-programmer language, what your plugin will be able to do? I’m looking for a way to get skeletal data from a live session of a PhaseSpace mocap system into UE4 for animating an avatar in real time, in VR obviously. I’d like to skip MotionBuilder in the process, because it crashed about every 10 minutes, and for research purposes that’s just unacceptable. PhaseSpace claims to have VRPN support.

How should I picture it? Will the positional data of the markers be imported and then retargeted to a skeleton, or will it already be mapped onto a skeleton?

Just what I was looking for.

I’m currently attempting to develop such a plugin!

What I’m trying to achieve is a VRPN connection from a Button & Analog Server in OpenVibe http://openvibe.inria.fr/ to stream data into UE4 to trigger events then save the timestamped data into a text file.

How far have you managed to get with constructing the plugin? I’m more than happy to help.

Peter

Just to give an update.

I have written some standalone code that takes VRPN analog & button connections and prints their values. So basically a client that, no matter how many analog connections are made, prints out the output.

What I need to do next is turn this into a plugin that allows the channels (analog & button, and trackers once added) to be assigned to variables so they can be read via a UE4 Get node.

I also need to expose the bindings of the analog/button connections (e.g. connection parameters) as variables, so that within UE4 you can specify the name of the analog/button channel the connection is on.

For the moment this will be just a base implementation, so that from a UE4 blueprint you can set up a VRPN client and then get the values from the external server (in my case, OpenVibe).

Now I just need to figure out the best way to do this, and more importantly how to do it, since the documentation for creating plugins is minimal. I have the .h files, a vrpn.lib file, and my current client.cpp file; it’s just a case of getting started and translating it to be usable from a blueprint.

I’ll keep everyone posted. I need to get this up and running soon, so hopefully I will have something working quickly.

Peter

I have no experience with PhaseSpace, but if they have an SDK/lib then yes, it would be possible to bind it directly as a plugin and then bind that to the BodyInput plugin. Initially I envision getting the commonly used inputs bound and translated to the skeleton, but ideally, down the road, you want the ability to add input not already bound to the engine, via sockets or a profile system, say. I’m currently a bit delayed on this project, but as soon as I get time I’ll push a repo with an early bind to get feedback on the architecture. The more use cases we get, the more robust we can make the overall system.

Sounds pretty cool! If you need guidance on how to bind plugins with input, you can see any of my plugins (see sig) for examples with code. It should point you in the right direction.

Thanks, will do! I think I have managed to get my head round the concepts of building a plugin.

Good day!
I am looking for some help in this area. I have a gun controller prototype for VR, intended for a game in Unreal Engine 4, though it would be good if it worked well with other games as well.

The gun currently has rotational tracking (accelerometer, gyro, magnetometer), and there is also positional tracking through two points: one on the barrel, a globe like the PS Move device, and one on top (not a globe; I will use two cameras), but IR instead of RGB. Rotational tracking works well enough emulating a mouse and keyboard, through the use of open source Arduino Atmel chips. I have an AHRS implemented and get good, clean, calibrated data on the serial port, with plenty of buttons for doing other things besides firing. The positional tracking is done through a modded PS Eye that only sees IR now. I use OpenCV with a TUIO client broadcasting over UDP, which is registering less than 3 ms tracking latency for my one blob.

I would like to create a C# or C++ open source app that is cross-platform (I was thinking MIT License), with a separate Unreal plugin for licensees. I will also provide blueprints and a tutorial set for creating the hardware, again, all open source. The idea is to create a cheap, viable, fast tracking solution that will not only work as a cool motion controller but be a valid VR input device. Even though I have put the tracking unit in a gun, it doesn’t have to be a firearm at all. It is possible with very low-cost hardware to create a very fast tracker that can be used in a variety of situations. This project is part of another project I am working on: the controller is meant for a fan-made Stargate project, so mine looks like a P90, but it could be whatever you want.

Wow, lots to read. Now I see a plugin for Unreal 4 would get me developing my game. So how can I help?

I’m trying to build a VRPN plugin; at the moment I’m stuck on integrating vrpn.lib, which is still only available as a Win32 compiled version. (?)

I tried to compile VRPN as x64 from source with no success, and my searches turned up nothing either. Any further suggestions?

You can use cmake-gui to generate a .sln for VS2013 64-bit, then compile from VS2013 to get the required x64 library version.

Hi, I am interested in a VRPN plugin; is there any news?
On that front, I think there will be a beta of MiddleVR for UE4 soon…