Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it’s not always possible to answer everyone’s questions as they come up. This is especially true for off-topic requests, as it’s rather likely that we don’t have the appropriate person around to answer. Thanks for understanding!
It would be great if Epic could create a pawn designed for the Vive HMD + controllers, with everything set up correctly so it’s just plug & play for VR developers. Right now you need to do some wonky stuff to get everything looking as it should (controllers rotated correctly, eye height, world scale, etc.).
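A sketch of the kinds of fixes such a plug & play pawn would bundle (all names and the rotation value below are assumptions for illustration, not Epic's API). UE4's default WorldToMeters is 100, i.e. 1 unreal unit = 1 cm, and Vive positions arrive floor-relative in meters:

```python
def tracked_to_unreal(pos_m, world_to_meters=100.0):
    """Convert a floor-relative tracked position in meters to unreal units."""
    return tuple(c * world_to_meters for c in pos_m)

def fix_controller_pitch(pitch_deg, offset_deg=-90.0):
    """Re-aim a controller mesh that imports pointing the wrong way.
    The -90 degree offset here is an illustrative assumption."""
    return pitch_deg + offset_deg
```

A built-in pawn would apply these corrections once so every project didn't have to rediscover them.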
Couch Knights demo: I am trying to make an addition to the demo, an add-on so to speak.
My aim is to add first- and third-person camera sockets to the fighting Couch Knights.
I realise the demo is for the Oculus Rift and TrackIR, and no, I do not have the spare cash to buy the future of gaming.
My question is: how do I disconnect the camera that spawns on Play and attach it to the little knights, or add two key commands that switch the viewer’s camera to the fighting knights in first-person and third-person view, please?
By adding these camera view options, Oculus Rift and TrackIR players can get a greater experience, and it would give us poorer gamers a way to enjoy the Couch Knights demo more.
Below is the post I made yesterday asking this about Couch Knights. I have been googling and searching for a way to achieve this; it would be a small achievement, but for me it is a huge step forward in learning.
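One way to sketch the requested toggle (illustrative only): in UE4 the actual switch would be done in the PlayerController, e.g. key bindings calling APlayerController::SetViewTargetWithBlend to move the view to a camera attached to the knight's first- or third-person socket. The cycle logic itself is just:

```python
# Illustrative view-mode cycling for the Couch Knights question above.
# The mode names are made up; the UE4 side would swap the view target
# (SetViewTargetWithBlend) whenever the mode changes.
VIEW_MODES = ["spectator", "first_person", "third_person"]

def next_view_mode(current):
    """Return the next view mode, wrapping back to the spectator camera."""
    i = VIEW_MODES.index(current)
    return VIEW_MODES[(i + 1) % len(VIEW_MODES)]
```

Binding one key to cycle (or two keys to jump straight to a mode) covers both variants the question asks about.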
It’s been asked before, but I’m also wondering about PS Move support. Is this only going to be for PS3/PS4 games? Will you need a PlayStation to test out the Move? Or will you be able to make PC games with the Move? (I’m hoping to integrate it with my Oculus Rift game before they release their motion controllers.)
As far as I know, and as I’ve heard from Morpheus devs, there will never be official PSMove support except for people with a Morpheus dev kit & appropriately signed NDAs, and yes you will need a PS4 to act as the Move.me (or whatever they’re calling it now) server.
Note that the project and plugin are both under very active development and are likely to change. As of right now, it mostly implements the IMotionController interface, so it should be pretty easy to switch back and forth between this and, e.g., a Vive.
There are a few things you should note.
The plugin currently assumes you are using a PSEye with the PS3EYEDriver as this is the only driver that works with MSVC 64-bit, required by UE4 Editor. PS3EYEDriver is GPL’d, so you can’t publish closed-source games using this driver. We will eventually get around this by turning the psmoveapi into a server/client model, where the server will be GPL’d and the client - the only part that needs to go in your project - will have a much more permissive license. In the meantime, you can build for 32-bit and use the non-free 32-bit drivers from CodeLaboratories. Your users would be required to purchase these $2 drivers too.
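A sketch of what the client/server split described above could look like on the wire: the GPL’d server would own the camera/controller I/O and stream poses, while the permissively licensed client only decodes a small message format. The 7-float layout here is an assumption for illustration, not the real psmoveapi protocol:

```python
import struct

# Hypothetical pose message: position (x, y, z) plus orientation
# quaternion (x, y, z, w), packed little-endian.
POSE_FMT = "<7f"

def pack_pose(pos, quat):
    """Server side: serialize one pose sample."""
    return struct.pack(POSE_FMT, *pos, *quat)

def unpack_pose(data):
    """Client side: decode a pose sample back into (pos, quat)."""
    vals = struct.unpack(POSE_FMT, data)
    return vals[:3], vals[3:]
```

Keeping the client this thin is what would let it carry a permissive license while the GPL’d tracking code stays in the server process.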
Getting the PSMove paired to the computer can be quite tricky. It seems to depend on the bluetooth dongle and possibly what other USB devices are attached during pairing. The good news is that once it is paired, it connects very reliably in the future. If it doesn’t, then you either have a defective controller (e.g., bad battery) or an incompatible bluetooth dongle. We don’t have a list of ‘good’ bluetooth dongles, but the ASUS BT-400 has worked well for me and the other main plugin dev.
It’s pretty sensitive to lighting conditions. Sunlight or fluorescent light sources (& reflections) in the FOV of the tracking camera will cause lots of problems for you.
Every time you move the PSEye camera or the DK2 camera, you will have to redo a coregistration step. It only takes about 20 seconds, and the result stays good as long as the cameras don’t move relative to each other.
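A simplified sketch of what the coregistration step solves for: the fixed transform between the PSEye frame and the DK2 camera frame. The real step also recovers rotation; here we assume aligned axes and just average the per-axis translation over a few positions seen by both cameras:

```python
def coregister_translation(pseye_points, dk2_points):
    """Average per-axis offset mapping PSEye coordinates into DK2
    coordinates, given the same physical points seen by both cameras."""
    n = len(pseye_points)
    return tuple(
        sum(d[a] - p[a] for p, d in zip(pseye_points, dk2_points)) / n
        for a in range(3)
    )
```

This also shows why moving either camera invalidates the result: the stored offset only describes one relative placement of the two cameras.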
You definitely have to recalibrate the PSMove magnetometer (and other sensors) every time you change rooms, and it’s probably a good idea to do it once per day.
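Why changing rooms breaks the calibration (a sketch, not the plugin's actual routine): magnetometer calibration mostly corrects hard-iron bias, which depends on the local magnetic environment. A common simplified estimate is the per-axis midpoint of the min/max readings collected while the controller is rotated through all orientations:

```python
def hard_iron_offset(samples):
    """samples: iterable of (x, y, z) magnetometer readings taken while
    rotating the controller. Returns the per-axis bias estimate."""
    samples = list(samples)
    return tuple(
        (min(s[a] for s in samples) + max(s[a] for s in samples)) / 2.0
        for a in range(3)
    )
```

Move to a room with different nearby metal or electronics and the bias shifts, so the stored offset no longer centers the readings.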
When using a Vive, you are prevented from running into walls by a virtual representation of your play area’s boundaries (Chaperone system). Does UE handle the rendering of these walls or is it the responsibility of the game? If it’s done by UE, can it be disabled or replaced with a custom style?
Do you know if the Chaperone system allows for a virtual “inset”, i.e. can you set up the Lighthouse boxes all the way in the corners, but then define a smaller play area to account for a TV set or furniture?
Where is your universal motion controller implementation going in the mid term? Can you share features that will be coming next?
(I am especially interested in an auto device configuration switch, so you can set up all the different motion controller bindings in your project, and then UE4 will know what’s connected and use them accordingly.)
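A sketch of the requested auto-configuration switch (all names here are invented; in UE4 this would sit on top of the IMotionController interface): register a binding set per device type, then resolve against whatever is actually connected at startup.

```python
class BindingRegistry:
    """Hypothetical per-device binding store for auto device switching."""

    def __init__(self):
        self._bindings = {}

    def register(self, device_name, bindings):
        """Store a binding set for one device type."""
        self._bindings[device_name] = bindings

    def resolve(self, connected_devices):
        """Return (device, bindings) for the first connected device that
        has a binding set, or None if nothing matches."""
        for dev in connected_devices:
            if dev in self._bindings:
                return dev, self._bindings[dev]
        return None
```

The point is that the project ships bindings for every supported controller, and the engine picks the right set at runtime instead of the developer hard-coding one device.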
Just watched the stream. Thanks for covering a lot on the Hydra plugin; it’s always great to see other people’s approaches to using the plugin.
Since this stream, I’ve pushed an experimental commit which makes the Hydra plugin fully support the 4.9 Motion Controller Component system; grab it here. Simply drag and drop the Binaries & Plugins folders from the zip into your project root and restart your project. After that, follow the instructions Sam gave on the Twitch stream for Steam controllers (not the Hydra-specific stuff), the same as in the documentation on Motion Controllers. No hardware-specific instructions! (NB: Start = Face Button 5, joystick click = Face Button 6, but Hydra-specific IM events will still emit.)
Regarding the Hydra base offset for VR, see our discussion in the Hydra thread for details on how to build a calibration system for 1:1 hand motion in VR.
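One way such a calibration can work (a sketch, not necessarily the thread's exact method): ask the player to hold the controller at a known pose relative to the HMD (e.g. touching the headset), sample the raw tracked position, and store the difference as a base offset applied to every later sample:

```python
def calibrate_offset(raw_at_known_pose, known_pose):
    """Offset that maps the raw sample onto the known calibration pose."""
    return tuple(k - r for r, k in zip(raw_at_known_pose, known_pose))

def apply_offset(raw, offset):
    """Apply the stored base offset to a raw controller sample."""
    return tuple(r + o for r, o in zip(raw, offset))
```

Once the offset is stored, raw Hydra positions line up with the player's real hands, which is what makes 1:1 motion feel right.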
This is something I’m passionate about as well, and I hope Epic is thinking about how to expand this in the future: e.g. expanding the motion controller enums to cover a whole-body state and adding an accuracy value to each location, so games could specify only their motion/coverage requirements and any hardware that, in sum, reaches that coverage would work with the game. Additionally, different input types could auto-calibrate each other; e.g. a Leap Motion could calibrate the Hydra so you wouldn’t have to specify your offset. It’s something I will continue to push for, and something I wish to implement down the road if Epic’s input system doesn’t expand that way.
I’m trying to set up the Vive controllers using the First Person Template, but when I add them to the character, they show up in the scene but inherit head rotations, slide all over, appear to be mirrored, and are nowhere near where you would want them to be. I can get it working fine in a default scene. Any ideas how to add them correctly to the First Person Template and have them behave correctly? Thanks!
I recently purchased a Hydra motion controller and was following along up until about the 15-minute mark, where Sam was talking about implementing the Hydra interface. He says to click Add and select it from the drop-down, only it doesn’t show up for me. If you watch closely in the video, you’ll see that it doesn’t show up there either!
So what exactly do I have to do to get it happening?