Sam Deiter and Wes Bunn walk you through setting up motion controllers in VR and using them to interact with the environment as they build the classic carnival game “can-toss”.
Additional topics covered:
- Merge Actor Tool
- Material Instancing
Tuesday, November 3rd @ 2:00PM-3:00PM ET
Sam Deiter - Sr Training Content Creator
Wes Bunn - Sr Training Content Creator
Feel free to ask any questions on the topic in the thread below, and remember, while we try to give attention to all inquiries, it’s not always possible to answer everyone’s questions as they come up. This is especially true for off-topic requests, as it’s rather likely that we don’t have the appropriate person around to answer. Thanks for understanding!
Archive is up!
Are there any games that have implemented a 3D space mouse on a 2D curved screen? I want to drag UMG widgets around a character with the mouse (so the surface is the inside of a sphere), but I’m having trouble doing this in UE4.
Really looking forward to the implementation of motion controllers!
In my game I add child actor components that are attached to each motion controller on construct; these are my base hands in the game. I want to be able to switch to other blueprints that have their own meshes and fire their specific projectile blueprint based on which of the Hydra’s buttons was pressed. What’s the best way to hot-swap child actors and their meshes but still have general trigger actions work across the board?
The reason the HMD button is red at 10:10 is not that the HMD is off, but that the compositor is not fullscreen. This happens when a window from the first screen overlaps the HMD screen. This issue is also described here:
Maybe this info could be forwarded to Sam and Wes. This is currently a quite common issue, and a lot of devs struggle with it.
Have you made the project files available?
Thanks for these videos. I have been trying to get this working for days with no luck. The problem I seem to be having is that the controllers are not generating hit events. For testing, I made a cube with physics that plays a sound when hit. When I hit the cube with the controllers, the physics works (the cube gets batted around), but the sound does not play. Only if hit by the player capsule component will the cube register a hit and play the sound. I have tried everything I can think of, but nothing I do with the controllers will generate a hit. What am I doing wrong? Thanks for your help.
This only works on a single instance of an object. For example, pick up a ball and throw it in the air. Then hit the trigger again before the ball hits the ground and, like magic, that same ball that was just in the air is back in your hand. How would you make it so that you can pick up any object that is simulating physics and attach it to the arrow component?
I’m trying to follow this tutorial and I’m stuck at 14:50 where you set up the “follow hmd orientation” node. I can’t seem to find this node in 4.12. Was it replaced?
First of all thanks so much for making this tutorial series.
Unfortunately I’m finding myself getting stuck in some areas. I don’t know whether it’s because I’m using Unreal 4.12 or somehow I’m missing a step somewhere.
I’m working on setting up the CanGame_BP at 26:29 of the 1st video.
I cannot get the following variables to work right:
Here’s what I have:
The ScoreBoardWidget should be accessing the ScoreBoardWidget BP we created earlier, correct?
and BallsRemaining and CansRemaining integers?
I’m unable to get them to connect as you have with Target BallsRemaining
Wow, very nice.
When can I have it?
I don’t know if this helps but I was also looking for this same issue and came across this:
I’m still trying to add movement using the HTC Vive trackpad but struggling with it. I’m using Unreal Engine 4.27.