Hi! I’m trying to use a pointer on a widget actor I’ve placed in my level. The pointer works on the wrist-attached menu, and when that menu is activated, the menu widget actor in the level also responds to the pointer, but I can’t make the pointer appear when the XR hand points at my in-level menu widget. How would I set that up? I’ve been trying to use an “On Hovered” event on the button, but as far as I know I can’t attach the Niagara system to the XR hands from the menu Blueprint, nor can I use an event like “On Hovered” referencing the in-level menu widget from the mannequin hand XR Blueprint, which is where I can attach a beam to the hands, but there the beam is always on. Any help would be much appreciated!
The VR template is, hands down, one of the coolest things I could ask for as someone starting in VR! Except for the hands down… because, literally, the VR hands are pointing down when I point the controllers forward - but only after enabling the Meta XR plugin.
I can run the VR Template on the PC via Link / Air Link with the Quest 2, and on the Quest 2 as a standalone app. In both cases the hands are rotated down about 60° from where they should be.
In the screenshot above (from the Quest 2), you can see the Quest controllers (and hands) and also the VR template hands, at the same time, pointing at the same orientation.
This happens on Unreal 5.1.1, Windows 11, Meta XR plugin v. 1.82.0.
Is there a simple / easy solution for this, without compromising portability? Or should I use the UE fork from Oculus?
Thanks!
Meta uses its legacy poses when the Meta XR plugin is enabled, to maintain upgrade compatibility. Switch the Motion Source on MotionControllerLeft and MotionControllerRight from Left/Right to LeftGrip/RightGrip and the hand models will be aligned as intended.
Are there differences between the 5.1 and 5.2 templates? I’m planning on starting a VR project and don’t know if it would be better to wait for 5.2, since it should be out soon.
Very minor changes, but 5.2 will be out so soon at this point that I’d wait
LOL, indeed
How can I build a UE plugin with the VR Template?
When I launch UE 5.2, I can create a new “Blank” project (or “FPS”, “3rd Person”, etc.) that is “C++” based (instead of “BLUEPRINT”), but there is no such choice when creating a project based on the VR Template.
I created a project from VR Template, then created a “Source” folder inside to be able to generate VS project files and build the plugin from the source (by right-clicking the .uproject file).
But when I build the solution, it doesn’t build the plugin. It seems it isn’t included in the VS solution file, and there is no Intermediate\ProjectFiles\<project name>.vcxproj either (the solution includes only “Engine”, “UE5” and “Visualizers”, whereas a C++ project also includes “Games” and “<project name>”).
But if I create a “Blank” “C++” project I can build the plugin without issues…
Thanks, as always, Victor!
I want to detect when the user “shakes” the controller. Is there anything in the OpenXR template, the Enhanced Input system, or anywhere else in UE to help me with that (such as access to IMU data, or detection of controller speed/velocity), or do I need to build my own solution (which would be less precise and have more lag), e.g., by tracking the controller position (via Get World Location) or by adding a collision box as a child of each motion controller and using Get Physics Linear Velocity?
Edit: I’ve found Set XRTimed Input Action Delegate and Get Controller Transform for Time in the Unreal Engine documentation, and the latter seems to provide linear velocity and acceleration (among other data), but from reading the documentation I have no idea how to use it…
Hi! I’m currently using UE 5.1.1 and I’m working in a VR Project.
My aim is to use hand tracking with the Meta Quest 2 to interact with actors in the scene.
The interaction would be to activate a sequence (animation) via an overlap event between the VR hands and the other actors’ colliders.
My problem seems to be related to the Oculus Hand components, which I add to the Motion Controllers along with a Sphere Collision component.
But after packaging the project, the hands don’t appear in-game, and when I reopen UE they have also disappeared from the VRPawn.
Am I mistaken in any basic matter? Is it fixed in 5.2? Is there any tutorial I could use?
I appreciate your help. Thx.
Sorry to ask again about this, but you suggested that UI/text render quality in VR might be addressed, either through updates to the engine or through an example/template project of best practices (I’m thinking of best practices along the lines of what Meta did with their “2D panels demo” app, but UE-oriented and with the project Blueprints available to see).
And/or updates to the engine along the lines of my referenced post from March ’22 (the post I’m answering here).
I understand priorities change over time, but I allow myself to push a little in this direction because, with the recent change in Unity’s pricing scheme, I assume you’ll soon have a flood of ex-Unity VR devs that might help justify prioritizing it. One can always dream, at least.
Since updating to the latest 5.3 version, the VR template crashes every time I try to render a debug box/sphere.
I submitted a bug report a few minutes ago.
Can someone please test this to confirm whether it’s just my setup?
Steps:
- Within the latest launcher version (5.3.1), create a new VR Template project [the issue doesn’t happen on the third-person template]
- Create a new actor blueprint. Drag off the BeginPlay node and add a DrawDebug(Sphere/Box) node. I used GetActorLocation for the center.
- Add new BP to the level.
- Press Play.
I then get an editor crash with the pop-up:
“GPU Crashed or D3D Device Removed.
Use -d3ddebug to enable the D3D debug device.
Check log for GPU state information.”
This wasn’t happening before the update (I don’t think) and it doesn’t happen with blank VR projects in 5.2.1
It’s currently not on our roadmap to implement natively, but there are solutions that you can implement yourself as you’ve discovered in the Forum thread you linked.
Hey @Kavan, I’m not able to reproduce this myself. Not sure what might be different with your setup…
Oh… Ok. In any case, if an engine implementation is not on the roadmap, just having a VR example showing recommended best practices for UI text rendering in VR with current native UE would be great.
Anyway, thanks for the info update on the subject.
Is anyone having trouble getting the VR template working on an Android device using UE 5.3.2? I’ve been working on this for weeks and I can’t get the VR Template to include OpenXR in the build like I can in UE 5.1.1. It builds fine, it just won’t run on the HMD. The AndroidManifest.xml doesn’t show the OpenXR inclusion like it does in UE 5.1.1.
Could really use an assist or point in the right direction. I HAVE to use UE5.3.2.
Thanks in advance!
Hello.
Any changes to the VR template in 5.4? Thanks.
I’m getting artifacts when creating a new VR project from the template in 5.4.
They are marked inside the yellow “circle” in the attached image. The artifacts are not stable, kind of vibrating. And when in VR they are only visible through the left eye (and also in the PC preview window).
It does not happen if I create a new VR project in 5.3.
Thanks
UPDATE. I’ve found a bug report about it. I leave the link here
This issue has been fixed in 5.4.3 which was just released
Hi, I am fairly new to Unreal and I am developing my first VR project with this engine (I have experience with Unity though).
I am using the VR template in UE 5.3.2 and I am finding it very badly optimized. The VRTemplate level doesn’t pass 72 FPS in the editor. I have an MSI Creator 15 laptop with an i7-10875H CPU @ 2.30GHz, 32GB RAM and an RTX 3070 GPU, so the hardware shouldn’t be the problem. If I play the level but don’t put my headset on, I get around 125 FPS, but as soon as I put it on, it drops to 65-75 FPS maximum.
I did some debugging and found a couple of things:
- the Game time (in Stat Unit) is normally 13-16 ms, but every 2-4 seconds it goes to 26 ms for a second or two and then back to 13… why!?
- using Trace, I found that the CPU is sometimes waiting for the GPU (i.e., a WaitFrame event in the Insights window) at the beginning of some frames, and I have no idea why this could be… any ideas?
I ran a test where I spawned 10 more Pistols in the VRTemplate level, and the FPS drops to a steady 30-40. I’m quite surprised by this, as I thought Unreal would handle this amount of meshes/vertices better. Should they be simpler to achieve better performance?
I have considered switching plugins from OpenXR to Meta XR, but I’m not sure whether that would improve anything.