I’ve managed to compile UE4.22 with the adjustments as described in the PDF, as well as the template.
When launching the template project on the Quest, however, it deploys the application to the device but the application closes instantly. In the bottom-right corner I get a “Launch complete” message instead of it remaining in the “Running on Quest…” state.
When trying to run the deployed application via the Library in the Quest dashboard, I get the following error: " keeps stopping". Installing a packaged version via ADB and trying to start the application results in the same " keeps stopping" error.
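(For anyone hitting the same thing: the actual crash reason can usually be pulled from logcat. A rough sketch, assuming the default UE4 Android log tags:)

```
adb logcat -c                              # clear the log first
adb logcat -s UE4 DEBUG AndroidRuntime     # launch the app, then watch for the crash dump
```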
I got the Epic VR template running on the Quest, so I am guessing it has something to do with the project's settings? Any idea what I might be missing? Or does it perhaps have something to do with the post-process materials not being supported, as mentioned in this post:
Got it running with Avatars on Quest and Rift connecting to each other. I had to set up both apps on the Oculus backend, add both App IDs to the config, and upload a build to the Quest release channel to get through the verification process that was crashing my Quest app. I also combined both apps with the new grouping method on the backend.
Now I can deploy from the editor and join from the Quest to the Rift and vice versa!
App crashes on Quest: on teleportation, on grabbing objects.
No VOIP working yet between those devices.
(I also disabled packaging all assets into the APK, so that my app generates an OBB for the bigger assets.)
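For reference, this is roughly what that looks like in DefaultEngine.ini. It's a sketch: the OBB flag is the standard Android packaging setting, but the Oculus App ID key names vary between OnlineSubsystemOculus versions, so verify them against your plugin before copying.

```
[OnlineSubsystem]
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
; single App ID shown here; for the Rift + Quest pairing both apps are registered on
; the backend, and newer plugin versions expose separate Rift/Mobile App ID keys
OculusAppId=<your app id>

[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
; don't package all game data into the .apk -- generate an .obb for the bigger assets
bPackageDataInsideApk=False
```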
I managed to recreate the issue on the Epic VR template by importing a complex shader from a Vive project. It would deploy and launch from the editor, but shut down instantly after that. After removing the imported assets it went back to normal.
The strange thing is that I haven't imported anything into the template yet. Since it only acts up on my end, it can't be due to the use of complex shaders. Later this week I'll try replacing the ini files with the original ones that came with the template. Maybe I messed something up in those.
Problem: I just built and opened the project, and it seems the motion controllers are only spawning the " Interactions" actor; the hands are not spawning. I have made no modifications to the project.
I should release it soon; the only thing holding me back is work, work, work!
I've been developing almost exclusively with the Quest / Rift S for the last 2 months. I've learned a lot of things, and I also had a lot of communication with the Oculus for Business people, so I'll share with you what I've learned.
As a preview, here's what you can and cannot do with the Quest as of now:
You can use the Expressive Avatars; they work well.
You can use ES2, ES3.1 and Vulkan; they all work well. ES3.1 and Vulkan are pretty much identical, with maybe some performance gain from Vulkan (see the config sketch after this list).
You cannot enable the mic on the Quest, as it clashes with the Avatars plugin, so as we speak the Expressive Avatars on the Quest are mute. Anyway, unless you're doing multiplayer, it won't matter for now.
You cannot use platform features (including multiplayer) unless you've gained access to the Quest API.
Oculus (as of now) is only giving access to the Quest API to accepted pitches.
Accepted pitches are only for the general public store. No keys-only apps.
There are other ways you can publish/sideload your app (e.g. WeAreVR, SideQuest, etc.), but again, without Oculus platform features.
You can use Photon for multiplayer, but I've never gone that way. As I have a good relationship with Oculus, I prefer to stick to their platform.
All business/education/keys-only apps will have to be managed through the Oculus Business Software Suite, sold as a $499 upgrade or $999 pre-installed on a 128 GB Quest. The Suite will be launched this fall, probably at OC6?
I spoke to many people at Oculus, and it seems it's not yet clear whether business/education/keys-only apps will use the platform features of the general public store or a separate “business” store. It doesn't seem to be decided for now.
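On the ES/Vulkan point above, these are the usual Android target settings; they live in DefaultEngine.ini and are also exposed under Project Settings > Platforms > Android. Treat it as a sketch for a 4.22-era project:

```
[/Script/AndroidRuntimeSettings.AndroidRuntimeSettings]
; choose which RHIs to build for; on Quest ES3.1 and Vulkan behave almost identically,
; with a possible performance edge for Vulkan
bBuildForES2=False
bBuildForES31=True
bSupportsVulkan=True
```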
So I'll put together a kind of mini-guide, including more info on the bugs and the small changes to make in VS. See you soon!
The template loads fine for me, but I have a problem with building lighting in the engine: “Lighting build failed. Swarm failed to kick off. Compile Lightmass.”
Each time you build a new version of UE4 from source, once it has finished you then have to right-click and build UnrealLightmass (under the “Programs” folder in VS).
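If you'd rather do it from the command line, the equivalent (run from the engine root) should be something along these lines:

```
Engine\Build\BatchFiles\Build.bat UnrealLightmass Win64 Development
```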
Thanks big time. Is voice + avatars doable by code, or is it something Oculus has to solve?
I started looking into enabling spatialized voice for the Rift platform at least; I will report back.
Interesting feedback on Vulkan; I thought we had to skip it because of the missing support for multiview (which is important!).
Do you have a very rough roadmap for us? No pressure!
@n3fox VOIP and Expressive Avatars used in combination work on tethered HMDs (Rift & Rift S); for Go & Quest you have to choose between VOIP and animated-mouth Expressive Avatars because of a conflict in mic management on Android (you can still have VOIP with closed-mouth Avatars).
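For the tethered-HMD case, turning on the engine's built-in VOIP is mostly config. A minimal sketch, assuming the stock UE4 voice settings (key names worth double-checking for your engine version):

```
; DefaultEngine.ini
[OnlineSubsystem]
bHasVoiceEnabled=true

; DefaultGame.ini
[/Script/Engine.GameSession]
bRequiresPushToTalk=false
```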