I think he’ll share his work if somebody helps him.
During the “Showdown Released!” live stream last week, he was asked about Google Cardboard support. He answered, “There are plans in motion right now,” which sounds quite promising!
Glad to hear, thank you!
That would be interesting. The main things, besides getting it to work smoothly (with AA, no drift, distortion correction, and potentially chromatic aberration correction as a toggleable option) and with multithreaded rendering, would be several example projects (FPS and TPS) made by Epic featuring gaze controls, trigger controls, and surface-based UI.
It’s worth noting that his implementation is NOT based on the Google Cardboard SDK. (Source: Reddit)
If this is the case, I personally wouldn’t hold out for this.
I can only imagine his reasoning for this is so he can more easily support non-cardboard VR implementations.
There is a lot that the Cardboard SDK brings to the table. So either he is a genius low-level programmer or it will leave a lot to be desired.
There are SO many features to be desired in a VR implementation. It’s not a simple case of spawning two cameras, one per eye. For example, the latest video here: https://www.youtube.com/watch?v=W8bREFpp2o8 touches on instancing material draw calls between eye viewports for a huge performance increase.
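To make that concrete, here’s a minimal sketch of the naive two-camera approach using scene captures (the class and component names are mine, purely illustrative, not from any real plugin). It “works”, but every mesh is drawn twice with no shared culling or instancing, which is exactly the cost the instanced rendering trick in that video avoids:

```cpp
// NaiveStereoPawn.h -- hypothetical illustration only.
// Two scene captures, one per eye: simple, but the scene is traversed
// twice per frame, so draw calls and vertex work double.
#include "GameFramework/Pawn.h"
#include "Components/SceneCaptureComponent2D.h"
#include "NaiveStereoPawn.generated.h"

UCLASS()
class ANaiveStereoPawn : public APawn
{
    GENERATED_BODY()

public:
    ANaiveStereoPawn()
    {
        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

        // Half of a typical ~6.4 cm interpupillary distance, in Unreal units (cm).
        const float HalfIPD = 3.2f;

        LeftEye = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("LeftEye"));
        LeftEye->AttachTo(RootComponent);
        LeftEye->SetRelativeLocation(FVector(0.0f, -HalfIPD, 0.0f));

        RightEye = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("RightEye"));
        RightEye->AttachTo(RootComponent);
        RightEye->SetRelativeLocation(FVector(0.0f, HalfIPD, 0.0f));

        // Each eye still needs its own render target assigned, plus lens
        // distortion, time warp, low-latency tracking... none of which
        // this naive setup gives you.
    }

    UPROPERTY()
    USceneCaptureComponent2D* LeftEye;

    UPROPERTY()
    USceneCaptureComponent2D* RightEye;
};
```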
I think Epic really needs to take the lead on this one. At this point they are the only people with the skills to create a fully polished Cardboard implementation. So please Epic, IMO we need you on this!!
I’m going to be integrating and open sourcing just the HeadTransform part of the Google Cardboard SDK into UE. I wonder if anyone (Epic?) would be interested in this? If so, I’ll start doubling down on it.
It will most likely be a pull request, not a plugin, as it will require punching a hole through the AndroidJNI.cpp layer down into MainActivity.java. Neither of these files, to my knowledge, can be modified via a plugin.
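For the curious, the hole-punching would look roughly like this. The method name AndroidThunkJava_GetHeadQuat and its Java-side counterpart are hypothetical, just to show the shape of the JNI round trip via UE’s FJavaWrapper helpers (include paths may differ between engine versions):

```cpp
#if PLATFORM_ANDROID
#include "Android/AndroidApplication.h" // FAndroidApplication::GetJavaEnv()
#include "Android/AndroidJNI.h"         // FJavaWrapper helpers

// Assumes a Java method added to GameActivity (MainActivity.java), e.g.
//   public float[] AndroidThunkJava_GetHeadQuat()
// that returns the Cardboard HeadTransform orientation as {x, y, z, w}.
FQuat GetCardboardHeadOrientation()
{
    JNIEnv* Env = FAndroidApplication::GetJavaEnv();

    static jmethodID GetHeadQuatMethod = FJavaWrapper::FindMethod(
        Env, FJavaWrapper::GameActivityClassID,
        "AndroidThunkJava_GetHeadQuat", "()[F", false);

    jfloatArray JavaQuat = (jfloatArray)FJavaWrapper::CallObjectMethod(
        Env, FJavaWrapper::GameActivityThis, GetHeadQuatMethod);

    jfloat* Quat = Env->GetFloatArrayElements(JavaQuat, nullptr);
    const FQuat Result(Quat[0], Quat[1], Quat[2], Quat[3]);
    Env->ReleaseFloatArrayElements(JavaQuat, Quat, JNI_ABORT);
    Env->DeleteLocalRef(JavaQuat);

    return Result;
}
#endif
```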
Best,
Minxies
So with Unity 5.2, multithreaded rendering works again! Plus, Google added time warp to the latest SDK (not for Unity yet). I am just hoping UE4 will not drag behind Unity/Google on this.
Btw, there is also this sweet app: https://play.google.com/store/apps/details?id=org.hitlabnz.sensor_fusion_demo
Maybe it can be integrated into UE4 to provide spot-on tracking for Cardboard. Its source code license allows integrating it into proprietary engines.
Hey folks,
I have some news for you. There hasn’t been any news on my Cardboard implementation because I spent two nice weeks in California after SIGGRAPH. I tried to get some news about Cardboard support from Epic at SIGGRAPH, but unfortunately there was nothing specific from their side.
Therefore, I pushed the implementation of the Cardboard plugin forward last week (Android only, sorry guys, but an iPhone implementation should be possible in the same way). It is now running within an independent plugin that can be easily linked to any project. It is also updated for Unreal 4.9.
@Minxies: A pull request isn’t necessary. You can access the Activity within a plugin. I tried that within Java code and it took some time to figure out how, but it is working. However, I am using the NDK (also within the plugin) for performance reasons, which is the better way to access the Android sensors at this point.
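For anyone wondering what the NDK route looks like, here is a rough sketch of polling the gyroscope through android/sensor.h. Function names like InitGyroQueue and ReadGyro are mine, this is not the plugin’s actual code:

```cpp
#include <android/sensor.h>
#include <android/looper.h>

static ASensorEventQueue* GSensorQueue = nullptr;

// Set up an event queue on the calling thread's looper and enable the gyro.
void InitGyroQueue()
{
    ASensorManager* Manager = ASensorManager_getInstance();
    const ASensor* Gyro =
        ASensorManager_getDefaultSensor(Manager, ASENSOR_TYPE_GYROSCOPE);

    ALooper* Looper = ALooper_forThread();
    if (!Looper)
    {
        Looper = ALooper_prepare(ALOOPER_PREPARE_ALLOW_NON_CALLBACKS);
    }

    GSensorQueue = ASensorManager_createEventQueue(Manager, Looper,
                                                   /*ident*/ 1,
                                                   nullptr, nullptr);
    ASensorEventQueue_enableSensor(GSensorQueue, Gyro);
    // ~200 Hz updates; the rate is a hint, in microseconds.
    ASensorEventQueue_setEventRate(GSensorQueue, Gyro, 5000);
}

// Drain pending events; returns true with the latest angular rate (rad/s).
bool ReadGyro(float& OutX, float& OutY, float& OutZ)
{
    ASensorEvent Event;
    bool bGotEvent = false;
    while (ASensorEventQueue_getEvents(GSensorQueue, &Event, 1) > 0)
    {
        if (Event.type == ASENSOR_TYPE_GYROSCOPE)
        {
            OutX = Event.vector.x;
            OutY = Event.vector.y;
            OutZ = Event.vector.z;
            bGotEvent = true;
        }
    }
    return bGotEvent;
}
```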
Of course, stereo rendering is hungry for GPU performance. However, with a Galaxy S6 I got a 60 Hz update rate at 2.5K resolution; with a Nexus 5 (2013) I got 30 Hz at 1080p, and 60 Hz requires rendering at a reduced resolution.
You can download some demos from my blog and test it with the Google Cardboard on your devices.
More info on Unreal4Cardboard: Demo applications released | in magic we trust
-Michael
Nice, thanks virtuellerealitaet!
Nice work! Any timeframe on when you will release the plugin itself?
Very nice.
Yeah, as asked, when can we expect the plugin source?
I’m very eager to get some commits in.
At this point, I have experience with the Google Cardboard SDK and would be eager to contribute some extra features.
Regards,
James
I think the plugin still needs some improvement. That’s why I was asking for user feedback first. Currently I am looking into the resulting performance, which is a trade-off between visual quality and rendering speed (as always).
Once Epic provides a solution for code plugins, I plan to provide the plugin via the Marketplace.
OK, well I’ll be happy to pay $100, money up front, for early access.
I’m low maintenance, as I’m a developer by trade.
I’d also be interested in pushing back changes I make.
Happy to sign an NDA or whatever you need.
Feel free to PM me if interested.
Best,
Minxies
APK feedback:
Looks like your Kalman filter (or whatever system you’re using to fuse the sensors) is slightly off, and you get hiccups (jumping) every once in a while.
Also, on a tablet device (Nexus 9) the sizing is incorrect and stretched way out of proportion.
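My guess (and it’s only a guess, not a diagnosis of the plugin) is a hard-coded phone aspect ratio. Deriving each eye’s projection from the actual surface size would presumably fix it, something like:

```cpp
// Hypothetical fix sketch: build each eye's projection from the real
// surface dimensions instead of assuming a phone-shaped screen.
// In landscape, each eye gets half the width.
FMatrix MakeEyeProjection(float SurfaceWidth, float SurfaceHeight,
                          float HalfFovRadians)
{
    return FPerspectiveMatrix(
        HalfFovRadians,          // half field of view of the lens
        SurfaceWidth * 0.5f,     // one eye's viewport width
        SurfaceHeight,           // full viewport height
        GNearClippingPlane);     // engine-wide near clip plane
}
```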
Best,
Minxies
@Minxies: how is the performance / lag?
Lag is fine.
Performance needs tweaking. You would just need to try it for yourself.
However, I’ll try to do the best job I can explaining.
It is quite jerky at the moment. Other words I’d use to describe it are “juddery” and “has the odd hiccup”.
These are issues that can be fixed in software, though it will take a good understanding of Kalman filters and the math.
And of course, I’m very happy to help with this.
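To sketch the kind of smoothing I mean, here is a generic complementary filter, the simpler cousin of a Kalman filter (this is illustrative, not the plugin’s code): integrate the gyro for responsiveness, then blend in a sliver of the accelerometer’s gravity estimate to cancel drift.

```cpp
#include <math.h>

// Generic complementary filter: fuses gyro and accelerometer readings
// into drift-corrected pitch and roll.
struct FSimpleFusion
{
    float Pitch = 0.0f; // radians
    float Roll  = 0.0f; // radians

    void Update(float GyroX, float GyroY,           // angular rate, rad/s
                float AccX, float AccY, float AccZ, // accelerometer, m/s^2
                float Dt)                           // frame time, seconds
    {
        // 1) Dead-reckon with the gyro: smooth and low-latency, but drifts.
        Pitch += GyroX * Dt;
        Roll  += GyroY * Dt;

        // 2) Absolute (but noisy) tilt from the gravity vector.
        const float AccPitch = atan2f(AccY, AccZ);
        const float AccRoll  = atan2f(-AccX, sqrtf(AccY * AccY + AccZ * AccZ));

        // 3) Blend: mostly gyro for responsiveness, a little accel
        //    to pull the estimate back toward the true orientation.
        const float Alpha = 0.98f;
        Pitch = Alpha * Pitch + (1.0f - Alpha) * AccPitch;
        Roll  = Alpha * Roll  + (1.0f - Alpha) * AccRoll;
    }
};
```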
Minxies
Do you know if this https://play.google.com/store/apps/details?id=org.hitlabnz.sensor_fusion_demo would help with better tracking?
Pacha’s results look smooth and stable. However, I can’t tell whether his results are actually better without some ground-truth tracking data.
On which device do the jumps occur? I cannot reproduce the behavior with my Nexus 5. Are you testing the applications with the Cardboard? The orientational tracking only works with the Cardboard; otherwise the yaw angle may not be updated correctly. But even then it shouldn’t jump, only drift.
Michael
It more than likely could, but Michael’s argument is that it’s a Java implementation and so would be a bit slower.
I think the best option is to use the Google Cardboard SDK and either use the Java code there or port it to C++, leaning on Java only for the sensor values.
Regards,
Minxies
Well, it could perhaps be converted to C++. The whole idea is that the code is there, it works, and it has a compatible license.
If the Cardboard Android SDK is more stable/precise and the license allows it, we might as well integrate that instead. It already has time warp code in it.
I’m also interested in this. My friend and I want to develop something small for our robotics league.