Virtual Camera/Unreal Remote for Android

I really think people should be able to get this option. What’s so special about Apple devices or mocap solutions? Every smartphone has a camera and an accelerometer with a gyroscope, so it should be possible. Epic team, please port the Unreal Remote app to Android.


I strongly second this (as I have mentioned in other posts). The iOS version of the Virtual Camera app is very likely using ARKit for tracking, so ARCore on Android should be able to provide the same functionality. I have a specific use case that would have benefited from an Android version: I’ve been promoting the use of UE4 at a college where I teach, and they have a batch of Samsung-donated Note 10+ devices that could have been used for my virtual cinematography class. So it would definitely be great if Epic ported this app to Android. And if the remote app was made with UE4, it would be great if at least the source were released so other developers could contribute to porting it.


And that’s just one example! It feels like discrimination against Android, to be honest.


I am interested in this too, no one in our team has an iPhone.

I was looking into the code of the Virtual Camera plugin, and from what I can see Epic developed it in a platform-agnostic way, so it should be relatively easy to get it working on Android. The positional data appears to be collected through the platform-neutral AR API and transmitted from the mobile device as standard OSC messages, while the video feed is sent back over a network connection. The Virtual Camera plugin code also seems designed to work both as the client (on the mobile device) and as the master (the Editor), so it might just be a matter of building it for the device. I tried, but unfortunately anything I build with 4.24 for AR crashes on my Samsung Note 10+, so I haven’t been able to verify this.
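To make the OSC transport concrete, here is a minimal sketch of packing a camera pose into a standard OSC message over UDP. This is a hypothetical illustration, not the plugin’s actual schema: the address pattern `/VirtualCamera/Transform`, the field order, and the destination port are all assumptions.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII, null-terminated, padded to a multiple of 4 bytes."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a single OSC message whose arguments are all float32."""
    type_tags = osc_string("," + "f" * len(floats))       # e.g. ",ffffff"
    args = struct.pack(">" + "f" * len(floats), *floats)  # big-endian float32s
    return osc_string(address) + type_tags + args

# Hypothetical pose: position (x, y, z) in cm and rotation (pitch, yaw, roll) in degrees
msg = osc_message("/VirtualCamera/Transform", 10.0, 0.0, 150.0, 0.0, 90.0, 0.0)

# Fire-and-forget UDP send; the address and port are placeholders for the editor machine.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 8000))
```

The point is just that the tracking side of the pipeline is plain, well-documented wire format; nothing about it is iOS-specific.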

Some guidance from Epic would be appreciated, particularly if they have no plans to do this Android port for us. The old bias that iOS devices are somehow superior has to be put aside: there are more Android users than iOS users, and the quality and performance of the devices is no longer behind Apple’s.

Many thanks! I hope you will get it working.

Just a quick update: after looking into the code, it basically comes down to getting the Remote Session plugin to work on Android. The code has a few comments from the developers saying that Android support is coming next, so hopefully that happens sooner rather than later. In the meantime, I managed to set up a Vive controller as a virtual camera tracker, using the Virtual Camera plugin and an Android phone connected via a remote desktop app (over the local network) to the desktop computer for monitoring and control. This setup works quite well, with the added benefit that the Vive’s tracking system is much more reliable and stable than ARKit/ARCore.

Okay, it’s 2020 and still not a word about Android support. And I don’t have an HTC Vive…

I don’t think there is anything for Android yet. Has anyone found a workaround to use an Android device as a tracker for Unreal Engine?

Have you found any other way to get the video feed and controls directly from Unreal Engine without using remote desktop?

No. I’m using an excellent app called Super Display that turns an Android device into a secondary monitor via USB or Wi-Fi. I saw direct mentions from Epic Games staff that an Android version of the Virtual Camera app was under development, but that was already a few months ago.

Yes, Super Display can be used to get the video on Android, since it extends or duplicates your desktop display onto the Android device. But the main missing piece is getting the camera-movement tracking data from the Android device’s sensors. Has anyone got that part working?
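For anyone wondering what “tracking data from the sensors” involves: raw gyroscope readings don’t give you a camera orientation directly; the angular velocity has to be integrated over time (which is part of what ARCore’s tracker does internally, alongside camera-based drift correction). A rough sketch of that integration step, with made-up sample rates and values:

```python
import math

def integrate_gyro(q, omega, dt):
    """One Euler step integrating body-frame angular velocity (rad/s)
    into a unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    gx, gy, gz = omega
    # Quaternion derivative: q_dot = 0.5 * q ⊗ (0, gx, gy, gz)
    dw = 0.5 * (-x * gx - y * gy - z * gz)
    dx = 0.5 * ( w * gx + y * gz - z * gy)
    dy = 0.5 * ( w * gy - x * gz + z * gx)
    dz = 0.5 * ( w * gz + x * gy - y * gx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    n = math.sqrt(w * w + x * x + y * y + z * z)  # renormalize to unit length
    return (w / n, x / n, y / n, z / n)

# Simulate 1 s of rotation at 90°/s around the yaw (z) axis, sampled at 100 Hz
q = (1.0, 0.0, 0.0, 0.0)
rate = math.radians(90.0)
for _ in range(100):
    q = integrate_gyro(q, (0.0, 0.0, rate), 0.01)

yaw = math.degrees(2.0 * math.atan2(q[3], q[0]))
print(round(yaw, 1))  # close to 90.0, up to small numerical integration error
```

This also hints at why gyro-only tracking drifts: every step accumulates a little error, which is exactly what ARKit/ARCore correct with the camera feed.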

Extra tip: SpaceDesk is a free alternative to the Super Display app.

I guess we are still waiting for the Android port of the VCam app. The iOS version has pretty good tracking, probably because some Apple devices have a LiDAR sensor (integrated into ARKit). I would expect an Android version (which would use ARCore) to be driftier and less stable. Getting the app going is not that hard; I did try to develop one some time ago, but I stopped when I found out that the Remote Session plugin (the one in charge of streaming the viewport) was only implemented for iOS (using iOS-specific APIs) and not for Android. I wonder if that has changed.

I kinda found a workaround, but it’s not free: the plugin costs $80.

I am also very perplexed as to why we don’t have this, or even Android face tracking, though perhaps that requires LiDAR. Even so, on the other hand, it’s only $450 for a second-hand iPhone 11 and $300 for an XS Max. I find it disturbing, as I do not like working in any Apple environment, having done so for years, so I really wish the devs at a company like Samsung would at least match them with an ARKit clone. As it is, I accept that this is the current state of affairs; I have literally been spending zilch for a month so I can put half of my next check toward a second-hand iPhone 11 for face tracking and VCam in Unreal. Sometimes you gotta use what works at the moment.

By the way, some features are available everywhere except on Apple devices, and others are available only on Apple devices. It’s as if there is some kind of contract between Epic and Apple, like the one that supposedly chains Microsoft to using batteries in their Xbox controllers.

I don’t think it’s an Apple/Epic thing; it’s that Android phone manufacturers haven’t put LiDAR and an ARKit-like system into their phones. Until a company makes an Android phone with LiDAR and ARKit-level tracking, we will only have the iPhone.

Even though Android doesn’t have full-fledged LiDAR built in, there is ARCore as the counterpart to ARKit, and I have seen Android AR applications work great plenty of times. Can’t that be used to achieve this to some extent?

It’s not just “ARKit” in general. ARKit is Apple’s system for face and body animation capture; for the face, it uses a standardized set of face points and phoneme blendshapes/morph targets. I don’t think ARCore has the same level of functionality at the moment, because I don’t believe its facial blendshape system is exposed to end users, and no programmer has made a Live Link plugin for Android’s ARCore.
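For context on what such a plugin would have to ship per frame: named blendshape curves with normalized weights, tagged with a subject name and timestamp. The sketch below is NOT Epic’s actual Live Link wire format; the packet layout, subject name, port, and JSON encoding are all invented for illustration only.

```python
import json
import socket
import time

# Hypothetical face-capture frame: ARKit-style blendshape names with 0..1 weights.
frame = {
    "subject": "AndroidFace",   # placeholder subject name
    "timestamp": time.time(),
    "curves": {
        "jawOpen": 0.42,
        "eyeBlinkLeft": 0.95,
        "eyeBlinkRight": 0.93,
        "mouthSmileLeft": 0.10,
    },
}

packet = json.dumps(frame).encode("utf-8")
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 54321))  # receiver address is a placeholder
```

The hard part isn’t the transport, it’s producing those weights in the first place: that is the solver ARKit provides and, as far as I know, ARCore’s Augmented Faces does not expose an equivalent set of coefficients.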