MetaHuman on iOS

Hi there fellows,

First, I apologize if:
- this question is redundant,
- this question is in the wrong section, or
- my question is too broad and the answer isn't that simple.

Last but not least, I am not a programmer and only have enthusiast-level knowledge from playing around with Blueprints, so I appreciate your patience.

Long story short, I want to package the MetaHuman project for an iPhone and have it track my face locally through that same iPhone, all within the project. Basically, I want to open the app, see a fixed camera pointed at the character, and have it mirror my facial expressions live, right there and then.

  1. How would I go about tracking the facial expressions? (On a PC I use Live Link Face, but in this case I'd need to include some code of my own to use the IR/TrueDepth camera within the app itself; see the rough sketch after this list.)
  2. Is the MetaHuman project too heavy for an iPhone 11 or 12? Is there any way to optimise it? If a MetaHuman is too much for a phone to handle, could I achieve this by just lowering the LODs or by making my own character?
  3. I am quite clueless about all of this, and even being pointed to some fundamental guides would help me tremendously in figuring this out. Your help and guidance is much appreciated!
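For question 1, here is roughly what I have pieced together so far. From what I've read, the Live Link Face app reads Apple's ARKit blendshape values from the TrueDepth camera and streams them over the network, so I imagine an on-device version would need to read those same values directly. The Swift sketch below only covers the raw tracking side, using Apple's public ARKit types (ARFaceTrackingConfiguration, ARFaceAnchor); the FaceTracker class is just a made-up name of mine, and how to actually feed these values into the MetaHuman rig from inside a packaged Unreal project is exactly the part I don't understand.

```swift
import ARKit

// Rough sketch only: read the per-frame blendshape values that the
// TrueDepth (IR) camera produces via ARKit. From what I understand, these
// are the same coefficients (jawOpen, eyeBlinkLeft, etc.) that the
// Live Link Face app streams to a PC.
// (The app's Info.plist would also need an NSCameraUsageDescription entry.)
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("No TrueDepth camera / face tracking on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Delegate callback: fires whenever the tracked anchors are updated.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // blendShapes is a dictionary of 0...1 coefficients keyed by
        // ARFaceAnchor.BlendShapeLocation (.jawOpen, .browInnerUp, ...).
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let blinkLeft = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0

        // This is where the values would somehow have to be pushed into the
        // MetaHuman face rig inside the packaged project -- the part I
        // don't know how to do.
        print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft)")
    }
}
```

If Unreal's own ARKit plugins already expose this locally on the device, that would obviously be better than writing native code, and it's part of what I'm hoping someone can confirm.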