I’m researching face mocap pipelines with Apple ARKit. Basically, I denoise and multiply the ARKit blendshape values in post. However, when I record mocap like this in Sequencer, all of my post-processing cleanup gets baked into the animation.
This becomes a problem if I want to re-tweak the same performance with different denoise/multiply values.
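For context, the kind of cleanup I mean looks roughly like this (a minimal Python sketch; the moving-average denoise, the gain value, and the CSV column names are my own assumptions for illustration, not the exact Live Link Face export layout):

```python
import csv
import io

def smooth_and_scale(values, window=5, gain=1.2):
    """Denoise a blendshape curve with a moving average, then multiply
    by a gain and clamp back into the valid [0, 1] range."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        avg = sum(values[lo:hi]) / (hi - lo)
        out.append(min(1.0, max(0.0, avg * gain)))
    return out

# Hypothetical miniature of a per-frame CSV: a timecode column followed
# by one column per ARKit blendshape curve.
raw = """Timecode,JawOpen,EyeBlinkLeft
00:00:00:00,0.10,0.00
00:00:00:01,0.50,0.02
00:00:00:02,0.12,0.01
"""

rows = list(csv.DictReader(io.StringIO(raw)))
jaw = [float(r["JawOpen"]) for r in rows]
print(smooth_and_scale(jaw, window=3, gain=1.5))
```

Doing this offline on the raw curves (rather than live) is exactly why I want the raw data back in the engine.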
Since the Live Link Face app already records the raw data onto the phone as a CSV file, I’m wondering how to stream/load this raw data back into UE4.
I’m using the MetaHuman sample project.