Decoupling eye data from Live Link in Unreal in real time

Hey guys, is there a way to decouple the eyes from Live Link on a MetaHuman? I'm using NVIDIA Audio2Face to stream facial animation in real time, but I'd like to animate the eyes with a BP instead, since the eye movement from A2F is very subpar.

Currently I have a BP set up for just the eyes (a rough sketch of its logic is below), but only one source works at a time: the face either takes the Audio2Face stream or the eye Blueprint's data, never both at once. Please help!
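For context, the eye logic itself is simple. Here's a rough C++ sketch of what my eye Blueprint computes (UEyeLookAnimInstance and LookAtTarget are placeholder names, not anything from the MetaHuman setup); the part I can't figure out is how to apply these values on top of the Live Link stream instead of replacing it:

```cpp
// EyeLookAnimInstance.h -- simplified C++ sketch of my eye Blueprint.
// The idea is that the AnimGraph reads EyeLookLeftRight / EyeLookUpDown
// and writes them onto the MetaHuman eye curves while Live Link (A2F)
// keeps driving the rest of the face.
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "EyeLookAnimInstance.generated.h"

UCLASS()
class UEyeLookAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Actor the eyes should track; set from gameplay code (placeholder).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Eyes")
    TWeakObjectPtr<AActor> LookAtTarget;

    // Normalized [-1, 1] look values consumed by the AnimGraph.
    UPROPERTY(BlueprintReadOnly, Category = "Eyes")
    float EyeLookLeftRight = 0.f;

    UPROPERTY(BlueprintReadOnly, Category = "Eyes")
    float EyeLookUpDown = 0.f;

protected:
    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        const AActor* Owner  = GetOwningActor();
        const AActor* Target = LookAtTarget.Get();
        if (!Owner || !Target)
        {
            return;
        }

        // Target direction in the character's local frame (a head-socket
        // transform would be more accurate than the actor transform).
        const FVector LocalDir = Owner->GetActorTransform()
            .InverseTransformPosition(Target->GetActorLocation())
            .GetSafeNormal();

        // Map left/right (local Y) and up/down (local Z) onto curve values.
        EyeLookLeftRight = FMath::Clamp(LocalDir.Y, -1.f, 1.f);
        EyeLookUpDown    = FMath::Clamp(LocalDir.Z, -1.f, 1.f);
    }
};
```

My current guess is that something like a Modify Curve node placed after the Live Link Pose node in the face AnimBP should let the eye values override just the eye curves, but I haven't gotten that to work, so any pointers would be appreciated.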