Unreal 4.17 - ambisonic question

What would be the best option for spatially mixing music stems for a cutscene (linear, not interactive) in a VR experience produced in Unreal 4.17? The easiest way, as I see it, would be to mix in Pro Tools with linear spatial plugins, producing a FOA/HOA ambisonic file. I’ve searched this forum, and it seems that ambisonic encoding/decoding is not supported (natively or via spatialization plugins) until 4.19 is released.
I could implement the stems in Unreal with its spatialization tools, but I have a hunch that linear DAW tools would be better suited to this task.

What are you hoping to get from an Ambisonics mix-down that you would not get from literally just placing your stems in space?

You can just have all of your music stems be sound sources in 3D space.
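For example, something like this on the C++ side (a rough sketch, not production code: the function name, the stem/location arrays, and the attenuation asset are all placeholders you'd fill in from your own project):

```cpp
// Sketch: spawn each music stem as its own spatialized source at a fixed
// world position, and return the audio components so the stems can be
// started or stopped together later. All names here are placeholders.
#include "CoreMinimal.h"
#include "Kismet/GameplayStatics.h"
#include "Components/AudioComponent.h"
#include "Sound/SoundBase.h"
#include "Sound/SoundAttenuation.h"

TArray<UAudioComponent*> SpawnStemSources(
    UWorld* World,
    const TArray<USoundBase*>& Stems,       // one audio asset per stem
    const TArray<FVector>& StemLocations,   // where each stem sits around the listener
    USoundAttenuation* Attenuation)         // attenuation asset with spatialization enabled
{
    TArray<UAudioComponent*> Components;
    check(Stems.Num() == StemLocations.Num());

    for (int32 i = 0; i < Stems.Num(); ++i)
    {
        // SpawnSoundAtLocation starts playback immediately; bAutoDestroy=false
        // lets us keep the component around for later control.
        UAudioComponent* Comp = UGameplayStatics::SpawnSoundAtLocation(
            World,
            Stems[i],
            StemLocations[i],
            FRotator::ZeroRotator,
            1.f,            // volume
            1.f,            // pitch
            0.f,            // start time
            Attenuation,
            nullptr,        // concurrency
            /*bAutoDestroy=*/false);

        Components.Add(Comp);
    }
    return Components;
}
```

The attenuation settings asset with spatialization enabled is what makes each stem behave as a 3D source; how it ends up sounding (panned or binaural) depends on which spatialization plugin the project uses.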

Hi Dan. Good question. I might be wrong, but my feeling is that linear DAW ambisonic HRTF tools, especially the ambisonic convolution reverbs, are of better quality than tools like Steam Audio and Google Resonance. Simply put, they would sound better in this situation, where the goal is to adapt an existing stereo music track to an immersive environment. But given my options, it seems like your suggestion is the way to go. Is Unreal capable of sample-accurate sync of several sound files without any drift over time?
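To illustrate what I mean by keeping several sound files in sync, this is roughly what I'd try on the Unreal side (just a sketch; the component array is a placeholder):

```cpp
// Sketch: kick off every stem in the same game frame. This only guarantees
// the Play() calls land in the same frame; whether the audio engine aligns
// them sample-accurately, and keeps them aligned over a long cutscene,
// is exactly what I'm asking about.
#include "CoreMinimal.h"
#include "Components/AudioComponent.h"

void StartAllStems(const TArray<UAudioComponent*>& StemComponents)
{
    for (UAudioComponent* Stem : StemComponents) // components set up with auto-activate off
    {
        if (Stem && !Stem->IsPlaying())
        {
            Stem->Play(/*StartTime=*/0.f);
        }
    }
}
```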