Essentially yes for the code integration.
As for the hands, the Kinect does not currently retrieve bone information for fingers, so we cannot dynamically set their rotations. It is relatively trivial to have hand states trigger blended animations for the hands, but each use case is different enough that there isn't just one solution to provide: the model's setup of hand bones, and what exactly each hand state should look like, changes with every implementation.
In addition to that, the Poseable Mesh is more of a quick drag-and-drop solution. The main way we have done hand animations on our projects is to check the hand state and then selectively blend in a hand animation using blend nodes in the animation blueprint.
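For anyone wondering what that looks like on the code side, here is a rough sketch of the AnimInstance half of that setup. The enum and property names (EHandTrackingState, LeftHandState, LeftHandBlendAlpha) are placeholders rather than anything from the plugin API, since every project exposes the sensor data its own way; the idea is simply that game code writes the hand state into the AnimInstance, and the AnimGraph reads the resulting alpha into something like a Layered Blend Per Bone node rooted at the wrist so only the hand bones pick up the closed-fist pose.

```cpp
#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "HandBlendAnimInstance.generated.h"

// Hypothetical hand state enum -- replace with whatever your Kinect plugin
// or tracking layer actually provides.
UENUM(BlueprintType)
enum class EHandTrackingState : uint8
{
	Unknown,
	Open,
	Closed,
	Lasso
};

UCLASS()
class UHandBlendAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// Written from game code each frame; read by the AnimGraph.
	UPROPERTY(BlueprintReadWrite, Category = "Kinect|Hands")
	EHandTrackingState LeftHandState = EHandTrackingState::Unknown;

	// Alpha fed into the blend node that layers the "closed fist" pose
	// onto the hand bones only.
	UPROPERTY(BlueprintReadOnly, Category = "Kinect|Hands")
	float LeftHandBlendAlpha = 0.f;

	virtual void NativeUpdateAnimation(float DeltaSeconds) override
	{
		Super::NativeUpdateAnimation(DeltaSeconds);

		// Ease the blend weight toward 1 while the hand is closed and back
		// toward 0 otherwise, so the hand pose change doesn't pop.
		const float Target = (LeftHandState == EHandTrackingState::Closed) ? 1.f : 0.f;
		LeftHandBlendAlpha = FMath::FInterpTo(LeftHandBlendAlpha, Target, DeltaSeconds, 10.f);
	}
};
```

The same pattern extends to the other hand and to the Lasso state if you want a separate pointing pose; you would just add another alpha and another blend branch in the AnimGraph.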
Also, the problem with the links is now fixed; there was a mistake in our naming convention.