Live Link in 4.23, Vive Controllers, Virtual Production


    Somehow I got the impression Live Link could be used with VIVE hand controllers and tracking pucks, but now that I have 4.23 it looks like it cannot.

    When you want to use VIVE (or other) hand controllers and pucks for motion capture or camera tracking, it would be really useful to be able to bring them in through Live Link.

    As I understand it, Live Link data carries timecode and can be buffered and interpolated to match the timecode of the rendering engine. This would be VERY helpful in virtual production, where an incoming video source has timecode and the graphics engine is genlocked and synced to it. If I understand correctly, it would automatically delay the Live Link data to match up with the video.

    Right now it's hard to use Vive trackers in a virtual set scenario because there is no easy way to sync them with the incoming video.

    Specifically, it would be nice to have the option to stamp all incoming VIVE tracking data with the current timecode (from the engine or a video timecode source), so that when a frame is finally rendered, the tracking data corresponding to that timecode is automatically used. That way the position of the tracker in the video, the camera, and any other objects the trackers control would stay in sync with the video.
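    The timecode-stamping idea could be sketched roughly like this (plain C++ rather than Blueprint, and every name here is made up for illustration; this is not an Unreal API):

    ```cpp
    #include <iterator>
    #include <map>

    // Hypothetical pose type standing in for a tracker transform.
    struct Pose { float X, Y, Z; };

    // Sketch of the proposal: stamp each incoming tracker sample with its
    // timecode (modeled here as a plain frame number), then at render time
    // look up the sample whose stamp matches the frame being composited
    // with the video.
    class TimecodedTrackerBuffer {
    public:
        void Push(int Timecode, const Pose& Sample) { Samples[Timecode] = Sample; }

        // Fetch the sample stamped at (or just before) the requested timecode.
        bool Lookup(int Timecode, Pose* Out) const {
            auto It = Samples.upper_bound(Timecode);  // first stamp > Timecode
            if (It == Samples.begin()) return false;  // no sample old enough yet
            *Out = std::prev(It)->second;
            return true;
        }

    private:
        std::map<int, Pose> Samples;
    };
    ```

    In a real implementation the buffer would also evict samples older than the longest expected video delay, and interpolate between the two neighboring stamps rather than snapping to the earlier one.
    
    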

    Right now I have to use a Blueprint that grabs the motion-controller data every frame, pushes it into a FIFO, and uses the data from several frames back. It works, but it's cumbersome.
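    For reference, the FIFO workaround amounts to something like this (a minimal C++ sketch of the idea, not the actual Blueprint; the type and class names are invented):

    ```cpp
    #include <cstddef>
    #include <deque>

    // Hypothetical pose type standing in for a motion-controller transform.
    struct Pose { float X, Y, Z; };

    // Minimal fixed-delay FIFO: every frame, push the newest controller pose
    // and read back the pose from DelayFrames ticks earlier, so the tracker
    // data lines up with delayed incoming video.
    class DelayedPose {
    public:
        explicit DelayedPose(std::size_t DelayFrames) : Delay(DelayFrames) {}

        Pose Tick(const Pose& Latest) {
            Fifo.push_back(Latest);
            if (Fifo.size() > Delay + 1) Fifo.pop_front();
            return Fifo.front();  // oldest retained entry = delayed pose
        }

    private:
        std::size_t Delay;
        std::deque<Pose> Fifo;
    };
    ```

    Until the FIFO fills (the first few frames), this returns the oldest sample available; after that, each Tick yields the pose from exactly DelayFrames frames ago.
    
    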

    Being able to treat hand controllers and tracking pucks just like data from a mo-cap system would be a big help for live Virtual Production and Mixed Reality filming. Not all of us have access to a high-end mo-cap stage; Vive and other tracking devices are a good alternative.


    P.S. I know the Mixed Reality plugin can delay tracking data, but this only seems to work if you use that plugin. It would also be great if the "tracker delay" function were always available, and if there were a way to sync rendering and tracker data to the frame start of a video source that has no timecode.

    Hope this makes sense.

    Greg
    Last edited by Greg.Corson; 09-06-2019, 10:32 PM.

    #2
    Originally posted by Greg.Corson View Post
    Somehow I got the impression Live Link could be used with VIVE hand controllers and tracking pucks, but now that I have 4.23 it looks like it cannot.
    Hi Greg,

    I agree with everything you said, but I think Epic gives more attention to big companies for commercial reasons, which is why the Vive tracker may disappear from the list before long. So please keep developing solutions like live tracking, and good luck! Abdel



      #3
      I'm sure Vive trackers will always be part of Unreal Engine, because all kinds of game developers use them in shipping games. How much support there will be for using them in Virtual Production remains to be seen; they can be used, but I'm still trying to figure out just how effective they are.
