Awesome Idea For New Feature!!!

So this just popped into my head, and I think it's a great idea! I would program it right now, but I'm caught up with my prototype, so if anyone in the community creates this, throw me some credit :slight_smile:

Okay, so you know how in the industry, when doing motion capture, studios give cameras a more realistic, organic, handheld feel by motion-capturing a rig held by a person and mapping its translation and rotation data onto the virtual camera? What if (and I'm looking at you, Epic :D) you could create a starting CameraAnim asset with maybe just translation information, and then layer on top of it rotation data recorded from an Oculus Rift, plus possibly additional positional data from the DK2? Basically, you would watch a preview of your animation playing while, on another track, you record the data coming from the Rift, piloting it as a camera and seeing the results in real-time. That would give you a makeshift camera rig to emulate motion capture on a small budget!!! :slight_smile:

This would probably be a great feature to add as a component of the new Sequencer feature being worked on, hint hint Epic…

So what do you guys think?

You could just do that in a separate program and import your camera animation to UE4.

What other programs are out there now that do this?

EDIT: Or plugins for existing programs that record data coming from a Rift and let you see the animation in real-time? Even if those exist, having to keep re-importing the animation would limit how well you could tweak it in the engine. You would also have to re-create the scene in the other program to know what the Rift would be aiming at and to see the preview in real-time. That seems unproductive compared to a direct solution that lets you tweak in real-time within your own scene in UE4. But I would be curious which plugins can already do this. Thanks for the response.

EDIT 2: Another thing: my suggestion is one that layers, or “adds”, the data onto an existing CameraAnim. Are you able to merge the data of two or more CameraAnims together?

I don’t think the Rift is capable of complete positional tracking, since its IR LEDs are only on the front and sides of the device, so it won’t work completely by itself in all cases. The Rift is just one example; another would be using something like the new STEM system as a mount for the virtual camera. Really, any motion-controlled device would be a good input driver for this use-case. I think it would require relatively little effort to create the internal hooks in the engine to make this work, and since Epic is already working on the Sequencer feature for cinematics, this would be a great opportunity to add value for developers who want to achieve this on a low budget, potentially supporting a range of input devices.

Also, since the DK2 hasn’t shipped yet, I doubt anyone has a positional/rotational plugin matching the use-case I described, but you never know. It would be great to be able to tweak these “input-driven tracks” as quickly as possible, and maybe add extra tracks that act like additive layers, combining the camera data from all tracks together. For example, in one section you might want a track that acts as a camera shake driven by the Rift: you do multiple takes, pick the best one, and it simply “adds” its positional and rotational data to the overall CameraAnim. Hope that makes sense, but definitely let me know about the other solutions you are thinking of. Take it easy!
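Just to make the “additive layer” idea concrete, here is a minimal standalone sketch in plain C++ (none of these names are real UE4 types or APIs; it's just an illustration under the assumption that every track is sampled at the same frame times): each recorded take is a list of keys, and an additive layer simply sums its translation and rotation deltas onto the base track, key by key.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical key type (not a UE4 type): translation plus rotation.
struct CameraKey {
    float X = 0, Y = 0, Z = 0;          // translation
    float Pitch = 0, Yaw = 0, Roll = 0; // rotation in degrees
};

// Sum any number of additive layers onto a base track, key by key.
// Summing Euler angles is only an approximation of rotation
// composition, but it's the usual trick for small camera-shake offsets.
std::vector<CameraKey> CombineTracks(
    const std::vector<CameraKey>& Base,
    const std::vector<std::vector<CameraKey>>& Layers)
{
    std::vector<CameraKey> Out = Base;
    for (const auto& Layer : Layers) {
        for (std::size_t i = 0; i < Out.size() && i < Layer.size(); ++i) {
            Out[i].X     += Layer[i].X;
            Out[i].Y     += Layer[i].Y;
            Out[i].Z     += Layer[i].Z;
            Out[i].Pitch += Layer[i].Pitch;
            Out[i].Yaw   += Layer[i].Yaw;
            Out[i].Roll  += Layer[i].Roll;
        }
    }
    return Out;
}
```

With something like this, a Rift-driven shake take is just one more layer summed onto the translation-only base CameraAnim, and picking the best take means swapping which layer you pass in.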

EDIT 3: One last cool idea: this feature could possibly be extended to capture input data from a game running in real-time, maybe as a “real-time input” data track for Actors. That way you could replicate the movements of a Pawn and reuse them in a cinematic. Maybe it could even be made accessible from Blueprints, to provide “replay” functionality to games that need it. For cinematics it would of course only work for game instances running within the editor, not standalone, but I think it would be really powerful and add a ton of productivity, and even prototyping value, to some animations.
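As a rough illustration of that “real-time input track” idea, here is another standalone sketch in plain C++ (again, hypothetical names, not engine code): transforms are recorded with timestamps each frame during play, and playback samples the track with linear interpolation, so a cinematic could replay the Pawn's movement at any frame rate.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical transform type (not a UE4 type).
struct PawnTransform {
    float X = 0, Y = 0, Z = 0, Yaw = 0;
};

// Records timestamped transforms during gameplay and plays them back
// with linear interpolation, like a simple "replay" track.
class ReplayTrack {
public:
    // Call once per frame while recording; times must be increasing.
    void Record(float Time, const PawnTransform& T) {
        Samples.push_back({Time, T});
    }

    // Sample the track at an arbitrary time, interpolating linearly
    // between the two nearest recorded frames; clamps at the ends.
    PawnTransform Sample(float Time) const {
        if (Samples.empty()) return {};
        if (Time <= Samples.front().first) return Samples.front().second;
        if (Time >= Samples.back().first) return Samples.back().second;
        for (std::size_t i = 1; i < Samples.size(); ++i) {
            if (Time <= Samples[i].first) {
                const auto& [T0, A] = Samples[i - 1];
                const auto& [T1, B] = Samples[i];
                float Alpha = (Time - T0) / (T1 - T0);
                return { A.X + Alpha * (B.X - A.X),
                         A.Y + Alpha * (B.Y - A.Y),
                         A.Z + Alpha * (B.Z - A.Z),
                         A.Yaw + Alpha * (B.Yaw - A.Yaw) };
            }
        }
        return Samples.back().second; // not reached
    }

private:
    std::vector<std::pair<float, PawnTransform>> Samples;
};
```

In the real engine, Record would presumably be driven from a tick during play-in-editor, and Sample would be called by a Sequencer-style track at evaluation time.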