Creating Mixed Reality Videos

I am looking to create an effect similar to this video:

The effect is achieved by attaching a real camera and a virtual camera to a motion controller. If done correctly, you can easily overlay both videos with any normal video editing application.

Currently my problem is getting a recording from the virtual camera with someone playing the game in VR. I have tried adding a local player, but any kind of local multiplayer with VR seems to not work with the current version of Unreal (there is a post about it here). I also tried recording with Matinee, but it doesn’t seem like the game is playable when Matinee is capturing.

What I am planning to do now is to create a replay (console command: “DemoRec <replay name>”) while recording someone playing the game in VR with the real-world camera. Later I will watch the replay with the spectator pawn attached to the motion controller and capture the screen with a third-party application (AMD’s thingy).
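For reference, the same replay commands can also be driven from C++ via `UGameInstance`, which may make the workflow less error-prone than typing console commands mid-session. This is only a sketch under the assumption of a custom game instance subclass — `UMyGameInstance` and the replay name are made up:

```cpp
// Sketch only: wrapping the replay console commands in a custom
// UGameInstance subclass (UMyGameInstance is a hypothetical name).
#include "Engine/GameInstance.h"

void UMyGameInstance::BeginVRSessionRecording()
{
    // Equivalent of the "DemoRec VRSession" console command.
    StartRecordingReplay(TEXT("VRSession"), TEXT("VR Session"));
}

void UMyGameInstance::EndVRSessionRecording()
{
    StopRecordingReplay();
}

void UMyGameInstance::WatchVRSessionReplay()
{
    // Equivalent of "DemoPlay VRSession"; the spectator pawn takes over here.
    PlayReplay(TEXT("VRSession"));
}
```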

Do you have any suggestions on what I could or should do differently?
Or do you have a different workflow for creating mixed reality videos?

Edit 1: Currently I cannot get the Replay System working. I am getting the following console output:

Cmd: demorec <test>
LogNet: GetLocalNetworkVersion: GEngineNetVersion: 2872498, ProjectName: replaytest, ProjectVersion:, InternalProtocolVersion: 9, LocalNetworkVersion: 1044399586
LogDemo:Warning: UDemoNetConnection::ReplayStreamingReady: Failed.
LogDemo: StopDemo: Demo <test> stopped at frame 0
LogNet: UNetConnection::Close: Name: DemoNetConnection_0, Driver: DemoNetDriver DemoNetDriver_0, PC: NULL, Owner: NULL, Channels: 0, RemoteAddr: UDemoNetConnection, Time: 2016.03.21-18.46.32
LogDemo:Warning: Demo recording failed: Couldn't open demo file <test> for writing

When I try recording a second time, I get the following output:

Cmd: demorec <name>
LogNet: CreateNamedNetDriver DemoNetDriver already exists as DemoNetDriver_0
LogDemo:Warning: RecordReplay: failed to create demo net driver!

Do you know what’s wrong?

Edit 2: The problem was that I only tried recording replays in Play in Editor (PIE) mode. It works when playing in Standalone.
However, I probably cannot work on this for about 1 or 2 weeks, so do not expect an update before then, even if I get it working earlier.

I really want this feature to be added to Unreal.

As far as the Unity mixed reality plugin goes, I believe they render objects behind the Vive and in front of the Vive separately so they can composite the player in between the foreground and background. Is there a way to add this into Unreal?

I haven’t thought about that before. In my opinion this should be possible with the replay approach. If I could just get replays to work. :frowning:

I don’t know much about replays in Unreal at all, since I’ve never grabbed a recording from UE4 with anything except OBS. Is there any documentation on it?

It would be great if you could somehow “tag” objects as background or foreground when setting up a replay so they record separately.

My approach would be to attach a Box to the aforementioned custom SpectatorPawn for the replay. The box extent would scale every tick according to the distance to the headset. On Box Begin Overlap -> set the component invisible, and vice versa on End Overlap. That might just work well enough. Then you would replay and capture the replay twice: once with the box enabled and once without it.
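A rough sketch of that idea in UE4 C++, purely as an illustration — the class, the `CullBox` component, and `HeadsetLocation` are all hypothetical names, and the extent math would need tuning:

```cpp
// Sketch only: a box parented to a custom spectator pawn hides any
// primitive component it overlaps, and restores it on end overlap.
#include "Components/BoxComponent.h"
#include "GameFramework/SpectatorPawn.h"

void AReplaySpectatorPawn::SetupCullBox()
{
    CullBox->OnComponentBeginOverlap.AddDynamic(this, &AReplaySpectatorPawn::OnCullBoxBeginOverlap);
    CullBox->OnComponentEndOverlap.AddDynamic(this, &AReplaySpectatorPawn::OnCullBoxEndOverlap);
}

void AReplaySpectatorPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    // Scale the box each tick according to the distance to the headset.
    const float Distance = FVector::Dist(GetActorLocation(), HeadsetLocation);
    CullBox->SetBoxExtent(FVector(Distance * 0.5f, 100.f, 100.f));
}

void AReplaySpectatorPawn::OnCullBoxBeginOverlap(UPrimitiveComponent* OverlappedComp,
    AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
    bool bFromSweep, const FHitResult& SweepResult)
{
    OtherComp->SetVisibility(false);
}

void AReplaySpectatorPawn::OnCullBoxEndOverlap(UPrimitiveComponent* OverlappedComp,
    AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
{
    OtherComp->SetVisibility(true);
}
```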

I’m also interested in this, I posted about it as well.
I tried the multiplayer approach with no luck, I will give the replay option a go.

How would you replicate the same movements with the controller parented to the spectator camera? You’d have to capture everything in the same recording session.

I found this thread that might be worth investigating

as well as Tom Looman’s custom depth buffer tutorial

Inside the SpectatorClass I would get a reference to the Motion Controller and Set the SpectatorPosition to that of the Motion Controller every frame.
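A minimal sketch of that per-frame logic, assuming the spectator pawn has already resolved a reference to the replayed motion controller component (`TrackedController` is a hypothetical member, not an engine name):

```cpp
// Sketch only: each frame, snap the spectator to the replayed
// Motion Controller's transform.
void AReplaySpectatorPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    if (TrackedController)
    {
        SetActorLocationAndRotation(TrackedController->GetComponentLocation(),
                                    TrackedController->GetComponentRotation());
    }
}
```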

OK, I misunderstood; I thought you were going to get your footage (real & virtual) in two separate takes.

Then I probably wasn’t clear enough, because I do want to get the real and virtual footage in separate takes:

I want to have someone play the game in VR. I want to simultaneously record the person with a real camera and record a replay. After the game is finished, I stop recording with the real camera, disable the HMD (so the game is now showing normally on the monitor) and start playing the replay. Then I record the replay playing on my monitor with OBS, Fraps, Shadowplay, whatever.

I hope I am being clear enough now. :slight_smile: English isn’t my first language.

Just a thought: in the player camera manager, what if you parented the camera component to a Motion Controller, then literally attached the controller to a real physical camera (in reality, maybe with rubber bands) to give it the positional data? I’m going to try this in a few minutes and see how it works out.

EDIT: Perhaps this would require the use of a 3rd controller or some other sort of externally tracked motion controller device.

Haha, your English is fine; I thought it was your first language :slight_smile:

Do you think once you get a replay going that you can turn off the render on objects?

Thanks :slight_smile:

I haven’t gotten replays to work at all, so I don’t really know. My assumption is that it is just like spectating a networked game live, which should mean that it is possible.

Update: attaching the FirstPersonCamera component to a motion controller component in the Default Pawn Class works exactly like I would hope. I have the camera parented to the left controller, which could be mounted to a real-life camera. It is, however, absolutely nauseating and would need some workaround so that the user sees from the headset’s position.
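For anyone trying to reproduce this in C++ rather than Blueprint, the setup might look roughly like the following sketch in the pawn's constructor. The component names follow the VR template conventions but are assumptions, and the attachment call differs between engine versions (`SetupAttachment` from 4.12 on, `AttachTo` earlier):

```cpp
// Sketch only: parenting the first-person camera to the left motion
// controller instead of the HMD root (AMixedRealityPawn is hypothetical).
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"

AMixedRealityPawn::AMixedRealityPawn()
{
    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(RootComponent);
    LeftController->Hand = EControllerHand::Left;

    FirstPersonCamera = CreateDefaultSubobject<UCameraComponent>(TEXT("FirstPersonCamera"));
    // The camera now follows the controller, which can be mounted
    // to a real-life camera.
    FirstPersonCamera->SetupAttachment(LeftController);
}
```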

Yes, exactly. That’s why I am proposing to record a replay, while playing in VR and then play the replay to capture the game from the motion controlled camera, but I cannot get replays to work for me. For anyone who has no idea about replays: the console commands are “DemoRec <replay name>” and “DemoPlay <replay name>”.

Apparently Fantastic Contraption has the mixed reality plugin as a feature, so Twitch streamers can show their viewers what is going on in the game. How To Mixed Reality | NORTHWAY Games

This is HUGE and necessary for future VR games because it helps a lot with second-hand marketing by Twitch streamers.
@SBiegun_PDG In theory that camera setup would not be viewed through a headset but would just record the scene, so we could then composite the VR user in between the foreground and background objects.