Holy ****! The future of UE4 rendering?

Take a look at the UE4 demos of the Brigade renderer - a real-time path tracer - on the OTOY website.

Click on cloud demos at the top.

The Sun Temple one is interactive, but I can’t get the realistic room one to work yet.

Not really - that’s being rendered on a server and streamed to you. You need a service subscription for that, and it runs pretty poorly anyway.

Here’s a video of Octane Render (OTOY’s other ray-tracing renderer) running as a plugin within UE4. The performance obviously isn’t great and it looks like **** as soon as you move the camera - but it’s ray tracing, running in real time, in a mass-market game engine. Pretty incredible stuff.

I imagine it will continue to simply be a tech demo for some years, but I wouldn’t be surprised if in maybe 5 years or so this becomes at least partly viable for actual games of some kind, even if it’s restricted to the high-end PC space. This could be especially true if some sort of temporal AA solution can be used to reduce the noise without having to massively increase the sampling of the ray tracing. I think this tech might - might - be closer than some people think, but we’ll have to see.
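On the temporal AA idea - the core trick is just blending each new noisy frame into an accumulated history, so noise averages out over time instead of needing more rays per frame. Here’s a minimal single-pixel sketch (toy numbers and a made-up blend factor, not any engine’s actual implementation):

```python
import random

def temporally_accumulate(frames, alpha=0.1):
    """Exponential moving average over successive noisy frames.
    Each frame is a list of pixel values; lower alpha means the
    history dominates and noise is smoothed more aggressively."""
    history = list(frames[0])
    for frame in frames[1:]:
        history = [(1 - alpha) * h + alpha * f for h, f in zip(history, frame)]
    return history

random.seed(42)
# 200 one-sample "frames" of a single pixel whose converged value is 0.5,
# simulated here as heavy Monte Carlo noise (uniform in [0, 1])
frames = [[random.uniform(0, 1)] for _ in range(200)]
smoothed = temporally_accumulate(frames)[0]
print(abs(smoothed - 0.5))  # far smaller error than any single noisy frame
```

The catch, of course, is that the history is stale as soon as the camera moves - which is exactly the smearing/noise you see in that video.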

It’s not Octane Render, it’s Brigade - also, this is an Unreal plugin/build, not an OTOY native scene. I think it’s only being streamed so we don’t have to (or can’t) run the build ourselves. In their latest presentation they’re talking about doing this off the cloud, using your own GPU for GPGPU compute.

OK, well to be honest it’s not as impressive as it looks at first. The rendering power behind those demos from OTOY must be enormous - probably dozens of high-end GPUs, maybe even more.
I searched a bit and didn’t find a statement about exactly what was used. But I found that this video is just a bunny in an outdoor scenario, which means the room it’s in is just an environment image. Two Titans were used, and you can still see a lot of noise.
Regarding the 5k instances: a path tracer doesn’t really care how many polys are in a scene, since it’s ray tracing - it’s just a matter of how much memory your scene eats. As those characters are instances, they use almost no extra memory. I remember back in 2000, when all those modern renderers came to life, people would render gazillions of instanced polys.
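To put rough numbers on the instancing point - an instance only stores a transform plus a reference to the shared mesh, so 5,000 copies cost almost nothing compared to duplicating geometry. A toy sketch with made-up sizes:

```python
class Mesh:
    """Shared geometry, stored once. Assume ~36 bytes per triangle
    (3 vertices x 3 floats) - a made-up but plausible figure."""
    def __init__(self, triangle_count):
        self.triangles = triangle_count
        self.bytes = triangle_count * 36

class Instance:
    """An instance holds only a reference to the mesh plus a
    4x4 float transform (16 floats = 64 bytes)."""
    def __init__(self, mesh, transform_bytes=64):
        self.mesh = mesh
        self.bytes = transform_bytes

hero = Mesh(100_000)                        # ~3.6 MB of geometry, stored once
crowd = [Instance(hero) for _ in range(5000)]

unique_copy_cost = hero.bytes * 5000                       # duplicated geometry
instanced_cost = hero.bytes + sum(i.bytes for i in crowd)  # shared geometry
print(unique_copy_cost // 1_000_000, "MB vs", instanced_cost // 1_000_000, "MB")
```

With these assumed sizes that’s roughly 18,000 MB of duplicated geometry versus a few MB instanced - which is why those 2000-era renderers could throw gazillions of instanced polys around.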
For an indoor scene you usually need far more samples to get it clean. Effects like glossy reflections will also take much longer to render with a path tracer.
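The “way more samples” problem comes straight from Monte Carlo convergence: error falls as 1/sqrt(samples), so quadrupling the sample count only halves the noise. A quick sketch of a toy one-pixel estimator (the uniform “radiance” values are a stand-in, not a real light transport integral):

```python
import random

def estimate_pixel(samples):
    """Monte Carlo estimate of a pixel whose true value is 0.5:
    the average of `samples` uniform random radiance values."""
    return sum(random.uniform(0, 1) for _ in range(samples)) / samples

def rms_error(samples, trials=2000):
    """Root-mean-square error of the estimator over many trials."""
    random.seed(0)
    errs = [(estimate_pixel(samples) - 0.5) ** 2 for _ in range(trials)]
    return (sum(errs) / trials) ** 0.5

e16, e64 = rms_error(16), rms_error(64)
print(e16 / e64)  # roughly 2: 4x the samples, only half the noise
```

That square-root law is exactly why a dim indoor scene that needs, say, 10x less noise needs ~100x the samples - and why everyone is so keen on denoising tricks instead.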
This is how one of the fastest “realtime” path-tracing renderers performs in an indoor scenario on two Titans.
Even if this runs as a streaming service, each user would occupy dozens of high-end GPUs. For the moment it might be interesting for industry customers; for gamers it will become interesting once our GPUs are a hundred times faster.

This is just X.IO streaming. I’ve done it - it’s too slow, looks bad, and the mouse controls are poor. Still in beta though!
It works with any app, in fact. It’s like Amazon EC2, etc.

Brigade looks good, but the UE4 cloud demos and Brigade are not the same thing.