4.13 Release Preview - Aug 4th - Live at Epic HQ

Oh, didn’t know that. Thanks, I’ll test it!

I guess the major issue I’ve encountered with dithered LODs is that FPS drops when a mesh is at the base LOD and very close to the view (which is weird, as no dithering happens at that point). Any idea what the cause could be?

Thank you! Mesh decals are awesome and I’ve missed them so much. Now I’m really excited to hear that they’re going to be implemented in the next version, awesome!

Do mesh decals work on mobile (ES2)?

@DarkVeil You’ve mentioned in the stream that 4.13 has shadow map caching for movable point lights and stationary lights so that shadow maps for static objects get cached to save performance. What about directional lights?

Nope, just movable point lights and spot lights for now. Directional lights use view-dependent CSM and perhaps wouldn’t yield great results with caching, especially given their “global” influence. Note the new shadow map caching is explicitly for fully dynamic movable lights (not stationary) with shadows enabled, and will benefit during frames where the light didn’t actually move at all. There will be more info in the full release notes.
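For anyone who wants to experiment with this once 4.13 lands: the caching appears to be exposed through console variables. A minimal sketch for DefaultEngine.ini, assuming the cvar names from the 4.13-era shadow caching work (check the release notes for the exact names and defaults in your build):

```ini
; DefaultEngine.ini - hedged sketch, cvar names assumed from the 4.13 shadow caching feature
[SystemSettings]
; Enable cached shadow maps for movable point/spot lights
r.Shadow.CacheWholeSceneShadows=1
; Memory budget (in MB) for the cached shadow maps
r.Shadow.WholeSceneShadowCacheMb=150
```

Raising the memory budget lets more lights keep their cached maps at once, at the cost of GPU memory.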

Thanks for the fast answer!

I hoped it could help with shadow performance for a dynamic directional light (the sun). I decreased the shadow distance to only a few dozen meters, but shadows still take 1 ms in VR, so I really hoped caching might help, since most of the level is static geometry (though it’s generated at runtime, so baking isn’t possible). Well, I can only hope you might add support for directional lights later :slight_smile:

Oculus Mobile SDK 1.0.3 is in Preview 1. I’m still looking into the other ones, but it’s looking like a few “no”s. I’d assume no unless you see an update in the notes or the tracker.

Archive is now available. 4.13 Release Preview | Feature Highlight | Unreal Engine - YouTube

@DarkVeil Any news on volumetric light/fog? It was marked for July until very recently, and now it’s marked as backlogged. I’m wondering what state it’s in, whether any iteration is happening on it, and when we can anticipate learning more about the methods/implementation plans :slight_smile:

Thanks!

Yep, that did the trick with HISMCs! Thanks.

How can I set that as the default value of the console variable? (So that even when I initially load my project, it’s already in effect, even in the Editor.)

How do I revert back to the default behavior, dithered LOD? (No value I’ve tried brings dithering back.)

Thanks in advance!

You can add it to DefaultDeviceProfiles.ini under the relevant device profile:


+CVars=foliage.DitheredLOD=0
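For context, a device-profile section in DefaultDeviceProfiles.ini would look roughly like this. A minimal sketch, assuming a Windows target (the section name and fields follow the layout of the engine’s BaseDeviceProfiles.ini; adjust the profile name for your platform, and use `=1` instead of `=0` if you want dithered LOD on rather than off):

```ini
; DefaultDeviceProfiles.ini - minimal sketch for a Windows profile
[Windows DeviceProfile]
DeviceType=Windows
BaseProfileName=
; CVars listed here are applied when this profile is active,
; including in the Editor, so the value takes effect on project load
+CVars=foliage.DitheredLOD=0
```

Because device profiles are applied at startup, this makes the cvar value the effective default without having to re-enter it in the console each session.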

This is already possible with some manual intervention; in fact, we already do this to record sequences for our Paragon trailers. Sequence Recorder can record any actor, including those driven by (and spawned by) demo recordings. Sequence Recorder can also be driven by Blueprint, so one can easily set up an environment where sequences are recorded while a demo is being played back. The only caveat is that, at the moment, there needs to be at least one actor being recorded (i.e. you can’t start recording with an empty scene expecting to record actors spawned later).

Also, to directly answer the question: we don’t have a built-in “turn my demo recording into a sequence” button.

Did this get answered?

Have you figured out how WidgetInteraction works? I was looking for docs but couldn’t find them, and the C++ didn’t get me anywhere. I’m lost and have been trying to implement it for days now. :<

Getting to the next level of editing in VR may require more of an AR approach. I don’t mean you can see VR objects in the real world, I mean you can see real world objects in VR. More than just using the two controllers, having real world objects that map into the VR space might be useful.

Like if you had a simple hand-sized cube, you could map a VR object to it and then when you move the cube in the real world, you move the object in VR space. Or maybe if you had a real world tablet that mapped into VR it would provide a more substantive means of interacting with the menus. Heck, what about an entire workbench of such tools?

Just a thought.

Cheers,