Objects far away are clipping

So to give a background of the issue, in our VR class, our professor asked us to make a game / experience where you can scale yourself 1000X smaller / bigger relative to the environment.

Our team is making a game in this direction. We scale the environment up 1000X when scaling the player ‘down’, and when you come back to normal size, the environment resets itself.

We have a MasterActor with objects attached to it; we scale that up and down by 1000X and change player locations, etc.

Everything is working great game-thread-wise: interactions and even physics (with a lot of work, lol).

However, when you are ‘tiny’ (AKA the environment is 1000X bigger), objects start to randomly flicker based on your camera direction etc.

I know this has to do with far-clipping issues or something like that, because the transforms are large. The scales are 1000X and so are the locations; objects end up far away when you become ‘smaller’, so it has to do with distances and scales. But this is a requirement for our project, so I’m stuck.

I have tried Bounds Scale. It still doesn’t work well, and it’s not recommended in general anyway.

Are there any fixes / suggestions on this issue? It is ONLY render related.


For the meshes that are flickering, try increasing the bounds scale

Start with 2 or 3, but you might have to go quite a way to correct it :slight_smile:

Another option is to change the player size.

Can you post a video of what you’re seeing? The images don’t really show the issue.

Depends on the ‘flicker’. Clipping artifacts tend to shimmer and z-fight, not pop whole objects in and out. You can test the near clip plane: temporarily set it pretty large, like 50 or more, and see if the artifacts get better.

How large are you scaling? If those objects are only a few ‘kilometers’ away/in size when scaled up, it shouldn’t be too big of an issue. I am making a game for quest3 hardware with long sight distances and start to get a bit of z-fighting on shorelines at appx 5km with 10 as my near clip as a reference. PC VR should give significantly further distances if that’s your target.

If it’s popping in and out, it may be something to do with culling: either the object is outside its max draw distance, or the bounds aren’t right after scaling. Here are some console commands to try:

Show object bounds:
show bounds

Turn off occlusion queries:
r.AllowOcclusionQueries 0

In game, to freeze culling and fly around with F8 to view the scene:
FreezeRendering

draw box around occluded objects:
r.VisualizeOccludedPrimitives 1

It might have something to do with the master actor relationship, too… maybe your child bounds don’t scale when you scale the master, or something like that. Try removing a problem object from that relationship, scaling it separately, and seeing if that fixes it. If so, you might need another way to scale objects, like iterating through objects with an actor tag you create and assign, e.g. “ScaledWorldObject”.
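
The tag-based alternative could look something like this. Plain C++ stand-ins for actors (in UE it would be UGameplayStatics::GetAllActorsWithTag plus SetActorScale3D / SetActorLocation); the "ScaledWorldObject" tag is just the hypothetical name suggested above:

```cpp
#include <string>
#include <vector>

// Minimal stand-in for an actor: a tag, a uniform scale, and a location.
struct FakeActor {
    std::string tag;
    float scale = 1.0f;
    float location[3] = {0.0f, 0.0f, 0.0f};
};

// Scale every tagged object's transform directly, so each object's bounds
// are recomputed from its own transform instead of being inherited from a
// single MasterActor.
void ScaleTaggedWorld(std::vector<FakeActor>& actors, float factor) {
    for (FakeActor& a : actors) {
        if (a.tag != "ScaledWorldObject") continue;
        a.scale *= factor;                        // grow the object itself...
        for (float& c : a.location) c *= factor;  // ...and its distance from the pivot
    }
}
```

Note that locations have to scale along with the objects, or everything grows in place and overlaps.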

I am changing the environment’s scale by 1000 (1000 times bigger everything around the player). So I see flickering of objects popping in and out.

This is a video clip of it. I have fixed some by increasing the Bounds Scale, but I’ve heard it reduces performance, so I’m trying to stay away from it.

The child actors DO scale. As I said, everything on the game thread and logic side is great. I had to fake physics at times and write my own custom solution, but everything gets scaled correctly.

And btw, I tried the commands (thanks!) and it looks like they are getting occluded, per

r.VisualizeOccludedPrimitives 1

They pop in and out based on the camera direction.

r.AllowOcclusionQueries 0

This would completely get rid of occlusion even when I’m in normal size. I really don’t think this is what I should be looking at.

Thanks for the response though! I have attached the video in the comments you can look at them.

I understand the child actors scale fine; I was saying the computed bounds may not update when you scale like that. It seemed like something to check if you have odd occlusion behavior, since object bounds determine frustum and occlusion culling. That’s also the performance issue with Bounds Scale: by making the bounds bigger, you increase the chance the object renders when it’s off screen. No big deal if you need it, but only change it if you need it. I don’t think it’s the correct fix for your issue, though.

I didn’t suggest commands like r.AllowOcclusionQueries as something you would enable for gameplay; use them to troubleshoot. Turn them on and off to see if occlusion might be causing issues, etc.

The flickering in the video definitely looks like z-fighting. Make sure you don’t have two surfaces really close together, or odd geometry issues with your mesh. That’s where I’d look first.

It’s hard to tell the actual scale (like, how long is that flickering piece, approximately, in Unreal units (cm) once it’s scaled up?).

Make sure your character, and the world you want to interact with, stay within an approximately 900,000-unit radius of the world origin. Past that, numerical precision drops to the point where you will start seeing artifacts. Generally you can have static meshes render outside this range no problem, like a mountain range or skydome in the distance, but movement, physics, and lighting tend to break down outside it.

Maybe your character is moved outside this range? If so, that can cause all sorts of weird rendering bugs.
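
That ~900,000-unit figure falls out of 32-bit float precision (UE5’s Large World Coordinates relaxes this by using doubles, but float-based transforms behave like this). A quick illustration of how position granularity grows with distance from the origin:

```cpp
#include <cmath>

// Spacing between adjacent representable floats at magnitude x:
// the smallest position change the engine can express there.
float UlpAt(float x) {
    return std::nextafterf(x, 2.0f * x) - x;
}
```

Near the origin the step is around 0.0000001 uu; at 900,000 uu it is already 1/16 uu (0.625 mm), and by 10,000,000 uu positions can only move in whole 1 uu (1 cm) increments, which is why far-from-origin transforms jitter.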

What is your near clip plane set to? If it is very small, it can cause artifacts at far render distances.

Thanks for the response. Yeah so it definitely IS Occlusion Culling and Bounds.

What I realized is that when I get tiny, I only need to set Bounds Scale to 100 on certain objects around me, not everything, obviously. Mostly just the big walls next to me (usually six maximum per spot).

Everything else works fine. Though yes, there are obviously lighting issues too, which I think I’ll handle by disabling lights or using a few specific key lights for areas that REALLY need them.

I’m wondering if a better approach would have been scaling down the character or camera instead…

This is what ChatGPT said:

Yes, scaling the player instead of the environment is a common and recommended approach in Unreal Engine :+1:. In many cases it’s actually better than scaling the whole world, because large world scales can cause precision and physics issues.

However, if you shrink the player a lot (like 1000× smaller), you’ll usually need to adjust several systems so the game still feels correct.


1. Scale the Player

You can simply scale the character actor.

Example:

  • Normal player scale: 1.0

  • Tiny player: 0.001

Or scale the individual components instead:

  • Character capsule half-height

  • Mesh scale

  • Camera boom

Often developers shrink the capsule + mesh rather than the whole actor.


2. Adjust Movement Speed

If the player becomes 1000× smaller, movement will feel extremely fast relative to the environment.

Example:

  Setting          Normal   Tiny Player
  Max Walk Speed   600      ~0.6

You’ll likely need to scale:

  • Walk speed

  • Acceleration

  • Jump velocity

  • Gravity
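
The scaling above can be sketched as one helper. The field names mirror UCharacterMovementComponent properties (MaxWalkSpeed, JumpZVelocity, GravityScale), but this is plain illustrative C++, not engine code. Scaling gravity by the same factor as jump velocity keeps jump height proportional, since height = v² / (2g):

```cpp
// Illustrative stand-in for the movement settings listed above.
struct MovementParams {
    float max_walk_speed  = 600.0f;  // uu/s
    float jump_z_velocity = 420.0f;  // uu/s
    float gravity_scale   = 1.0f;
};

// Shrink movement by the player's scale factor (0.001 for a 1000x-tiny player).
MovementParams ScaleForTinyPlayer(MovementParams p, float player_scale) {
    p.max_walk_speed  *= player_scale;  // 600 -> ~0.6
    p.jump_z_velocity *= player_scale;
    p.gravity_scale   *= player_scale;  // keeps jump height proportional
    return p;
}
```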


3. Adjust the Camera

Camera scale cues matter a lot.

Recommended tweaks:

  • Lower FOV slightly (e.g. 90 → 70)

  • Lower camera height

  • Reduce near clipping plane

These help sell the illusion that the world is huge.


4. World to Meters Setting

In World Settings there is a value called:

World to Meters

Default:

100

This defines how Unreal converts world units to real-world scale for the camera.

If your character is extremely small, adjusting this can help maintain realistic perception.


5. Physics Limitations

Very tiny scales can cause problems:

  • collision jitter

  • unstable physics

  • penetration errors

Unreal physics is happiest when objects are roughly 1–10,000 units in size.

So sometimes devs avoid extreme shrink values and instead scale something like:

  • player = 0.1

  • world = 10

to keep physics stable.
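
The compromise works because only the relative ratio matters: what the player perceives is world_scale / player_scale. A tiny illustration (the split values are just examples, not engine settings):

```cpp
#include <cmath>

// Perceived size ratio between world and player.
double PerceivedRatio(double world_scale, double player_scale) {
    return world_scale / player_scale;
}
```

PerceivedRatio(10.0, 0.1) gives the 100x split above; to hit a full 1000x while keeping both factors moderate, each side only needs sqrt(1000) ≈ 31.6, so PerceivedRatio(31.6, 1.0 / 31.6) ≈ 1000.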


6. Lighting & Shadows

When characters are tiny, you may need to tweak:

  • shadow bias

  • contact shadow length

  • Lumen settings

Otherwise shadows can look wrong for small objects.


:white_check_mark: Summary

Yes — you can absolutely achieve the effect by:

  • shrinking the player

  • adjusting movement speed

  • tweaking camera settings

without scaling the environment.

This is actually how many “tiny character in giant world” mechanics are implemented.


:light_bulb: If you’re aiming for something like “ant-sized character in a normal house”, there’s also a very useful Unreal trick involving camera near-clip + depth-of-field that dramatically improves the scale illusion. I can show that setup if you’re interested.

OP has spent more time than ChatGPT going down that route.


Fair enough


This is a challenging problem with a 1000x scale difference. Very fun to think about. That means you’re less than 2mm tall when shrunk.

In the end, you’re probably going to need to build two levels for this problem, the large and the small. I’m sure you can share assets between them; it’s just that the vast scale difference is a big hangup. Like, how do you build the assets so they look good at both scales? And handling collision for super tiny objects while at large scale will be problematic.

I would look at using runtime Data Layers for switching between the large and small worlds.

I ended up fixing the problem by only setting Bounds Scale on the meshes around the player (those are the ones usually causing problems), and they are mostly just simple walls, so it doesn’t matter. Because when you are small, you don’t move around the room much; you’re so tiny that your movement is limited.

Appreciate the help A LOT though, @Countsie; you have replied to a lot of my questions about rendering.

Thanks again!
