Need advice on reducing spatial observation size in the Learning Agents framework.

Hello. I'm trying to build something with the Learning Agents framework. I've set up observations and actions, and one of the observations is LIDAR-like, meant to give the agent spatial awareness. For that I run several grids of line traces / shape sweeps in different directions on background threads and get values in the range 0.f → 1.f, where 0 is “at the agent” and 1 is “as far as the trace goes”. The observation itself is basically a Conv2d over a continuous observation.
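To be concrete about what I mean by the 0 → 1 values, here's a minimal sketch (hypothetical helper, not the actual LA API) of how I pack each trace result into the grid:

```python
def normalize_hit(hit_distance: float, max_trace_length: float, did_hit: bool) -> float:
    """Map one line trace / sweep result into [0, 1]:
    0.0 = hit at the agent, 1.0 = no hit within the trace length."""
    if not did_hit:
        return 1.0
    return min(max(hit_distance / max_trace_length, 0.0), 1.0)
```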

The problem is that in the auto-generated JSON schema the LIDAR observation has a vector size of 3240 and an encoded size of 5040, which is roughly 4× more than the rest of my observations combined. I don't know much about ML yet, so I consulted ChatGPT, and it suggested quantizing the LIDAR values. I agree with that: I basically only need values from 0 to 1 with 2 digits of precision. I guess the closest realistic solution would be to use uint8 instead of float, which would cut the size 4×, but it seems that isn't possible with the current state of the LA framework, is it? Apart from enums and ints, all observations appear to be floats, and the continuous observation is float in particular. ChatGPT confirmed this after I fed it the Python scripts that ship with the LA plugin, saying roughly: “nah man, they use np.float32 here, so even if you shove literal uint8s into a float* and feed that to the observations, it would be garbage, because training assumes the observation values are floats and not just a stream of bits.”
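For what it's worth, the precision and the “garbage bits” arguments can both be checked directly. This is a plain numpy sketch of the round-trip I have in mind, nothing from the LA plugin:

```python
import numpy as np

# Quantization round-trip: 0..1 float32 -> uint8 -> float32.
# Worst-case error of round(x * 255) / 255 is 0.5/255 (~0.002),
# comfortably within "2 digits of precision".
x = np.random.default_rng(0).random(3240).astype(np.float32)
q = np.round(x * 255).astype(np.uint8)        # 1 byte per value instead of 4
x_back = q.astype(np.float32) / 255.0
max_err = float(np.abs(x - x_back).max())
assert max_err < 0.002

# And the second point: reinterpreting raw uint8 bytes as float32
# gives a meaningless number, not a small value in [0, 1].
garbage = q[:4].view(np.float32)              # 4 bytes -> one arbitrary float
```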

So… what do I do? Is some form of quantization planned for future versions of UE's LA framework, so I should just wait? Is the approach I took for the spatial observation ■■■■■■■■, meaning I created a problem I shouldn't even have? Or is a 3240/5040 observation size fine in general, and for LIDAR in particular, and I should just go on with training?
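One alternative I've considered instead of quantization: shrinking the grids before they become observations. Purely illustrative (plain numpy, not the LA API), average-pooling a 30×30 channel down to 15×15 cuts the value count 4× without touching the dtype:

```python
import numpy as np

# 2x2 average pooling on a 30x30 LIDAR grid -> 15x15.
# 900 values per channel become 225, same float32 dtype throughout.
grid = np.random.default_rng(0).random((30, 30)).astype(np.float32)
pooled = grid.reshape(15, 2, 15, 2).mean(axis=(1, 3))
assert pooled.shape == (15, 15)
```

The trade-off is spatial resolution rather than numeric precision, which may or may not matter for the agent.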

For context, here’s a fragment of the observation schema:

"Observation.Surrounding.LIDAR":
{
    "VectorSize": 3240,
    "EncodedSize": 5040,
    "Type": "And",
    "Elements":
    {
        "Raindrop.Near.Sides":
        {
            "VectorSize": 1440,
            "EncodedSize": 1440,
            "Type": "Conv2d",
            "InputHeight": 20,
            "InputWidth": 12,
            "InChannels": 6,
            "OutChannels": 6,
            "KernelSize": 3,
            "Stride": 1,
            "Padding": 1,
            "PaddingMode": 0,
            "Element":
            {
                "VectorSize": 1440,
                "EncodedSize": 1440,
                "Type": "Continuous",
                "Num": 1440
            },
            "Index": 0
        },
        "Raindrop.Near.VerticalPlanes":
        {
            "VectorSize": 1800,
            "EncodedSize": 3600,
            "Type": "Conv2d",
            "InputHeight": 30,
            "InputWidth": 30,
            "InChannels": 2,
            "OutChannels": 4,
            "KernelSize": 3,
            "Stride": 1,
            "Padding": 1,
            "PaddingMode": 0,
            "Element":
            {
                "VectorSize": 1800,
                "EncodedSize": 1800,
                "Type": "Continuous",
                "Num": 1800
            },
            "Index": 1
        }
    },
    "Index": 4
}
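If it helps anyone sanity-check the numbers: the sizes in the schema follow directly from the Conv2d parameters (kernel 3, stride 1, padding 1 keeps the spatial dims, so the encoded size is just output H × W × OutChannels). This is plain arithmetic, not anything from the plugin:

```python
def conv2d_sizes(h, w, c_in, c_out, k, stride, pad):
    """Return (VectorSize, EncodedSize) for one Conv2d observation element."""
    out_h = (h + 2 * pad - k) // stride + 1
    out_w = (w + 2 * pad - k) // stride + 1
    return h * w * c_in, out_h * out_w * c_out

sides = conv2d_sizes(20, 12, 6, 6, 3, 1, 1)    # (1440, 1440)
planes = conv2d_sizes(30, 30, 2, 4, 3, 1, 1)   # (1800, 3600)
total = (sides[0] + planes[0], sides[1] + planes[1])  # (3240, 5040)
```

So the encoded size is bigger than the vector size only because VerticalPlanes doubles its channel count (2 in, 4 out).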