Accessing mesh UV coords after packaging

Hi. I am trying to make a paint system similar to Portal 2 or Splatoon.

I am painting things using a texture mask, and I am dynamically changing this texture using UpdateTextureRegions().
To do this, I find the UV coordinates of the place I’ve hit.

I found out how to get the UV coordinates at a raycast hit using the function JGagner wrote in this thread:

This works perfectly in the editor. The problem is that I have read here that when you package the project, all the mesh data gets cooked/serialized, so everything that relies on FStaticMeshRenderData stops working properly. I tried packaging it, and it did indeed crash.

My problem is not getting the UV coordinates from a raycast itself; I saw Rama's method of doing that (it goes through PhysX, so it works after packaging). My problem is that I want to use GetVertexUV() independently of the raycast (to do some calculations of my own), and as far as I have found there is nothing other than FStaticMeshRenderData's vertex buffer (which doesn't work when packaged) for looking up a vertex's UV. I searched the PhysX source code and found nothing, and the implementation of the actual PhysX raycast that finds the UVs is inaccessible, so I can't see how they compute them.

Is there a way to access a mesh's vertex UV coordinates that still works when packaged? Thank you very much :slight_smile:

Hi. I have done a bit of further research and found this post:

It seems that the UV data is unreadable as in shipping builds there are no CPU accessible vertex buffers, in order to save memory, but as it says there in order to be able to use the vertex buffer on a packaged build I will need to set bNeedsCPUAccess to true on FStaticMeshLODResources::Serialize. I will do this on meshes that are paintable. but I wonder, how much memory does it take, and is there a smarter way to do this?