Hello, and sorry about the delay; I was OOO at the tail end of last week.
At the moment I don’t have a particularly good solution to your problem. I am assuming that when you say “outside of the Content” you also mean outside of an Unreal project. As you’ve probably noticed, all of the virtualization tools and utilities rely on knowing which project a package file belongs to before they can do things like rehydration, as that is currently the only way we can work out where the virtualized data might be stored.
For certain internal use cases we did add automatic package rehydration to the asset migration tool (https://dev.epicgames.com/documentation/en-us/uefn/migrating-assets-from-unreal-engine-to-unreal-editor-for-fortnite), but this is still somewhat limited: the tool is run on the target project that you are importing the data into, and so it uses that project’s virtualization settings. For it to work correctly, it would need access to the same persistent storage backend as the project that the package was first virtualized in. That works fine if you store the virtualized data for all projects in the same place, but it does not sound like that is the case for your setup.
I do hope to extend the rehydration pipeline in 5.7 (for both this tool and UnrealVirtualizationTool) to allow the caller to supply the source project, which would make things more flexible, but that of course relies on the user a) having that project synced and b) knowing where the package came from in the first place. I might also try to add the ability to provide a custom ini file containing a custom virtualization graph, in which you could detail all possible storage locations at your company, but of course none of these future plans help you now. Please note that 5.7 is my target; it is not officially scheduled and I cannot guarantee that it will land there.
Potential Fixes
The obvious suggestion is to convert the shared storage area into a dummy project. That project could be set up with access to all of your persistent storage backends, which would make it much easier to run rehydration passes and the like, but I will assume that moving things around would be too much work for too little gain.
I’m not sure what process you have for moving package files to this storage area, but you could probably add a custom editor option to “export to shared storage” quite easily. All it would need to do is copy the package file to the target location and run the rehydration process on it from within the editor process, so that it already has the correct settings. There is a possibility that, out of the box, the rehydration process will complain if the target package is not within a project; in that case you’d want to copy the package within the project first, then rehydrate, then copy to the shared location before submitting. We could probably provide a standard asset action to “export as hydrated” in the editor for projects with virtualized assets enabled. I will add that to the backlog, but as the initiative is no longer in active development I cannot give you any timeline for when it might be done. If you want to look at adding one for your team, I suggest starting at Engine\Source\Editor\VirtualizationEditor\Private\RehydrateProjectCommandlet.cpp, which should get you most of the way there.
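To sketch the second variant (copy in, rehydrate, copy out) as a command sequence: all paths here are placeholders, and the commandlet name is an assumption based on the RehydrateProjectCommandlet.cpp file name, so verify how it is actually registered in your engine version before relying on it.

```shell
rem 1) Stage the package inside the project so rehydration has a project context (placeholder paths)
copy \\SharedStorage\Packages\MyAsset.uasset D:\Projects\MyProject\Content\Staging\MyAsset.uasset

rem 2) Rehydrate using the project's own virtualization settings (commandlet name assumed)
UnrealEditor-Cmd.exe D:\Projects\MyProject\MyProject.uproject -run="VirtualizationEditor.RehydrateProject"

rem 3) Copy the hydrated package back to shared storage before submitting
copy D:\Projects\MyProject\Content\Staging\MyAsset.uasset \\SharedStorage\Packages\MyAsset.uasset
```

An in-editor asset action would do the same three steps in process, which avoids the cost of a second editor launch.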
Another commandlet you might find interesting is Engine\Source\Editor\VirtualizationEditor\Private\CheckForVirtualizedContentCommandlet.cpp, which can be used to make sure that a project does not contain virtualized content. You might be able to write a similar piece of code to scan your shared storage area, but I suspect you might hit areas of code that rely on the packages being within a project.
Speaking of validation, Engine\Source\Editor\VirtualizationEditor\Private\ValidateVirtualizedContentCommandlet.cpp (-run="VirtualizationEditor.ValidateVirtualizedContent") can be run periodically on a project to check whether any package contains virtualized data that cannot be found in that project’s persistent storage, which can be useful for flagging problems before they impact users.
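If you wanted to schedule that as a nightly job, the invocation would look something like the following; the project path is a placeholder, the commandlet name is the one quoted above, and -unattended/-log are standard editor command-line switches.

```shell
UnrealEditor-Cmd.exe D:\Projects\MyProject\MyProject.uproject -run="VirtualizationEditor.ValidateVirtualizedContent" -unattended -log
```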
My final idea is not something that I suggest you do (at least not long term), but it might be useful as a short-term fix if this is really impacting your developers. You could add the persistent storage backends for all of your projects to the VA graph but mark the backends for the other projects as read only. Just make sure that you place the current project’s backend first so that it is hit first.
[VA_DefaultGraph]
PersistentStorageHierarchy=(Entry=ThisProjectsCache, Entry=Project2Cache, Entry=Project3Cache)
ThisProjectsCache=(Type=P4SourceControl, DepotPath="//Payloads/Project1/")
Project2Cache=(Type=P4SourceControl, DepotPath="//Payloads/Project2/", ReadOnly=true)
Project3Cache=(Type=P4SourceControl, DepotPath="//Payloads/Project3/", ReadOnly=true)
I don’t really recommend this as it doesn’t actually fix the data, but you could run this as the default graph and then run the ValidateVirtualizedContent commandlet with a custom graph that only contains the current project’s backend, to identify the content that does need fixing. As a bonus, you would then be able to rehydrate those broken packages in place and re-virtualize them to get the data into the correct backend.
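For that validation pass, the custom graph could be a cut-down copy of the default one, containing only the current project’s backend. The section name below is hypothetical, and how you point the commandlet at an alternate graph depends on your engine version, so treat this as a sketch rather than a drop-in config:

```ini
[VA_ValidateGraph]
PersistentStorageHierarchy=(Entry=ThisProjectsCache)
ThisProjectsCache=(Type=P4SourceControl, DepotPath="//Payloads/Project1/")
```

Any package that fails validation against this graph is one whose payloads only exist in another project’s backend, i.e. exactly the set you would want to rehydrate and re-virtualize.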