Unreal Horde Secondary Backend for Writing

Thanks Julian,

yes, you are right, what we have in mind is basically a cache: local data is used when possible, and remote (cloud, AWS) data is served to remote users.

For a bit more context, we are basically trying to recreate a setup we used previously, where our own custom tool downloaded data/builds/tools to users (devs). We had the data in the office on a shared drive and also in the cloud. People working in the office would get the data from the local shared drive and would download it from the cloud only if it was not available locally (basically a cache miss). People working from home/remotely would get the data directly from the cloud (no need to be connected to a VPN, etc.).
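
Roughly, the read behaviour we are after would look like the sketch below. This is purely illustrative; the IObjectStore interface and the class names are made up for the example and are not Horde's actual storage abstraction:

```csharp
using System.IO;
using System.Threading.Tasks;

// Illustrative interface only; not Horde's actual storage abstraction.
public interface IObjectStore
{
    Task<bool> ExistsAsync(string key);
    Task<Stream> OpenReadAsync(string key);
    Task WriteAsync(string key, Stream data);
}

// Read path: serve from the local (office) store when possible, fall back
// to the cloud store on a miss and populate the local store so the next
// in-office reader gets a hit.
public class CachedObjectStore : IObjectStore
{
    private readonly IObjectStore _local;   // office shared drive
    private readonly IObjectStore _remote;  // cloud (e.g. S3)

    public CachedObjectStore(IObjectStore local, IObjectStore remote)
    {
        _local = local;
        _remote = remote;
    }

    public async Task<bool> ExistsAsync(string key)
        => await _local.ExistsAsync(key) || await _remote.ExistsAsync(key);

    public async Task<Stream> OpenReadAsync(string key)
    {
        if (await _local.ExistsAsync(key))
        {
            return await _local.OpenReadAsync(key);  // cache hit
        }

        // Cache miss: fetch from the cloud and copy into the local store.
        using Stream remoteData = await _remote.OpenReadAsync(key);
        var buffer = new MemoryStream();
        await remoteData.CopyToAsync(buffer);

        buffer.Position = 0;
        await _local.WriteAsync(key, buffer);

        buffer.Position = 0;
        return buffer;
    }

    public async Task WriteAsync(string key, Stream data)
    {
        // Writes go to the cloud; see the write sketch further down.
        await _remote.WriteAsync(key, data);
    }
}
```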

So the main goals are saving cloud bandwidth for in-office users, making data available to remote users (without the need for a VPN and without data flowing through the office internet connection), and having a backup of the data in case the local storage (network drive) fails.

Does this make sense in the Unreal ecosystem (Horde, UGS)? I'll admit that I am not yet familiar with how artifacts/builds built by Horde are downloaded through UGS, so it is possible I am missing something there (e.g. does UGS download the data directly, or is it distributed by Horde - something I haven't looked into yet).

So you think implementing a new storage backend would be the best/easiest way to go about it? As mentioned in the initial post, I already considered adjusting ChainedObjectStore.cs, but I suspect additional changes might be required to ensure writing works correctly.
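
To spell out what I mean by "writing works correctly": on the write side I would expect something like write-through, with the cloud copy being authoritative (since it also serves as the backup) and the local copy being best effort. Again purely illustrative, reusing the made-up interface from the sketch above:

```csharp
using System.IO;
using System.Threading.Tasks;

// Write path sketch, reusing the hypothetical IObjectStore from above:
// the cloud copy is authoritative (it doubles as the backup), the local
// copy is only a best-effort cache.
public static class WriteThrough
{
    public static async Task WriteAsync(IObjectStore local, IObjectStore remote,
                                        string key, Stream data)
    {
        // Buffer once so both backends can consume the same payload.
        var buffer = new MemoryStream();
        await data.CopyToAsync(buffer);

        buffer.Position = 0;
        await remote.WriteAsync(key, buffer);    // must succeed: cloud is the backup

        try
        {
            buffer.Position = 0;
            await local.WriteAsync(key, buffer); // best effort: only a cache
        }
        catch (IOException)
        {
            // A failed local write just means a later cache miss for in-office users.
        }
    }
}
```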