Downloading Content Dynamically

Hi there,

I’m back looking at my shop. The game consists of a small initial release and 10-1000 individually downloadable pieces. The pieces will be developed post-launch and will be significant in size, containing audio, images and video.

Beyond the basic Unreal content packaging issues, has anyone got any pointers towards interfacing with a CDN of some description? I’ll need to download pak files from a remote server. I’d prefer not to host the server myself.

Initial investigation points me towards Google Cloud, Amazon, Limelight and a bunch of other hosting services, but I’m interested in any real-world experiences downloading data into an Unreal game on a mobile device.

Any suggestions on approach here?


Hi,

I’ve done a POC (proof of concept) about it here; it surely needs some love and might be a bit outdated, but it’s a valuable starting point. In my example I host an Unreal File Server, but you can just download the file manually and then load the pak file locally.

S3 and HTTP is how Paragon does it. Gears and Infinity Blade didn’t use Amazon; their files were just served by the web servers that acted as the backend for the games.

All of the games we make that use such a scheme make sure that signatures are valid before loading/using a file. The process is such:

  • Ask for a list of files from the backend (HTTPS with an auth exchange)
  • List comes down including the current meta data about the files (size, hash, etc.)
  • Locally cached data is hashed and, if the hash doesn’t match, the file is downloaded again
  • If the local data matches the signature, it is used

Signature verification is pretty important so that locally tampered files, and files whose version changed on the backend, are automatically re-downloaded.
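The re-download decision in that list boils down to a hash comparison against the manifest. A minimal sketch of the logic in Python (the manifest shape, file name and choice of SHA-256 are invented for illustration; the actual backend format isn’t specified in this thread):

```python
import hashlib
import os

def needs_download(local_path: str, expected_sha256: str) -> bool:
    """Return True if the cached file is missing or its hash
    doesn't match the manifest entry from the backend."""
    if not os.path.exists(local_path):
        return True
    with open(local_path, "rb") as f:
        local_hash = hashlib.sha256(f.read()).hexdigest()
    return local_hash != expected_sha256

# Hypothetical manifest entry as returned by the backend's file list.
manifest = [{"name": "pack01.pak",
             "sha256": hashlib.sha256(b"pak bytes").hexdigest()}]
```

Anything that fails the check gets fetched again; anything that passes is mounted as-is.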

Thanks Moss. Looks like the link gained a stray comma; here’s the thread.

This part strikes me as a bit odd:

“// We have to mount the file into the engine content dir, if not you will not be able to load it async!”

From the little I’ve played with it, the directory structure is quite sensitive. I’ve been mounting pak files with a nullptr mount point, which then gives me a usable directory structure (i.e. one where I can match paths between my development environment and the game’s file loads).

Btw I’ve read a few of your threads and there is a LOT of useful information in them. Thank you for taking the time to explain what you’ve found!

My current leaning is to try Amazon S3 and see if I can download files using HttpRequests as suggested by @joeGraf here.

Information I find is collated back into my master “OMG I’m trying to do a lot of DLC” thread.

What @joeGraf said is the best way of doing it. If you are able to use CloudFront and S3 you will get the best out of the system. Download the pak and then mount it.

Mounting the pak within the engine directory was the only way I was able to stream and load textures, but that was back in the 4.2 days, so it might work in any directory right now.

If I’m not mistaken you can even add a hash to the HTTP headers when requesting the files in your S3 bucket, so the client will auto-redownload if the hashes do not match.

If you want to serve from CloudFront, bear in mind that it caches forever; once content is up you can only invalidate a limited number of times per month for free. So you would need to come up with some versioning, or some unique IDs, to generate a different URI each time you upload your content. I would go with the UUID route.
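The UUID route can be sketched very simply: bake a fresh ID into the object key on every upload and publish the current key through your file list, so CloudFront’s indefinite caching can never serve a stale pak (the key layout here is hypothetical):

```python
import uuid

def versioned_key(base_key: str) -> str:
    """Build a unique S3/CloudFront key per upload, e.g.
    'dlc/pack01/pack01.pak' -> 'dlc/pack01/<uuid>/pack01.pak'."""
    folder, _, name = base_key.rpartition("/")
    return f"{folder}/{uuid.uuid4().hex}/{name}"
```

Because every upload lands at a never-before-seen URI, you never need a CloudFront invalidation at all.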

“so it might work in any directory right now.”

It’s working for me on 4.11

“If I’m not mistaken you can even add a hash in the HTTP header when requesting the files in your S3”

The header is ETag. The disadvantage of that approach is that you need to make sure your cloud provider sets it and uses a standard hash, and it requires one round trip per file. The table-of-contents approach gets rid of the extra round-trip requests, so it can be more cost effective.
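For what it’s worth, the ETag check can also be pushed to the server with a conditional request, so an unchanged file costs a 304 instead of a full re-download. A sketch of building that header, under the assumption that the object was a simple (non-multipart) S3 upload, where the ETag is the MD5 of the body:

```python
import hashlib

def conditional_headers(local_bytes):
    """Build an If-None-Match header from the cached copy, so the
    server can reply 304 Not Modified instead of resending the body.
    Assumes the ETag is the plain MD5 of a non-multipart S3 upload."""
    if local_bytes is None:
        return {}  # nothing cached: do an unconditional GET
    etag = '"%s"' % hashlib.md5(local_bytes).hexdigest()
    return {"If-None-Match": etag}
```

This trades the per-file round trip for bandwidth; the table-of-contents approach above avoids the round trip entirely.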

“It’s working for me on 4.11”

Nice to hear that!

Yeah, that’s true; doing a POC is essential in all cases. You might want to stream different content depending on the client’s locale, or even the device, such as smaller textures based on the device’s DPI.
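Selecting a per-device variant can be as simple as baking the locale and a texture tier into the object key. A sketch of that idea (the tier threshold and key layout are invented for illustration):

```python
def content_key(base: str, locale: str, screen_scale: float) -> str:
    """Choose a content variant key by locale and display density.
    Treats anything at 2x scale or above as the high-res tier."""
    tier = "hi" if screen_scale >= 2.0 else "lo"
    return f"{base}/{locale}/{tier}/content.pak"
```

The backend then only has to upload one pak per variant; the client picks the right key before downloading.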

Good to hear on the folder situation. So far I’ve only been loading pak files containing raw text, but textures and audio assets are a minimum next step, and the more standard content possible the better!

So I’ve managed to make contact with the Amazon servers but am stumbling on the signing process. Any pointers on how to approach this in Unreal (preferably in a cross-platform manner, but PC and iOS are the current minimum)?

Here’s my code creating the request:

FDateTime date = FDateTime::UtcNow();
// Build an RFC 1123 date string, e.g. "Wed, 11 May 2016 10:15:30 GMT".
// Note: UE's EDayOfWeek starts at Monday, so the table must too.
const TCHAR* dayOfWeek[] = {
	_T("Mon"), _T("Tue"), _T("Wed"), _T("Thu"), _T("Fri"), _T("Sat"), _T("Sun")
};
const TCHAR* monthOfYear[] = {
	_T("Jan"), _T("Feb"), _T("Mar"), _T("Apr"), _T("May"), _T("Jun"), _T("Jul"), _T("Aug"), _T("Sep"), _T("Oct"), _T("Nov"), _T("Dec")
};
FString dateString = FString::Printf(_T("%s, %02d %s %d %02d:%02d:%02d GMT"), dayOfWeek[(int)date.GetDayOfWeek()], date.GetDay(), monthOfYear[date.GetMonth() - 1], date.GetYear(), date.GetHour(), date.GetMinute(), date.GetSecond());
m_URequest = FHttpModule::Get().CreateRequest();
m_URequest->SetVerb("GET");
m_URequest->SetHeader("Date", dateString);
m_URequest->SetHeader("Content-Type", "application/octet-stream");
m_URequest->SetHeader("Authorization", FString::Printf(_T("AWS %s:%s"), AuthID, AuthKey));
m_URequest->SetURL(FString::Printf(_T("https://s3.amazonaws.com/%s/%s"), RootBucket, *rFile));
m_URequest->OnProcessRequestComplete().BindLambda([this](FHttpRequestPtr, FHttpResponsePtr, bool) { OnCompletion(); });
if (!m_URequest->ProcessRequest())
{
	Errorf(_T("Failed to start HttpRequest"));
	m_eStatus = eStatus_Error_NoService;
}

And the response is:

SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your key and signing method.

Making the test file public and removing the auth gives me a successful download (ACE!!! Big grins…), but of course I’d like access to this file to require requests signed by the app.

Going through the Amazon S3 signing process, I have a few comments:

  1. Wow
  2. Room for some user error here
  3. Perhaps including their SDK is easier than ploughing through this and the likely debugging marathon coming when I finish
  4. Or it would be lovely if someone could share some fairly standalone code for signing an s3 GET request from within Unreal. The examples I’ve found so far require a whole lot of extra code digging to understand due to the environments they are coded in.

Guess I’m pulling in a SHA256 hash implementation and verifying the example keys against those generated locally until I get this right.
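For anyone hitting the same wall: the step of AWS Signature Version 4 that most often goes wrong is the chained HMAC-SHA256 key derivation. The sketch below follows AWS’s published algorithm in Python (the date/region/service values are just examples; the rest of the canonical-request and string-to-sign construction is not shown):

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key: str, date_stamp: str,
                      region: str, service: str) -> bytes:
    """Derive the SigV4 signing key: four chained HMAC-SHA256 steps,
    per AWS's documented algorithm."""
    def sign(key: bytes, msg: str) -> bytes:
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()
    k_date = sign(("AWS4" + secret_key).encode("utf-8"), date_stamp)  # e.g. "20160511"
    k_region = sign(k_date, region)         # e.g. "us-east-1"
    k_service = sign(k_region, service)     # e.g. "s3"
    return sign(k_service, "aws4_request")  # fixed terminator string
```

The request signature is then HMAC-SHA256(signing key, string-to-sign), hex-encoded; comparing each intermediate key against AWS’s worked examples is the quickest way to find which step is off.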

Found this as a relatively clean example.

Got something working: HTTP GET downloading from an Amazon S3 bucket (code attached).

AWSS3HttpDownloader does the brunt of the work. It’s a singleton class and uses the Http module (you’ll need to add it to your Build.cs for it to link).

  1. NOTE1! This has only been tested on one file. It probably doesn’t work for anything other than simple downloads. It is meant purely to shorten the development curve for others.
  2. NOTE2: My project is bolted on top of Unreal, normal C++ singletons work within my use case but may not for you
  3. NOTE3: Use at own risk

In the zip attached there are also functions for SHA256 and HMAC-SHA256 hashing and key generation.

Example use:

mp_HttpDownloader = new AWSS3HttpDownloader();
if (!mp_HttpDownloader->StartService())
{
	Errorf(_T("Failed to start Http Download service"));
}
else
{
	AWSS3Credentials credentials(AuthID, AuthKey);
	mp_HttpDownloader->StartDownload(credentials, "/mybucket", "/MarketPlaceData/DLCDirectory.mdc", OnDLCDescriptionDownload);
}

You put your post-file-loaded functionality (or error handling; check the response code and act appropriately) in whatever callback you define for OnDLCDescriptionDownload.


Thanks for sharing the example, trying it now.
Were you able to load material/texture files?

Yes, it works with image files. I’ve likely made a fair few code changes since posting the above, but it should get you moving. Working on from that point I have uploads, SQS communication, database communication, etc.; different services require fixes to the signing code.

Image, text and binary files are my bread and butter; I have so far failed to load wav files on the fly on non-‘desktop’ platforms.

For image files, look for references to IImageWrapper. This seems fairly robust so far, though about 5% of the images I try get rejected; I’m not sure why it’s those specific images that cause problems decompressing.

Hi theonecalledtom, great stuff. Just wanted to confirm: based on your research, are you saying that it’s possible to download a complete level (static/skeletal meshes + BP + sounds + textures) and stream/load it in at runtime?

Yes, but I’ve only done audio and textures myself, and you need to be prepared for a fair amount of research and work to get it running.

How do I get resources into a pak?

“How do I get resources into a pak?”

@theonecalledtom Could you kindly share the updated code?

@,

Pretty busy this week; might get a chance next week. Did you try the code I added here?

Since I wrote my own code, AWS has released an open-source C++ library. If I were to start again I’d take a hard look at whether it can be integrated into Unreal. The main question in my mind would be how its HTTP messaging integrates with the cross-platform nature of Unreal’s code.

https://aws.amazon.com/sdk-for-cpp/

There may already be plugins integrating this?

This looks interesting.

https://www.unrealengine.com/marketplace/amazon-gamelift-unreal-plugin

@
Missed your question above: use UnrealPak.exe. You feed it a manifest file of fully qualified file paths and it spits out a pak file.
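If it helps, that manifest (response file) is just one source-path/mount-path pair per line. A sketch of generating one, with invented example paths; check your engine version’s UnrealPak docs for the exact mount-point conventions:

```python
def make_response_file(entries):
    """Build UnrealPak response-file text: one '"source" "mount"'
    line per asset. Paths in the example below are hypothetical."""
    return "\n".join(f'"{src}" "{dst}"' for src, dst in entries) + "\n"

entries = [
    ("D:/Build/Content/Pack01/Tex01.uasset",
     "../../../MyGame/Content/Pack01/Tex01.uasset"),
]
# Then run: UnrealPak.exe Pack01.pak -Create=ResponseFile.txt
```

Write the returned text to a file and pass it to UnrealPak with -Create= to produce the pak.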