Horde Backend storage getting rate limited by AWS S3 causing exceptions

Intermittently, we are seeing issues with Horde Artifacts hitting rate limiting on the AWS S3 API at the end of a job step in Horde.

The attached stack trace occurs after uploading a large number of files to Horde storage:

Written 5947 files (49409.8mb, 2472.8mb/s)
Reading block "Compile Tools Win64 Development":"ToolsFiles_Win64_Development" from temp storage (artifact: 68f618cbcc5b4f736ce2ac3a 'compile-tools-win64-development' (step-output), ns: horde-artifacts, ref: step-output/REDACTED_dev-ue5/3934999/compile-tools-win64-development/68f618cbcc5b4f736ce2ac3a, local: T:\HordeData\StandardCi\Sync\Engine\Saved\BuildGraph\Compile Tools Win64 Development\Manifest-ToolsFiles_Win64_Development.xml, blockdir: block-ToolsFiles_Win64_Development)
Using 16 read tasks, 16 decode tasks, 16 write tasks
Written 0 files (0.0mb, 0.0mb/s)
Exception while executing step: EpicGames.Horde.Storage.StorageException:
...

Is there any configuration or setting to self-limit the upload rate so we don't hit this? Otherwise, should the storage backend be waiting and retrying?

As far as I am aware, there's no account-level request that can be made to AWS to increase S3 upload limits.
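(For anyone hitting the same thing: absent a built-in setting, one way to self-limit the upload rate is a client-side cap on concurrent uploads. The sketch below is illustrative only; `ThrottledUploader`, `maxConcurrentUploads`, and `uploadFile` are hypothetical names, not Horde APIs.)

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical client-side throttle (not a Horde setting): cap the number of
// concurrent S3 uploads so the per-prefix request rate is harder to exceed.
public sealed class ThrottledUploader
{
    private readonly SemaphoreSlim _slots;

    public ThrottledUploader(int maxConcurrentUploads)
        => _slots = new SemaphoreSlim(maxConcurrentUploads);

    public async Task UploadAllAsync(IEnumerable<string> files, Func<string, Task> uploadFile)
    {
        IEnumerable<Task> tasks = files.Select(async file =>
        {
            await _slots.WaitAsync();     // wait for a free upload slot
            try { await uploadFile(file); }
            finally { _slots.Release(); } // hand the slot to the next file
        });
        await Task.WhenAll(tasks);
    }
}
```

Note that a throttle like this only smooths the request rate; retrying throttled responses, as the fix further down in this thread adds, is still the more robust answer.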

Hi Jacob,

We too have encountered this slowdown error. We have an open task to investigate this problem, and it is a priority for us.

Thank you

Matthew

Hi Jacob,

We submitted a change to the storage backend to add retry logic for this:

Engine/Source/Programs/Shared/EpicGames.Horde/Storage/Backends/HttpStorageBackend.cs

https://github.com/EpicGames/UnrealEngine/commit/09ba5318ae716f4700b8c7d586f0215a0c6daafd
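Retrying throttled S3 responses generally follows the backoff-with-jitter pattern AWS recommends. The sketch below illustrates that pattern only; it is not the contents of the commit, and `SendWithRetryAsync` and its parameters are made-up names.

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Illustrative retry wrapper: resend a request when S3 signals throttling
// (503 SlowDown / 429), backing off exponentially with random jitter.
public static class RetryingHttp
{
    private static readonly Random Jitter = new Random();

    public static async Task<HttpResponseMessage> SendWithRetryAsync(
        HttpClient client, Func<HttpRequestMessage> makeRequest, int maxAttempts = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            HttpResponseMessage response = await client.SendAsync(makeRequest());

            bool throttled = response.StatusCode == HttpStatusCode.ServiceUnavailable
                          || (int)response.StatusCode == 429;
            if (!throttled || attempt >= maxAttempts)
            {
                return response; // success, non-throttle error, or out of attempts
            }
            response.Dispose();

            // Backoff schedule: ~1s, 2s, 4s, 8s ... plus up to 1s of jitter.
            double delaySeconds = Math.Pow(2, attempt - 1) + Jitter.NextDouble();
            await Task.Delay(TimeSpan.FromSeconds(delaySeconds));
        }
    }
}
```

A request factory is used because an HttpRequestMessage cannot be sent twice; each retry needs a fresh instance.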

This was released in 5.7.

Since then we have not seen a repeat of this issue.

Matthew
