Massive model of London - optimization advice needed

Hi All!

Firstly, I am fairly new to Unreal and have only done a few small archviz projects in it so far, but none of them required the level of optimization I am going to need for this model.
I have a 274 million triangle map of London that I want to bring into Unreal. It has already been decimated to the point where I don't want to lose any further detail.

Currently the model consists of the following:
+/- 1400 model parts of about 200K triangles each.
+/- 1400 textures which are a mixture of 2K and 4K maps.
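
Just to get a feel for the scale of the texture side, here is a rough upper-bound estimate (back-of-envelope only, assuming everything were 4K and BC1-compressed, which overstates my mixed 2K/4K set):

```cpp
// Back-of-envelope texture memory: assumes every map is 4K, BC1-compressed
// (0.5 bytes per texel), with a full mip chain (~1.33x). Upper bound only.
constexpr double BytesPerTexel = 0.5;               // BC1/DXT1
constexpr double MipOverhead   = 4.0 / 3.0;         // full mip chain adds ~33%
constexpr double MB            = 1024.0 * 1024.0;

constexpr double MBPer4KMap = 4096.0 * 4096.0 * BytesPerTexel * MipOverhead / MB; // ~10.7 MB
constexpr double TotalGB    = 1400.0 * MBPer4KMap / 1024.0;                        // ~14.6 GB if all resident
```

That is far more than the VRAM I can count on, so clearly only a fraction of it can be resident at any one time.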

The final product would likely run on a server and be accessible through Pixel Streaming. If HTML5 output is possible with this amount of data through some sort of asset streaming, I would definitely like to explore that as well. Even with Pixel Streaming, I would like the server to load a small initial version of the map fast and then stream in assets as the user navigates around the map.
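
To make the "load small, then stream as the user navigates" idea concrete, this is roughly what I had in mind (a minimal sketch only, not working project code: the ACityTileStreamer name, the tile naming, and the radius are all placeholders, and a real version would track load state instead of issuing requests every tick):

```cpp
// Minimal sketch (untested): distance-based sublevel streaming.
// Assumes the city has already been split into streaming sublevels
// and that their world-space centres are filled in on the actor.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GameFramework/Pawn.h"
#include "Kismet/GameplayStatics.h"
#include "Engine/LatentActionManager.h"
#include "CityTileStreamer.generated.h"

UCLASS()
class ACityTileStreamer : public AActor
{
    GENERATED_BODY()

public:
    ACityTileStreamer() { PrimaryActorTick.bCanEverTick = true; }

    // World-space centre of each tile, keyed by its sublevel name (e.g. "Tile_0_0").
    UPROPERTY(EditAnywhere)
    TMap<FName, FVector> TileCenters;

    // Tiles beyond this distance from the player get unloaded (50000 uu = 500 m).
    UPROPERTY(EditAnywhere)
    float StreamInRadius = 50000.f;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        const APawn* Pawn = UGameplayStatics::GetPlayerPawn(this, 0);
        if (!Pawn)
        {
            return;
        }

        const FVector PlayerLocation = Pawn->GetActorLocation();
        int32 LatentUUID = 0;

        for (const TPair<FName, FVector>& Tile : TileCenters)
        {
            // Each latent streaming request needs a unique ID; reusing the same
            // ID per tile also keeps duplicate requests from piling up each tick.
            FLatentActionInfo LatentInfo;
            LatentInfo.UUID = LatentUUID++;
            LatentInfo.CallbackTarget = this;

            if (FVector::Dist(PlayerLocation, Tile.Value) < StreamInRadius)
            {
                // Async, non-blocking load so the frame rate doesn't hitch.
                UGameplayStatics::LoadStreamLevel(
                    this, Tile.Key, /*bMakeVisibleAfterLoad*/ true,
                    /*bShouldBlockOnLoad*/ false, LatentInfo);
            }
            else
            {
                UGameplayStatics::UnloadStreamLevel(
                    this, Tile.Key, LatentInfo, /*bShouldBlockOnUnload*/ false);
            }
        }
    }
};
```

As I understand it, World Composition's distance-based streaming layers do something similar out of the box, so maybe that replaces the hand-rolled version entirely.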

What would be the best approach to handle this?
Should I use virtual texturing? (I've sketched one idea for that below the list.)
Should I use level streaming?
How would I handle asset streaming?
Would Unreal even be able to handle this amount of data in the editor?
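
On the virtual texturing question, one thing I was considering is bulk-flagging the ~1400 textures for virtual texture streaming rather than clicking through them by hand. Editor-only, and just a sketch: it assumes virtual texture support (r.VirtualTextures) is already enabled in Project Settings, and the /Game/London/Textures path is a made-up example:

```cpp
// Editor-only sketch: bulk-enable the "Virtual Texture Streaming" flag on every
// texture under a content folder, instead of ticking ~1400 checkboxes by hand.
#if WITH_EDITOR
#include "AssetRegistry/AssetRegistryModule.h"
#include "Engine/Texture2D.h"
#include "Modules/ModuleManager.h"

void EnableVTStreamingUnderPath(FName ContentPath /* e.g. "/Game/London/Textures" */)
{
    FAssetRegistryModule& AssetRegistry =
        FModuleManager::LoadModuleChecked<FAssetRegistryModule>("AssetRegistry");

    TArray<FAssetData> Assets;
    AssetRegistry.Get().GetAssetsByPath(ContentPath, Assets, /*bRecursive*/ true);

    for (const FAssetData& AssetData : Assets)
    {
        if (UTexture2D* Texture = Cast<UTexture2D>(AssetData.GetAsset()))
        {
            Texture->Modify();                        // keep undo/transactions happy
            Texture->VirtualTextureStreaming = true;  // same flag as the texture editor checkbox
            Texture->PostEditChange();                // rebuild derived texture data
            Texture->MarkPackageDirty();              // so the change gets saved
        }
    }
}
#endif // WITH_EDITOR
```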

I realize the question might be vague. I guess what I would like to know is:
if you were tasked with doing this, what are some of the basic things you would have to consider in order to have it run as efficiently as possible?

Any advice would be greatly appreciated.

I have attached some renders done in Corona Renderer.

Hey BenPret, I am also working on a project for the south-west region: Shepherd's Bush, Ladbroke Grove, Hammersmith and Fulham, up to Putney and Wandsworth. I would love it if you could send me a file that would work in Unreal Engine for my project; it would help me a lot, and maybe I could help you with your project by creating stories for you if you need them for future projects.