I can’t say for sure, but in 4.17 the compilation manager introduced compile-time regressions for some blueprints. Details are at the end of the comment, but they are pretty gory.
The good news is your compilation time should go down quite a bit in 4.18 and you can disable the compilation manager, as AceD suggested.
Here’s the story of the regression: When a blueprint (A) changes its public interface (e.g. the number of inputs to one of its functions changes) we need to recompile every blueprint that calls that function. We don’t track which blueprint calls which function, so we recompile every blueprint that has a dependency on blueprint A. In the old compilation flow this was implemented by hashing a blueprint’s public interface, and when that hash changed value we would recompile dependencies.
This hash had to be computed at the end of the compilation process and so it was difficult to port this logic to the compilation manager. The hashed value would also sometimes change every time you compiled a blueprint, meaning that for some blueprints we would always recompile dependencies.
For the large blueprints I was testing against internally the hash was not deterministic (i.e. it changed every time you clicked compile), so the compilation manager always represented a speed-up in individual compilation. I did not even know, at the time, that the hash was not deterministic; I only discovered that after the compilation manager was released and we got reports of compile-time regressions. Very unfortunate.
Anyway, as of 4.18 the compilation manager has an extra phase where it skips compilation of blueprints that are only being recompiled as a dependency, if none of the blueprints they depend on had an interface change. The new implementation does not rely on a hashed value, so it is easier to maintain and hopefully more reliable. It also avoids dependency recompilation when a new function is added (because nothing could care about the new function’s signature yet).