Are there any integration tests on the UE4 build?

I’m using the Linux version of UE4.

The last two versions, 4.13 and 4.14, had major bugs that made it hard to work on a project.

4.13 - the UMG editor crashed UE, making it impossible to build menus and the HUD interface.

4.14 - the Persona editor crashes UE, making it impossible to open a skeletal mesh, animation, or animation Blueprint.

4.14.1 - UE4 crashes while debugging some Blueprints.

All these bugs are already documented, but I have a question about how to prevent such big bugs in the future.

I’ve analyzed the source code and didn’t find any sign of integration tests.
I’ve seen the Automation System described here: Automation System in Unreal Engine | Unreal Engine 5.2 Documentation, but, as I understand it, it’s used to test the game, not the editor.
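For what it’s worth, the Automation System can be driven headlessly from the command line against the editor binary. A minimal sketch of a wrapper that builds such an invocation is below; the editor/project paths and the test filter are placeholders for illustration, not real paths:

```python
import subprocess

# Placeholder paths for illustration only -- adjust for your install.
EDITOR = "/opt/UnrealEngine/Engine/Binaries/Linux/UE4Editor"
PROJECT = "/home/me/MyProject/MyProject.uproject"

def automation_command(test_filter):
    """Build a headless editor invocation that runs automation tests.

    -ExecCmds queues the automation tests, -testexit shuts the editor
    down when the queue drains, and -nullrhi skips rendering so the run
    can happen on a build server without a GPU.
    """
    return [
        EDITOR, PROJECT,
        "-ExecCmds=Automation RunTests {}".format(test_filter),
        "-testexit=Automation Test Queue Empty",
        "-unattended", "-nopause", "-nullrhi", "-log",
    ]

def run_smoke_tests(test_filter="SmokeTests"):
    # A non-zero exit code indicates a failed or crashed run.
    return subprocess.call(automation_command(test_filter))
```

So the plumbing for unattended runs exists; what seems to be missing is a suite of editor-side tests for it to execute.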

I mean tests that could be run on the editor after it’s built, checking whether any of the previous bug scenarios now work fine.

For example, the tests could have a prepared set of Blueprints, animations, meshes, materials, and textures that covers all the basic editor functions, and the tests themselves could look like this:

  1. Run the editor

  2. Open Blueprints (specially prepared for the tests) with different nodes inside.

  3. Open animation Blueprints

  4. Open meshes and materials

  5. Run the game (specially prepared, of course) and make sure everything in it works fine.

  6. Run some network-utilizing templates to test the local server.

  7. Make some changes to assets

And so on.
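The checklist above could be encoded as data and driven by a small harness. A sketch, where the asset paths and the `open_asset` hook are hypothetical stand-ins for whatever mechanism actually opens the asset (in-process call, RPC to the editor, GUI automation):

```python
# Hypothetical smoke-test scenarios: each is an asset kind plus a
# prepared test asset that the editor must open without crashing.
SCENARIOS = [
    ("blueprint",      "/Game/Tests/BP_AllNodeTypes"),   # placeholder paths
    ("anim_blueprint", "/Game/Tests/ABP_Humanoid"),
    ("mesh",           "/Game/Tests/SK_TestMesh"),
    ("material",       "/Game/Tests/M_TestMaterial"),
]

def run_scenarios(open_asset, scenarios=SCENARIOS):
    """Run every scenario; open_asset(kind, path) returns True on success.

    Collecting results instead of stopping at the first failure gives
    a full report of which editor areas regressed in a given build.
    """
    results = {}
    for kind, path in scenarios:
        try:
            results[(kind, path)] = bool(open_asset(kind, path))
        except Exception:
            results[(kind, path)] = False  # a crash counts as a failure
    return results

def summary(results):
    failed = [key for key, ok in results.items() if not ok]
    return "{}/{} passed".format(len(results) - len(failed), len(results))
```

The point of the data-driven shape is that adding a regression scenario for a new bug (like the Persona crash) is just one more row, not a new test program.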

All these actions could be performed automatically on a dedicated server (for the Epic team), or on a personal machine for those who compile the editor themselves (with a report automatically sent to Epic).

That way, awful bugs that make it impossible to use huge parts of the editor without a crash would happen rarely.

I know that integration testing of a graphical interface as complex as UE’s is not easy. For a complex action like “open a Blueprint, switch to the Events tab, right-click, enter ‘float’ into the search bar, press Enter”, it would take a lot of work to implement in-GUI automation. But simple things, like opening Blueprints and other assets, could be implemented by an external tool that uses window manipulation, click and keystroke emulation, and “optical” recognition of results.
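The core of the “optical” recognition part of such a tool is just polling a screenshot matcher until a known reference image appears. A sketch, with the locator injected as a callable so the same loop works with `pyautogui.locateOnScreen` or any other matcher (all names here are illustrative, and the clock/sleep parameters exist so the loop can be tested without real waiting):

```python
import time

def wait_for_image(locate, image_path, timeout=30.0, interval=0.5,
                   clock=time.monotonic, sleep=time.sleep):
    """Poll `locate(image_path)` until it returns a match or time runs out.

    `locate` is any callable that returns a truthy screen region when the
    reference screenshot is visible (e.g. pyautogui.locateOnScreen) and
    None otherwise. Returns the region, or None on timeout -- a timeout
    would mean the editor failed to reach the expected state.
    """
    deadline = clock() + timeout
    while clock() < deadline:
        region = locate(image_path)
        if region:
            return region
        sleep(interval)
    return None
```

An external test could then emulate a double-click on a test asset in the Content Browser and wait for a reference screenshot of the opened Blueprint editor before declaring the step passed.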

I just want to know Epic’s official point of view on this question, and the community’s thoughts about it.

Maybe I’m wrong and something like this already exists?

I could start writing a visual testing framework myself, but I need to know what you think.