Firstly, here is the setup you should use in your case: every individual map is plugged into its corresponding input on a Principled BSDF. Make sure your texture sampling settings match the screenshot for each map; otherwise the data gets skewed, and the error compounds the more steps you have.
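To see why those sampling settings matter: if a data map (roughness, metallic, etc.) is accidentally tagged as sRGB instead of Non-Color, the sRGB-to-linear transfer curve is applied to it and the values you authored no longer come through. A minimal plain-Python sketch of the standard sRGB decode (the 0.5 value is just an illustrative roughness texel, not from your file):

```python
def srgb_to_linear(c: float) -> float:
    """Standard sRGB transfer function: what happens to a texel when a
    data map is mistakenly sampled as an sRGB colour image."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# A mid-grey roughness value of 0.5 authored in the texture...
authored = 0.5
# ...comes out much darker if the image is wrongly decoded as sRGB:
print(f"{authored} -> {srgb_to_linear(authored):.3f}")  # -> roughly 0.214
```

That skew is why a roughness map sampled with the wrong colour space makes the whole material look shinier than intended, and why it gets worse when bakes are chained.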

Secondly, when baking you select the specific pass you wish to bake. If that pass isn't available, make a shader where you plug the value into an Emission node and bake the Emit pass instead. It's still not the recommended option, but it will work perfectly fine in your case for colour and data maps (albedo, roughness, metallic, etc., but NOT for normals).
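The Emit-node workaround can also be done from Blender's Python console. This is only a sketch with hypothetical names (`bake_via_emit`, the `"Roughness"` default); it assumes a material whose Roughness input already has a texture linked, and it only does anything inside Blender, where `bpy` exists:

```python
# Sketch of the Emit-bake workaround; run inside Blender.
# Outside Blender the bpy import is skipped so the sketch still parses.
try:
    import bpy
except ImportError:
    bpy = None

def bake_via_emit(mat, source_socket_name="Roughness"):
    """Temporarily wire whatever feeds the chosen Principled input into
    an Emission node on the Material Output, then bake the Emit pass."""
    nt = mat.node_tree
    principled = next(n for n in nt.nodes if n.type == 'BSDF_PRINCIPLED')
    output = next(n for n in nt.nodes if n.type == 'OUTPUT_MATERIAL')

    # The link currently feeding e.g. the Roughness input
    link = principled.inputs[source_socket_name].links[0]

    emit = nt.nodes.new("ShaderNodeEmission")
    nt.links.new(link.from_socket, emit.inputs["Color"])
    nt.links.new(emit.outputs["Emission"], output.inputs["Surface"])

    bpy.ops.object.bake(type='EMIT')
```

Remember to restore the original Material Output link afterwards; this sketch doesn't undo its changes.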

When baking normals for use in Unreal Engine, make sure the Space and Swizzle are set as in this screenshot. (This is not for baking normals to be used inside Blender, which expects a different swizzle. [Here's a simple article on that.][4] Scroll down to the bottom for the coordinate conventions of various other programs you might be using.) Different programs read normal maps differently because the data is stored differently: even though most normal maps look blue, they generally aren't cross-compatible.
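For context on what the swizzle usually changes: the common difference between conventions is the sign of the Y (green) channel (OpenGL-style +Y, which Blender uses, vs DirectX-style -Y). Converting an already-baked map between the two is just mirroring the green channel. A toy sketch on 8-bit texels, assuming the usual RGB = XYZ layout (your target engine's documentation is authoritative):

```python
def flip_green(rgb: tuple) -> tuple:
    """Convert a tangent-space normal texel between +Y and -Y
    conventions by mirroring the 8-bit green channel."""
    r, g, b = rgb
    return (r, 255 - g, b)

# A "straight up" flat normal is effectively unchanged (off by one
# because 8-bit has no exact midpoint):
print(flip_green((128, 128, 255)))  # -> (128, 127, 255)
```

Flipping twice returns the original texel, which is a quick sanity check that a conversion pass is sign-symmetric.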
You can also see the Margin setting here; turn it up enough that it fills all the empty space around your UV islands. It won't stretch the borders over the UV space you actually use, but it pads the empty space so that when the texture is sampled (filtered, mipmapped, etc.) there is valid information just outside the visible parts of the texture on a mesh.
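Under the hood the margin is essentially a dilation pass: empty texels next to a UV island get copies of the nearest island pixels, so bilinear filtering at the island border never bleeds in the background colour. A toy 1-D sketch of the idea (real bakers dilate in 2-D, one ring of pixels per margin step; `None` stands in for an empty texel):

```python
def dilate(row, steps=1):
    """One dilation step per iteration: each empty (None) texel copies
    a filled neighbour, padding the island outward."""
    row = list(row)
    for _ in range(steps):
        out = list(row)
        for i, v in enumerate(row):
            if v is not None:
                continue
            # copy from a filled neighbour if one exists
            if i > 0 and row[i - 1] is not None:
                out[i] = row[i - 1]
            elif i + 1 < len(row) and row[i + 1] is not None:
                out[i] = row[i + 1]
        row = out
    return row

# The "island" occupies the middle; a margin of 2 pads it outward:
print(dilate([None, None, 7, 8, None, None], steps=2))
# -> [7, 7, 7, 8, 8, 8]
```

This is also why the margin only needs to be "big enough": once every empty texel near an island border is filled, extra margin changes nothing visible.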

Finally, before saving an image, make sure the scene's colour management settings look like this.
Let me know if you still have issues :-)
