If the input on my monitor is HDMI AND there is NO other cable plugged into my monitor, the problem happens.
This almost certainly has to do with the ports on the card and which ones are in use (or were). From what I can gather, graphics cards with multiple ports, especially a mix of different port types, can only drive certain combinations of displays at once. A good example is my own setup: my card has two DVI ports and one HDMI port. I run two DVI-to-HDMI cables (DVI on the card end, HDMI going into the monitors), and the HDMI port runs HDMI-to-HDMI to a 42in TV. I can NOT run all three at once. Maybe there is some configuration that would allow it, but in practice I turn off my second monitor and enable the TV as my second display when needed. I don't often need the TV, so this works fine, and I never need both monitors and the TV at the same time.
I think if you look up the port configurations for your card, the documentation will tell you what happens depending on what is plugged in. That information could really help in your case, since it would tell you which configuration will let everything run the way you need it to.
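As a side note, if you happen to be on Linux, `xrandr --query` will list every output the card exposes and whether each one is connected and actively driving a display, which is a quick way to check what the card is actually doing with your cable setup. Here's a small sketch that summarizes that kind of listing; the sample text below is made-up example output for illustration, not from any real machine.

```python
# Sketch: summarize xrandr-style output to see which ports are connected
# and which are actively driving a display. SAMPLE is made-up example
# output, not captured from a real card.

SAMPLE = """\
HDMI-1 connected 1920x1080+0+0 (normal left inverted right) 509mm x 286mm
DVI-D-1 connected 1920x1080+1920+0 (normal left inverted right) 509mm x 286mm
DVI-D-2 disconnected (normal left inverted right)
"""

def summarize(xrandr_output: str) -> dict:
    """Map each output name to 'active', 'connected', or 'disconnected'."""
    status = {}
    for line in xrandr_output.splitlines():
        parts = line.split()
        if len(parts) < 2 or parts[1] not in ("connected", "disconnected"):
            continue  # skip mode lines and any header lines
        name, state = parts[0], parts[1]
        # An output that is actually in use has a geometry like
        # 1920x1080+0+0 right after "connected"
        active = state == "connected" and len(parts) > 2 and "+" in parts[2]
        status[name] = "active" if active else state
    return status

print(summarize(SAMPLE))
# {'HDMI-1': 'active', 'DVI-D-1': 'active', 'DVI-D-2': 'disconnected'}
```

A port showing "connected" but with no geometry is plugged in but not being driven, which is exactly the kind of thing to look for when the card silently drops one of your displays.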