Very strange and spontaneous drop in performance in fullscreen in a Standalone game

This performance drop occurs ONLY when I run my project as a Standalone game (or packaged game) and set my resolution to 1920x1080 (my monitor’s default resolution) in fullscreen. Stat unit and stat fps report 22-23 fps. One might say I have a bad GPU, but my GPU is a GTX 1070 OC, and reducing the graphics quality to LOW doesn’t help at all. When I disable VSync my fps goes over 200, but the lag remains. I think my monitor might have something to do with it, because this, along with other problems (like issues with my monitor’s aspect ratio), didn’t occur when I was using a DVI cable; now I’m using an HDMI cable and a different monitor input.
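
For reference, I’m checking the timings and toggling VSync with the stock console commands, roughly these (listing them in case it matters for reproducing this):

    stat fps        (on-screen frame rate)
    stat unit       (Frame / Game / Draw / GPU times; shows which thread is the bottleneck)
    r.VSync 0       (turn VSync off; 1 turns it back on)
    t.MaxFPS 0      (uncap the frame rate while testing)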

An interesting thing to point out: if I toggle the game to fullscreen with Alt + Enter AND it is a Standalone game, NOT a packaged game (with the packaged game there is no way to get rid of the lag), the lag disappears. What!?

I have absolutely no idea what might be causing this problem, but I am near shipping and have to take care of it fast!

What you need is good debugging. What exactly are the conditions that reproduce the problem? Does the lag also occur at other resolutions? What about in windowed mode?
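
For quick testing you can switch resolution and window mode on the fly from the console with r.SetRes (w = windowed, f = fullscreen), for example:

    r.SetRes 1280x720w      (windowed, lower resolution)
    r.SetRes 1920x1080w     (windowed, same resolution)
    r.SetRes 1920x1080f     (fullscreen, your problem case)

For the packaged build you can also force a mode from the command line, e.g. YourGame.exe -windowed -ResX=1280 -ResY=720 (YourGame.exe standing in for whatever your executable is called).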

Serious lag often comes from “On Tick”, since that runs EVERY frame, so in most cases you have something consuming too much CPU every tick.
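
If it does turn out to be Tick-bound, the usual fixes are to stop ticking, tick less often, or move the heavy work onto a timer. A minimal C++ sketch of those options (the actor name is made up, not from your project; the same settings exist in Blueprints on the class defaults):

    // MyHeavyActor.h -- hypothetical actor, just to illustrate the usual Tick fixes
    #pragma once
    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MyHeavyActor.generated.h"

    UCLASS()
    class AMyHeavyActor : public AActor
    {
        GENERATED_BODY()
    public:
        AMyHeavyActor();
        virtual void BeginPlay() override;
    private:
        void DoExpensiveUpdate();
        FTimerHandle UpdateTimerHandle;
    };

    // MyHeavyActor.cpp
    #include "MyHeavyActor.h"

    AMyHeavyActor::AMyHeavyActor()
    {
        // If the actor doesn't need per-frame logic, don't let it tick at all.
        PrimaryActorTick.bCanEverTick = false;

        // Or keep ticking, but less often than every frame (here: 10 times per second).
        // PrimaryActorTick.bCanEverTick = true;
        // PrimaryActorTick.TickInterval = 0.1f;
    }

    void AMyHeavyActor::BeginPlay()
    {
        Super::BeginPlay();
        // Heavy but infrequent work is better driven by a timer than by Tick.
        GetWorldTimerManager().SetTimer(UpdateTimerHandle, this,
            &AMyHeavyActor::DoExpensiveUpdate, /*InRate=*/0.5f, /*bInLoop=*/true);
    }

    void AMyHeavyActor::DoExpensiveUpdate()
    {
        // ...the work that used to run every frame goes here...
    }

Running “stat game” alongside “stat unit” will tell you whether the game thread is actually the one eating the frame time.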

If it only occurs under very specific conditions like “I set the resolution to 1920x1080 in the game menu”, then check that game menu; the problem will likely be there.
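
If the menu goes through UGameUserSettings (the usual route), the apply path looks roughly like the sketch below; worth checking that it isn’t being called every frame or handed a mode the display can’t actually do. The function name here is made up, but the UGameUserSettings calls are the standard ones:

    #include "Engine/Engine.h"
    #include "GameFramework/GameUserSettings.h"

    // Hypothetical menu handler for the 1920x1080 fullscreen option.
    void ApplyMenuResolution()
    {
        UGameUserSettings* Settings = GEngine->GetGameUserSettings();
        Settings->SetScreenResolution(FIntPoint(1920, 1080));
        Settings->SetFullscreenMode(EWindowMode::Fullscreen);
        Settings->ApplySettings(/*bCheckForCommandLineOverrides=*/false);
    }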

Also try to remember when the problem first appeared and what you changed since then.

Yes, the problem occurs only at that resolution and in fullscreen, and it persists across level transitions. After tinkering I figured out that:
- If the input on my monitor is DVI, I have NO problems AT ALL, even when I don’t have other cables plugged in (like an HDMI cable)
- If the input on my monitor is HDMI AND I have another cable plugged into my monitor (works with both HDMI and DVI cables), I have NO problems AT ALL
- If the input on my monitor is HDMI AND there is NO other cable plugged into my monitor, the problem happens

OK… I think I just fixed my problem… I changed my desktop resolution to something different, then set it back to 1920x1080, and the problem disappeared!

“If the input on my monitor is HDMI AND there is NO other cable plugged into my monitor, the problem happens.”

This almost certainly has to do with the ports on the card and which ones are in use (or were). From what I can gather, graphics cards with multiple ports, especially a mix of different port types for different displays, can only do certain things in certain configurations.

A good example is my own setup: the card has two DVI ports, and each monitor is connected by a DVI-to-HDMI cable (DVI on the card end, HDMI going into the monitor). The card also has a third port, an HDMI out, which runs HDMI-to-HDMI to a 42 in TV. I can NOT run all three at once. Maybe I could with exactly the right configuration, but in practice I just turn off my second monitor and enable the TV as my second display when needed. I don’t often need the TV, so this works fine, and I never need both monitors and the TV at the same time.

I think if you look up the port configurations for your card, it will tell you what happens depending on what is plugged in. That information could really help in your case, since it could tell you which configuration will let everything run smoothly when you need it to.