Quote:
Originally Posted by White Out 403
I think I'm prob misunderstanding you which is likely my fault as I've been out of gaming for a while
Ok, looking back at my posts I might have been assuming you had some information that I didn't provide.
There are a number of things you "want" out of a video game when it comes to graphics, but in the most basic sense, with all settings on "ultra", you want two things: a high frame rate (measured in frames per second, or "FPS") and a high resolution (1080p, 1440p, 4k). Unfortunately, those two are in tension: the higher you set the resolution, the lower your frame rate, because it takes a more powerful video card with more memory to render all those extra pixels every frame (as well as the rest of the hardware to keep the video card fed).
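To put rough numbers on that tension, here's a quick back-of-the-envelope sketch (Python, purely illustrative - the pixel counts are the standard ones, 60fps is just an example target):

```python
# Back-of-the-envelope: the GPU has to fill every pixel of every frame,
# so the workload scales with resolution x frame rate.
resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4k": 3840 * 2160}

for name, pixels in resolutions.items():
    megapixels_per_sec = pixels * 60 / 1_000_000
    print(f"{name}: ~{megapixels_per_sec:.0f} million pixels/second at 60fps")
```

4k is roughly four times the pixels of 1080p, so at the same settings the card is doing roughly four times the work per second - that's why the frame rate drops when you pump the resolution up.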
When you're looking at a monitor, again, you're looking at two primary factors: first, resolution, and second, refresh rate, which relates to frame rate. The refresh rate of the monitor (60hz, 120hz, 144hz, etc.) is the highest frame rate that the monitor is able to display. Crucially, this does not mean that the monitor is always displaying content at the refresh rate it is set to. You still need hardware - primarily the video card - capable of running a game at a high enough frame rate to take advantage of the monitor's refresh rate.
In other words, if you have otherwise-identical 60hz and 144hz monitors side by side, and the hardware you're using is only able to run the game you're playing at 50fps, what you see on the screen will be exactly the same on both monitors. However, if the hardware is able to run the game at 130fps, the 60hz monitor will only display it at 60fps, while the 144hz monitor will display it at 130fps.
In your case, you have a monitor that can be set to a refresh rate of 144hz or 165hz. But can your hardware run your games at 120fps, 140fps or 160fps? If it's 120fps or 140fps, it does not matter whether your monitor is overclocked or not - you will see exactly the same thing on the screen, because both refresh rates are already above your frame rate. Only if your hardware can push past 144fps (say, 160fps) would you see a difference from the overclock.
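If it helps, that whole rule boils down to "you see the lower of the two numbers". A throwaway Python sketch (this deliberately ignores G-Sync/FreeSync, tearing and frame pacing, which complicate the real picture):

```python
# What you actually see is capped by whichever is lower: the frame rate
# your hardware can produce, or the monitor's refresh rate.
def displayed_fps(hardware_fps: float, refresh_hz: float) -> float:
    return min(hardware_fps, refresh_hz)

for fps in (50, 130, 160):
    for hz in (60, 144, 165):
        print(f"{fps}fps hardware on a {hz}hz monitor -> ~{displayed_fps(fps, hz):.0f}fps on screen")
```

Notice that 160fps hardware is the only row where the 144hz and 165hz columns differ - which is exactly the overclock question.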
There is software that lets you see your current FPS displayed live on the screen. Steam has a built-in FPS counter, but most people use MSI Afterburner, because it works really well and gives you a lot more options for what to display (I typically show some combination of CPU temp, GPU temp, FPS, average FPS and 1% low). Once I can see the FPS, I can adjust the game's graphics settings to get the frame rate as close to my monitor's refresh rate as possible if that's the priority, or at least make sure it never drops below what I consider playable for that game (usually 60fps).
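For what it's worth, here's a rough sketch of the kind of math an overlay does behind the scenes (my own illustration, not Afterburner's actual code - and note "1% low" has a few competing definitions; this one uses "average FPS of the slowest 1% of frames"):

```python
# Toy FPS stats from per-frame render times, in milliseconds.
def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    slowest = sorted(frame_times_ms, reverse=True)      # slowest frames first
    one_pct = slowest[: max(1, len(slowest) // 100)]    # the worst 1% of frames
    low_1pct_fps = 1000 / (sum(one_pct) / len(one_pct))
    return avg_fps, low_1pct_fps

# 98 smooth ~7ms frames (~144fps) plus two 40ms stutters
times = [7.0] * 98 + [40.0, 40.0]
avg, low = fps_stats(times)
print(f"avg: {avg:.0f}fps, 1% low: {low:.0f}fps")  # avg looks fine, 1% low exposes the stutter
```

That's the point of watching the 1% low as well as the average: an average of 130fps can still feel choppy if the worst frames are 40ms hitches.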
That's what I meant by "trial and error" with your game settings to get the most out of your screen. You paid for 144hz, might as well try to get the benefit of it, is my view.