Quote:
Originally Posted by btimbit
Looks like I did gain 20fps with DLSS, but everything is just a bit blurry. I can gain more with just leaving DLSS off and reducing minor graphics settings, and still look better
I'm guessing it's just like photon said and I'm simply not benefiting from it because my PC is powerful enough anyway? Later tonight or tomorrow I'll fiddle with it in VR since that's more taxing on the system
The main reason I ask is whenever I mention doing a new build next year and going from my 3080 to maybe a 7900xtx people point out that I'll miss out on DLSS, but I'm trying to figure out if that's really something that bugs me or not
It's not so much about image quality; it's all about smoothness.
Just catching up to this. DLSS will never look as good as running at full resolution. It won't make your games look prettier (though subjectively it can replace antialiasing). It is, however, a fantastic way to squeeze out extra performance and still have great graphics, with a small tradeoff.
Back in the day, if you were playing Crysis at 1080p and didn't have enough FPS, you had to drop to a lower resolution like 720p. That resolution obviously looked uglier, caused more aliasing, and looked like crap stretched to your monitor's native resolution.
The strategy now is to keep the game outputting at native resolution while the engine renders internally at a lower, dynamic resolution that is adjusted on the fly to hit your framerate target.
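To make that concrete, here's a minimal sketch of the kind of feedback loop a dynamic-resolution controller runs. The function name, thresholds, and step size are all made up for illustration; real engines are fancier, but the idea is the same: measure GPU frame time, then nudge the internal render scale up or down to stay inside the frame budget.

```python
# Hypothetical sketch of a dynamic-resolution controller; not any
# specific engine's API. Measure how long the GPU took on the last
# frame, then scale the internal render resolution to stay on budget.

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS   # ~16.7 ms per frame at 60 FPS

def update_render_scale(scale, last_gpu_ms,
                        min_scale=0.5, max_scale=1.0, step=0.05):
    """Nudge the internal resolution scale toward the frame budget."""
    if last_gpu_ms > FRAME_BUDGET_MS:          # too slow: render fewer pixels
        scale -= step
    elif last_gpu_ms < FRAME_BUDGET_MS * 0.8:  # plenty of headroom: sharpen up
        scale += step
    return max(min_scale, min(max_scale, scale))

# e.g. on a 4K output (3840x2160), scale 0.67 renders ~2560x1440 internally
scale = 1.0
for gpu_ms in [20.0, 19.0, 18.0, 15.0, 12.0]:  # simulated GPU frame times
    scale = update_render_scale(scale, gpu_ms)
    print(f"gpu={gpu_ms:.1f} ms -> render scale {scale:.2f}")
```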
You might be running a 4K monitor without the GPU to power it. That's fine: the game can render internally at 1080p or 1440p, varying dynamically with the situation, to hit a certain framerate target. NVIDIA DLSS and AMD FSR then do upscaling like some of those old DVD players did (but with AI): they take that lower-resolution scene and upscale it to 4K. It's less compute-intensive on the GPU and provides similar image quality, depending on which quality setting you use.
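Some rough pixel math (my own arithmetic, not any vendor's numbers) shows why this helps so much:

```python
# Pixel counts at common render resolutions relative to native 4K.
# Rendering at 1440p and upscaling shades well under half the pixels.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
native_px = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels = {px / native_px:.0%} of native 4K")
```

The upscaler itself isn't free, but its cost is much lower than shading all those extra pixels natively.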
These days, everybody seems to mainly care about "smoothness": running their games at native resolution and the maximum refresh rate. With gaming now centered on 100Hz and 144Hz+ monitors, most people want their games hitting those framerates for shooters and competitive play. They want to avoid frame drops, they want VSync, FreeSync, and G-Sync to take care of all this, and they want a locked-in frame rate. That's another big thing DLSS/FSR is here to solve. The last piece is the trend toward VRR (Variable Refresh Rate), which more monitors and TVs now support. Instead of being locked at 60Hz, 100Hz, etc., they vary their refresh rate so it's tied to the framerate, avoiding tearing and choppiness.
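For a sense of how tight those targets are, here's the frame-time arithmetic (just math, nothing vendor-specific): on a fixed-refresh display, every single frame has to finish inside the budget or you get a drop or a tear.

```python
# Frame-time budget at common refresh rates: to hold a locked frame
# rate, every frame must render inside this many milliseconds.
for hz in (60, 100, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
```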
The final piece is DLSS-G, which is another evolution in using AI to smooth out gameplay. It's like the soap opera effect on TVs, but for games. The G is for frame Generation: AI creates the missing in-between frames to artificially increase your frame rate. This is less costly than having the GPU actually render a whole frame. The AI just looks at the surrounding frames, guesses what the in-between would look like, and draws it.
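The real thing uses motion vectors, optical flow, and a neural network, but the crudest possible stand-in (a plain blend of two frames, made up purely to illustrate the concept) shows the basic idea of synthesizing a frame instead of rendering it:

```python
import numpy as np

# Crude stand-in for frame generation: a 50/50 blend between two
# rendered frames. Real DLSS-G uses motion vectors, optical flow, and
# a neural network; this only illustrates synthesizing an in-between
# frame instead of rendering one.
def fake_inbetween(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blended.astype(np.uint8)

# Two tiny 2x2 "frames": inserting one generated frame between each
# real pair turns 60 rendered FPS into ~120 displayed FPS.
a = np.zeros((2, 2, 3), dtype=np.uint8)      # dark frame
b = np.full((2, 2, 3), 200, dtype=np.uint8)  # bright frame
print(fake_inbetween(a, b))                  # values around 100
```

A naive blend like this smears anything in motion, which is exactly why the real version needs motion vectors and AI to guess where objects are headed.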
In the end, all of these are great for lower-powered gaming devices (all the consoles will be trending this way), handhelds, and most consumer PCs. Nobody is going to be running Cyberpunk at 4K Ultra settings with maximum Ray Tracing Overdrive at 144 FPS. There is no consumer PC under $10,000 that can run that without DLSS helping.
So far, DLSS and DLSS-G are superior to AMD's equivalents, and I don't think I would go AMD at this juncture. That said, if you care about image quality, the better investment you could make is an OLED monitor with a high refresh rate.