I assume it's being positioned like the 3090 as a Workstation option, but I can't imagine most 3D modelers or video editors want to install a 1300w PSU, heavy cooling and have a dedicated circuit for their PC and peripherals.
3D modelers, CAD designers, animators, etc. Even things like flight simulator games blow through 24 GB without full-out settings.
You have to look at a 3090 or 4090 as the professional's card that can game. The Titan was never a gaming card until they added RTX at the end of the line with Pascal. The 90-class cards are Titan-class cards: work first, gaming second.
I know people working at animation and gaming studios that could still blow through 48 GB for work.
There was a reason the 3090 still supported SLI. It just shows professionals still needed it. This also seems like it would officially be the end of SLI.
Last edited by combustiblefuel; 07-27-2022 at 03:18 PM.
With the performance gains that are being rumoured, it feels like those with 3080/3090 (inc. TI variants) could theoretically go with a 4060 or 4070 and still see massive gains. Would that be a fair assessment based on what the rumours are showing?
What rumors are those? I mean, there was similar buzz about the 3000 series over the 2000 series, but the 3060 isn't even close to the 2080 Ti. I would think it more likely that the 3080 and 4070 will be close in performance.
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
I mean, most of the really serious ones just remote into their workstation at the office/datacentre via Parsec or something, where they have a huge server full of Quadros. It seems exceptionally niche to target home workstations at 800W TDP.
Disregard...I was reading the news articles incorrectly. My bad.
Considering a lot of TVs use BGR subpixel layouts, text will look like poo when using a TV as a computer monitor.
OLED monitors seem to have the same issue too. Since I use my main computer monitor for work as well as for play, sub-par text rendering would be a deal breaker.
I was thinking of maybe having a dedicated gaming monitor and a dedicated work monitor, but that seems annoying and expensive.
The new PG42UQ though claims to have a better sub-pixel layout for text, might be worth checking out.
Though is an OLED monitor ever appropriate for work? Is a static work image going to be a burn in problem even with the newer tech?
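The fringing complaint above can be shown with a toy model (purely illustrative; the coverage values are made up): a subpixel-antialiased rasterizer that assumes an RGB stripe writes its coverage in red-green-blue order, so on a BGR panel the physical fringe comes out mirrored.

```python
# Toy model of subpixel antialiasing landing on the wrong panel layout.
# A glyph edge fading out left-to-right, rendered for an RGB stripe:
# red gets the most coverage, blue the least (illustrative values).
rgb_edge = [0.9, 0.5, 0.1]  # framebuffer order: R, G, B

# A BGR panel's subpixels sit in the opposite physical order, so the
# same framebuffer values light the stripes in reverse:
physical_on_bgr = list(reversed(rgb_edge))

print(physical_on_bgr)  # [0.1, 0.5, 0.9] -> the colour fringe is mirrored
```

That mirrored fringe is why text looks off unless the renderer knows the panel's actual layout.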
__________________ Uncertainty is an uncomfortable position.
But certainty is an absurd one.
New credible leaked GPU specs show the next-gen cards, specifically the 4080, are nowhere near as powerful as once claimed.
The 4080 is now expected to be basically a 3080 Ti.
The 4090 is getting the biggest uplift, with around 6,000 more CUDA cores expected.
Everything else seems to be relatively close to the 30 series but is getting VRAM capacity upgrades.
Even less shocking: I officially ended the Linux experiment today and installed Windows. Yesterday I re-organized my office setup to be easier to plug and play for my work laptop and everything was working great. This morning I booted up the Linux rig and it had somehow, and for no reason, decided overnight that I did not have a 3080... despite the fact that it was outputting video from the 3080. It was basically like it had removed all the drivers. I did nothing whatsoever to provoke this result. That was the last straw. Now everything works predictably, and it also took me about 20 seconds to undervolt the GPU and save myself about 10 degrees of heat.
With reasonable fans I am now in the 60s in games on both GPU and CPU, which for the form factor of my build is perfectly fine. I'm going to see if I can manage 4K60 in RDRII once it finishes installing tonight.
EDIT: Yep, easy 4K60 in RDRII on the 4K monitor. On the 165hz 1440p monitor I can pump it up to high / ultra in basically every setting, put on DLSS Quality and still get 111 fps average in the benchmark, with none of the scenes dropping below 80 (action test was around 100-105). All of that without either GPU or CPU temps going much over 60. It's just so much easier to tweak things to optimize the balance between temps and performance using tools that are available in Windows.
On something a bit less demanding, ME Legendary was doing 130ish with everything at max at 1440p, which was actually a bit less than I was expecting but still more than fine. The GPU wasn't taxed or anything so that just seems to be where it wants to sit for some reason.
The most impressive thing though is that I can now get Control to play at a locked 4K60 with ray tracing on maximum and all settings maxed if I set DLSS to render at 1440p. That gets the GPU up to 75c, but the fans aren't maxed... and it looks absolutely incredible.
Last edited by CorsiHockeyLeague; 08-28-2022 at 06:41 PM.
Ryzen 7000 series release update, with the 7600 starting at US$299 and the top-of-range 7950 at US$699.
All of them 5 GHz+.
AM5 support through 2025+.
Releases September 27, 2022.
High-end X670 chipsets release first; B650 follows in October.
New EXPO-certified DDR5 memory branding/overclocking/optimization that is basically their version of Intel XMP.
Lots of support, apparently 15 kits at launch that will support it.
I just don't buy any of it. Gaming performance is ALREADY not CPU bound, such that there is little reason to spend a bunch of extra money on a 12900k, for example, as compared to a 12400. You're barely seeing any performance difference, other than more power draw and more heat. So to the extent this new generation makes improvements in efficiency and thermal performance, that's fantastic... but suggesting that the actual performance will matter to anyone who's using their PC primarily for gaming? I am extremely skeptical.
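The GPU-bound point can be sketched with a toy frame-time model (a deliberate simplification: it ignores CPU/GPU pipelining and assumes the frame waits on whichever side is slower). Once the GPU is the bottleneck, a faster CPU changes nothing.

```python
# Toy model: frame time is whichever of CPU or GPU work takes longer
# (simplification: ignores CPU/GPU overlap and driver overhead).
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound at high settings: the GPU needs 16 ms per frame.
print(fps(cpu_ms=8.0, gpu_ms=16.0))  # 62.5 fps
print(fps(cpu_ms=4.0, gpu_ms=16.0))  # still 62.5 fps -- the faster CPU is wasted
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # 125.0 fps -- only here does the CPU matter
```

Which is the gist of the 12400-vs-12900K argument: at GPU-limited settings, the extra CPU spend buys heat, not frames.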
New G502 mice from Logitech; this time they include "Lightforce" hybrid optical-mechanical switches. I've always liked Logitech mice, but I moved away from them after having a few where the switches went bad, and their warranty claim process got a lot harder to navigate.
I mean, when there are guides on how to order Japanese switches and how to replace the faulty lower-quality ones in your mouse...
My Razer has lasted longer than my last Logitech so far, but will keep an eye on these to see if the switches last longer.