NVIDIA 40-Series announced (AMD 7xxx in Nov)

Yes, yes. Figured I'd start a new thread for this since it's finally that time again.
Last time we had an update was... wow, two years ago, but now that the Ethereum monkey is off our backs (at last!), it's time for Team Green (and Team Red (and Team Blue)) to finally exit their vaults for all the world to see. So what's on the table?

Well, a lot, actually.
According to NVIDIA's (no doubt hand-picked) benchmarks, the 4090 and 4080 are around 100% faster than the equivalent 30-series cards they replace, and all that extra speed comes at "only" a ~50% price increase. In "RTX [ON]" games, they even bench as high as 4x the speed of the 30-series. Plus of course the newest revision of DLSS allows 4K gaming at frame rates usually associated with 1440p etc etc. Oh and also they will have dedicated AV1 encoders onboard (some? all? of the cards)...which takes away the only real advantage Intel had going with their cards.

Also they are HUGE and can pull up to 450W from your PSU, which means 1000W PSUs are probably going to be the new minimum.
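Back-of-the-envelope on why a 450W card pushes you toward a 1000W-class PSU (to be clear, the spike factor and all the component numbers below are my own assumptions for illustration, not anything NVIDIA has published):

```python
# Rough PSU-sizing sketch. All numbers are illustrative assumptions --
# check your actual parts' specs before buying anything.
def recommended_psu_watts(gpu_tdp, cpu_tdp, other=100,
                          spike_factor=1.5, headroom=0.2):
    """Estimate a sensible PSU rating in watts.

    Modern GPUs can briefly spike well above their rated board power
    (modeled here with spike_factor), so the PSU needs to cover either
    the worst transient or sustained draw plus headroom, whichever
    is larger.
    """
    steady = gpu_tdp + cpu_tdp + other                     # sustained draw
    transient = gpu_tdp * spike_factor + cpu_tdp + other   # brief worst case
    return max(steady * (1 + headroom), transient)

# 4090-class card (450 W) plus a hungry ~250 W CPU:
print(recommended_psu_watts(450, 250))  # → 1025.0
```

So even with a fairly tame 1.5x spike assumption, a 450W card next to a high-end CPU lands you right around the 1000W mark.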

Of course now that mining is over, there are used 30-series cards galore out there, and I've even seen NIB 3090 Ti's going for under $1100. Benchmarks and suchlike of the 40-series won't (officially) be coming out until mid-October, so if you're uncertain about the 2022 offerings and are looking to take advantage of current-gen cards, your best deals are as follows:
Gaming: NVIDIA - RTX 3080 (non-Ti), AMD - RX 6800 XT, Intel - A770
Productivity: NVIDIA - RTX 3090 (non-Ti), AMD - RX 6900 XT, Intel - A770

AMD is holding off on announcing their RX 7000-series until the first week of November, so I guess we'll have to wait until we're drowning in Mariah Carey to get the details on their newest cards. And Intel strongly insists their GPU division isn't dead yet, so I suppose we might get to see what their Battlemage cards can do sometime in mid-...June, maybe?

Needless to say, I don't really recommend looking at the Intel cards right now except for special, specific use cases (or as a curiosity), but that's mainly because they're so new and untested...and also because their DX11 (and by extension their DX10 and DX9) gaming performance is so bad.

I got a 3080 for retail, finally, knowing full well these were coming soonish. I have zero interest in the 4xxx line and no regrets about the 3080. The 1070 I'd had was fine for years; unless this goes tits up, I'll keep the 3080 for many generations.
Yeah I’m still rocking my RX580 and will probably upgrade to a 6800XT* but am waiting to see what happens in November. Why not a 3080? Because while NVIDIA gives better FPS w/ DX12, AMD is better at DX11 and older, and the majority of my games aren’t DX12.



Still not really seeing why I need to upgrade my 1060. At least unless I start really getting back into VR, which I don't see happening soon.
My GTX 760 is getting slightly long in the tooth, but...I still haven't found a game I wanted to play that I couldn't. And most importantly...buying a new GFX card would at this point also mean a new motherboard, new processor, and new memory. And I just can't really justify the expense to myself right now. Maybe after the wedding? But then it's the honeymoon, then the renovations, which are getting urgent....
Guess I'll put the graphics cards on my wishlist :-D
Still not really seeing why I need to upgrade my 1060. At least unless I start really getting back into VR, which I don't see happening soon.
I've got the 1070, which I think benchmarks similarly to the 1060 (at least it isn't worlds different), and I haven't felt the urge or need to upgrade. I mean, if prices weren't crazy I probably would, but the machine I built years ago is still chugging along, running everything just fine. It's not like the old days, where every few months I would need to upgrade to remove bottlenecks in the system for gaming, productivity, etc.
Red Dead 2 was what made me finally push for the upgrade. It's my favorite game by far and seeing some of the lighting that was possible, I was curious to push it.

Old benchmarks I was getting with most things on medium on the 1070.

After the upgrade with everything at ultra, left uncapped.

The 30 low was concerning but I've played a lot since and never seen it dip there.
In which Tech Jesus takes the piss out of all the AIB vendors:

“The Dark Obelisk?” Seriously? Then why does it light up?