Build your own computer guide

A 2600. It's still going strong, but the raise has given me the urge to reward myself with an upgrade.
*cries in i5-4570 in his gaming desktop*

If I just upgrade the CPU, it's still saddled with a B450 motherboard and an RX 580 GPU.
*cries in GTX760 in his gaming desktop*

Feel free to just bundle up all your old parts and send 'em this way, I'll give them a happy retirement home :awesome:


Maybe this just drives home how much I'm really not a gamer anymore by any decent definition of the word. Dang.
 
I know this feeling so well. Not an hour ago, I was on the Dell website, wondering if I should buy myself a new PC for Christmas with my Dell credit. Looked at everything, sighed, and thought "what's the point? I don't really game much, and my current build is just fine for the gaming I do."
 

GasBandit

Staff member
Ditto. I often think "Oh man, my next video card update means a new CPU/MOBO.... so I... wait, what do I even play that needs more than I've got?" And then I stick with my GTX-1060.
 
I look at my current 2010-era rig with its 2016-era GPU (total components MSRP ~$8000 but of course I paid nowhere NEAR that much because I bought none of it "new" -- MAYbe $2500 out of pocket, max) and think, "Welp 11+ years old, probably time for a new build with newer hardware with newer capabilities."

But then I look around for decent parts and all the pieces I want to get (except cases) have prices that range anywhere from twice to over three times the MSRP and they're still sold out anyway, and I think to myself, "Aw, HELL naw!"

--Patrick
 
I want to go from a laptop to a desktop so I can upgrade a heck of a lot easier (other than RAM), but the chip shortage prices right now are giving me the same reaction - esp. since I'd have to get everything...
 
It's not that I want a new graphics card because I necessarily need one; I'd just like to have a second around in case the other dies. My current 1070 is getting up there in age and use, so I worry it's gonna crap out on me. If I'm going to replace it and keep the older one around as backup, I'd rather it be an upgrade, but I will not justify scalper prices nor resort to hunting the damn things down via alert bots and whatnot. So I wait.
 
I went ahead and did a fresh install of Windows 11 tonight just to make sure everything is as it should be with the new CPU. I'll run the multi-core Cinebench a few more times over the next week to help the thermal paste cure. The temps are okay, but could they be better?
 
Probably. But is it worth all the effort to try and lower them just a few more degrees if you’re not actually going to be redlining it all the time?

—Patrick
 
If I can get it further away from 90C at full load just by adding a couple of extra thermal cycles per day for a week or so, then yes. It'll give me reassurance that my current thermal solution will be good enough for the new application.
 
just by adding a couple of extra thermal cycles per day for a week or so
I thought you meant you were considering the implementation of more exotic cooling solutions, not merely musing what might happen once your paste was done being cured. My bad.

--Patrick
 
[I was going with] A "Tatlow" Xeon build.
Well after some research I decided not to. Sort of.
The "Tatlow" platform (i.e., Xeon W-23xxG + C25x chipset) is nigh-impossible to find at reasonable prices (I have only ever found the CPUs listed for somewhere around twice their MSRP :rolleyes: ) unless one wishes to acquire it as part of a prebuilt system, which I do not.
So I will instead be going with its fraternal Workstation twin, the W-13xx[P] + W580 series. Sure, I will lose out on RSTe and VT-d, but I gain additional display and sound capabilities and I really doubt I'll end up needing to dedicate onboard hardware to any specific virtual machines anyway. Also I gain another 30W of CPU headroom, which should translate into longer turbo times assuming I cool it well enough.

And yes, @DarkAudit , I know the 11th Gen 14nm+++++++ Intel processors are "a waste of sand," and the board I've chosen is absolute effing overkill (it actually has a PCI slot? Whaaaat???), etc., but much like my previous Phenom II X6 build, it's being built for a purpose, and that purpose is to hopefully give me 10+ years of rock-solid, reliable service (plus the older, "better" 10th-gen CPUs are limited to PCIe 3.0 and get only half the DMI bandwidth). Published benchmarks show it'll be essentially equivalent in performance to a high-end 5800X-based system, though it'll have ECC RAM (and of course that lovely PCI slot) all at about a +$600 price premium* over the 5800X system. At any rate, it'll mean a jump up to something that's almost 250% the speed of my current system, sooooo...yeah, it should be a noticeable upgrade. Also with Intel suddenly deciding to delete AVX-512 support from their "Alder Lake" 12th-gen chips pretty soon, there was some concern that the supply of "unhobbled" 11th-gen models might dry up (their prices are starting to trend back upwards over the last week, and I assume this is the reason why).

--Patrick
*Most of the "extra" money is thanks to the gonzo mobo and my decision to purchase the retail version of Windows 10 Pro for Workstations instead of the "standard" W10Pro.
 
the best values in the GPU market these days are either the 8GB AMD "Polaris" (RX 4x0/5x0) or NVIDIA 1060 6GB cards.
And now they are also essentially the new minimum.
If you are building new or upgrading, there is now no reason to get anything older than these two unless you want to instantly be left behind by any upcoming game(s).

--Patrick
 
As a sort of addendum to the above, here is a chart assembled by someone who decided to gather together multiple years' worth of reviews from TomsHardware:

[attached chart: 2021GPUs.png]


Handy for when you're shopping for a card and want to know how it compares to last year's/the year before's/etc. No hard data yet on RDNA3, Xe, or 40-series, so of course there are none present. The Y axis is performance relative to a GeForce RTX 3090.
Also a reminder that if you are building new, published game minimum requirements suggest you should probably not look at any GPU older than Pascal or GCN 4/5 (Or Gen9 "HD 6xx" in Intel, if you want to go that way). Coincidentally, all three of these graphics technologies were introduced in 2016, so it looks like the industry has designated that year's products as the "anchor" for the present time.

I wish their relative wattages had been included as well, but I suppose that's already a ridiculous amount of work.

--Patrick
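To make the chart's Y axis concrete: each card's average benchmark result gets divided by the baseline card's, so the RTX 3090 sits at exactly 1.0 and everything else falls below it. A quick sketch of that normalization with made-up fps numbers (not taken from the chart's actual data):

```python
# Normalize raw benchmark scores to a baseline card, the way the chart's
# Y axis does (baseline = 1.0). The fps values below are invented for
# illustration, not pulled from the TomsHardware reviews.
def normalize_scores(raw_fps, baseline="RTX 3090"):
    base = raw_fps[baseline]
    return {card: round(fps / base, 2) for card, fps in raw_fps.items()}

raw_fps = {
    "RTX 3090": 144.0,
    "GTX 1060 6GB": 43.2,
    "RX 580 8GB": 40.3,
    "RX 550": 14.4,
}

print(normalize_scores(raw_fps))
# The 3090 comes out as 1.0; the 1060 as 0.3, i.e. ~30% of the baseline.
```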
 
Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here
 

GasBandit

Staff member
It was a good value for its time, but on this chart it would be roughly where the RX 550 is.

That said, all you really need for 1080p 60fps gaming is a 1060 3gig, in my experience. That's what I'm still rocking, and I don't feel any real need to upgrade. The only times I've had performance issues were in the more demanding of VR games.
 
Tell me about it. Roomie came into some money in Summer/Fall of 2015 and we (mostly me) built a fresh new system for him to replace his old Athlon 64 we built him in 2008. It's a Core i7-5775C with a GeForce GTX 960 4GB and a 400GB Intel 750 SSD, which was absolutely AH-MAY-ZING for 2015 but now "game compatibility checker" websites laugh at us like we've brought a Prius to a drag race.

--Patrick
 
Do y'all think the upcoming Intel ARC graphic cards will be worth looking into/early adopting?
Until I start seeing benchmarks for the ARC (first one will be called "Alchemist") up on videocardbenchmark.net, I'm going to take their claims that it will be similar to a 3070 with the world's hugest grain of salt.
The first benchmarks for Intel's Arc lineup are starting to hit the 'net, for the A380 model at least. The A380 appears to be an A370M (a laptop chip) with slightly better specs, and it seems to score right around the level of the GTX 960/R9 390X, which puts it on par with cards that were released in 2015. It does have the advantage of full DX12U(ltimate) and better video encode/decode support, but those are about the only things really going for it.
But if the performance of the unreleased cards scales appropriately, and it's the 500- and 700-series cards we're all really waiting for, then it's probably going to look something like this:
A580 - 1.6x performance of A380 would put it right around GTX 1060/R9 Fury level.
A750 - 2.5x performance of A380 would put it right around GTX 1080/RX 5700 level.
A770 - 3.4x performance of A380 would indeed slot it between the RTX 3070/RX 6800.
Plus of course the full DX12 and encode/decode support.
IF my estimates are true (and if the cards are fairly priced), then these would at least be decent "budget" cards, assuming Intel doesn't crash and burn in the drivers department. If nothing else, they should be good for media boxes thanks to the enhanced video support.
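Those tier estimates are nothing fancier than the A380's measured score multiplied by a guessed per-tier scaling factor. Something like this back-of-the-envelope calc, where the A380 score of 100 is an arbitrary round baseline rather than a real benchmark number:

```python
# Back-of-the-envelope Arc projections: multiply a measured A380 score by
# guessed per-tier scaling factors. The factors mirror the estimates above;
# the A380 score of 100 is an arbitrary baseline, not a real benchmark.
A380_SCORE = 100

SCALING = {
    "A580": 1.6,   # ~GTX 1060 / R9 Fury territory
    "A750": 2.5,   # ~GTX 1080 / RX 5700 territory
    "A770": 3.4,   # ~RTX 3070 / RX 6800 territory
}

def project(a380_score, scaling):
    """Project each tier's score as a multiple of the A380's."""
    return {tier: a380_score * factor for tier, factor in scaling.items()}

for tier, score in project(A380_SCORE, SCALING).items():
    print(f"{tier}: ~{score:.0f} (vs. A380 = {A380_SCORE})")
```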

BUT...

If you're holding out for their release in the hopes that one of them might be for you, keep in mind the following system requirements, as recently announced by Intel:
  • You must be using a 10th generation or newer Intel processor, or a Ryzen 5000-series or newer.
  • Your motherboard must have "Above 4G Decoding" enabled in the BIOS/UEFI setup.
  • Your motherboard must also have "Resizeable BAR" (also called ReBAR or "Smart Access Memory/SAM") enabled.
  • CSM must be disabled in the motherboard setup. No legacy stuff to get in the way.
  • The card must be installed in a (native?) PCIe v3.0 or newer slot.
  • You must be running 64-bit Windows 10 build 20H2 or newer, or any 64-bit version of Windows 11, and it must have been installed in UEFI mode.
The performance estimates are all just my best guesses, btw (the requirements list above is the part that was explicitly stated by Intel). I don't have any kind of insider info on this stuff, I'm just extrapolating based on what we know/has been revealed so far.
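For what it's worth, the Windows part of that checklist is mechanically checkable: Windows 10 20H2 corresponds to build 19042 per Microsoft's release info, and Windows 11 builds start at 22000, so a single threshold comparison covers both. A tiny sketch; the build numbers are the hard facts here, while the helper function itself is hypothetical:

```python
# Check whether a Windows build number satisfies Intel's stated Arc
# requirement: Win10 20H2 (build 19042) or newer, or any Win11 build.
WIN10_20H2_BUILD = 19042

def build_meets_arc_requirement(build: int) -> bool:
    # Win11 builds (>= 22000) automatically clear the Win10 threshold too,
    # so one comparison handles both operating systems.
    return build >= WIN10_20H2_BUILD

# On an actual Windows machine you could feed in sys.getwindowsversion().build.
print(build_meets_arc_requirement(19041))  # 20H1 -> too old -> False
print(build_meets_arc_requirement(22000))  # Windows 11 -> True
```

Note this only covers the OS line item; the UEFI-mode, ReBAR, and CSM settings live in firmware and have to be checked in the motherboard setup.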

--Patrick
 