Build your own computer guide

A 2600. It's still going strong, but the raise has given me the urge to reward myself with an upgrade.
*cries in i5-4570 in his gaming desktop*

If I just upgrade the CPU, it's still saddled with a B450 motherboard and an RX 580 GPU.
*cries in GTX760 in his gaming desktop*

Feel free to just bundle up all your old parts and send 'em this way, I'll give them a happy retirement home :awesome:


Maybe this just drives home how much I'm really not a gamer anymore by any decent definition of the word. Dang.
 
I know this feeling so well. Not an hour ago, I was on the Dell website, wondering if I should buy myself a new PC for Christmas with my Dell credit. Looked at everything, sighed, and thought "what's the point? I don't really game much, and my current build is just fine for the gaming I do."
 

GasBandit

Staff member
Ditto. I often think "Oh man, my next video card update means a new CPU/MOBO.... so I... wait, what do I even play that needs more than I've got?" And then I stick with my GTX-1060.
 
I look at my current 2010-era rig with its 2016-era GPU (total components MSRP ~$8000 but of course I paid nowhere NEAR that much because I bought none of it "new" -- MAYbe $2500 out of pocket, max) and think, "Welp 11+ years old, probably time for a new build with newer hardware with newer capabilities."

But then I look around for decent parts and all the pieces I want to get (except cases) have prices that range anywhere from twice to over three times the MSRP and they're still sold out anyway, and I think to myself, "Aw, HELL naw!"

--Patrick
 
I've been wanting to go from a laptop to a desktop so I can do upgrades a heck of a lot easier (other than RAM), but the chip shortage prices right now are making me have the same reaction - esp. since I'd have to get everything...
 
It's not that I need a new graphics card, necessarily, but I'd like to have a second around in case the other dies; my current 1070 is getting up there in age and use, so it makes me worried it's gonna crap out on me. If I'm going to replace it and keep the older one around as backup, I'd rather it be an upgrade, but I will not justify scalper prices nor resort to hunting the damn things down via alert bots and whatnot. So I wait.
 
I went ahead and did a fresh install of Windows 11 tonight just to make sure everything is as it should be with the new CPU. I'll run the multi-core Cinebench a few more times over the next week to help the thermal paste cure. The temps are okay, but could they be better?
 
Probably. But is it worth all the effort to try and lower them just a few more degrees if you’re not actually going to be redlining it all the time?

—Patrick
 
If I can get it further away from 90C at full load just by adding a couple of extra thermal cycles per day for a week or so, then yes. It'll give me reassurance that my current thermal solution will be good enough for the new application.
 
just by adding a couple of extra thermal cycles per day for a week or so
I thought you meant you were considering the implementation of more exotic cooling solutions, not merely musing what might happen once your paste was done being cured. My bad.

--Patrick
 
[I was going with] A "Tatlow" Xeon build.
Well after some research I decided not to. Sort of.
The "Tatlow" platform (i.e., Xeon W-23xxG + C25x chipset) is nigh-impossible to find at reasonable prices (have only ever found the CPUs listed for somewhere around twice its MSRP :rolleyes: ) unless one wishes to acquire it as part of a prebuilt system, which I do not.
So I will instead be going with its fraternal Workstation twin, the W-13xx[P] + W580 series. Sure, I will lose out on RSTe and VT-d, but I gain additional display and sound capabilities and I really doubt I'll end up needing to dedicate onboard hardware to any specific virtual machines anyway. Also I gain another 30W of CPU headroom, which should translate into longer turbo times assuming I cool it well enough.

And yes, @DarkAudit , I know the 11th Gen 14nm+++++++ Intel processors are "a waste of sand," and the board I've chosen is absolute effing overkill (it actually has a PCI slot? Whaaaat???), etc., but much like my previous Phenom II X6 build, it's being built for a purpose, and that purpose is to hopefully give me 10+ years of rock-solid, reliable service (plus the older, "better" 10th-gen CPUs are limited to PCIe 3.0 and get only half the DMI bandwidth). Published benchmarks show it'll be essentially equivalent in performance to a high-end 5800X-based system, though it'll have ECC RAM (and of course that lovely PCI slot) all at about a +$600 price premium* over the 5800X system. At any rate, it'll mean a jump up to something that's almost 250% the speed of my current system, sooooo...yeah, it should be a noticeable upgrade. Also with Intel suddenly deciding to delete AVX-512 support from their "Alder Lake" 12th-gen chips pretty soon, there was some concern that the supply of "unhobbled" 11th-gen models might dry up (their prices are starting to trend back upwards over the last week, and I assume this is the reason why).

--Patrick
*Most of the "extra" money is thanks to the gonzo mobo and my decision to purchase the retail version of Windows 10 Pro for Workstations instead of the "standard" W10Pro.
 
the best values in the GPU market these days are either the 8GB AMD "Polaris" (RX 4x0/5x0) or NVIDIA 1060 6GB cards.
And now they are also essentially the new minimum.
If you are building new or upgrading, there is now no reason to get anything older than these two unless you want to instantly be left behind by any upcoming game(s).

--Patrick
 
As a sort of addendum to the above, here is a chart assembled by someone who decided to gather together multiple years' worth of reviews from TomsHardware:

[Attached image: 2021GPUs.png - multi-year GPU performance comparison chart]


Handy for when you're shopping for a card and want to know how it compares to last year's/the year before's/etc. No hard data yet on RDNA3, Xe, or 40-series, so of course there are none present. The Y axis is performance relative to a GeForce RTX 3090.
Also a reminder that if you are building new, published game minimum requirements suggest you should probably not look at any GPU older than Pascal or GCN 4/5 (Or Gen9 "HD 6xx" in Intel, if you want to go that way). Coincidentally, all three of these graphics technologies were introduced in 2016, so it looks like the industry has designated that year's products as the "anchor" for the present time.

I wish their relative wattages had been included as well, but I suppose that's already a ridiculous amount of work.

--Patrick
 
Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here
 

GasBandit

Staff member
Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here
It was a good value for its time, but on this chart it would be roughly where the RX 550 is.

That said, all you really need for 1080p 60fps gaming is a 1060 3gig, in my experience. That's what I'm still rocking, and I don't feel any real need to upgrade. The only times I've had performance issues were in the more demanding of VR games.
 
Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here
Tell me about it. Roomie came into some money in Summer/Fall of 2015 and we (mostly me) built a fresh new system for him to replace his old Athlon 64 we built him in 2008. It's a Core i7-5775C with a GeForce GTX 960 4GB and a 400GB Intel 750 SSD, which was absolutely AH-MAY-ZING for 2015 but now "game compatibility checker" websites laugh at us like we've brought a Prius to a drag race.

--Patrick
 
Do y'all think the upcoming Intel ARC graphics cards will be worth looking into/early adopting?
Until I start seeing benchmarks for the ARC (first one will be called "Alchemist") up on videocardbenchmark.net, I'm going to take their claims that it will be similar to a 3070 with the world's hugest grain of salt.
The first benchmarks for Intel's Arc lineup are starting to hit the 'net, for the A380 model at least. The A380 appears to be an A370M (a laptop chip) with slightly better specs, and it seems to score right around the level of the GTX 960/R9 390X, which puts it on par with cards that were released in 2015. It does have the advantage of full DX12U(ltimate) and better video encode/decode support, but those are about the only things it really has going for it.
But if the performance of the unreleased cards scales appropriately, and it's the 500- and 700-series cards we're all really waiting for, then it's probably going to look something like this:
A580 - 1.6x performance of A380 would put it right around GTX 1060/R9 Fury level.
A750 - 2.5x performance of A380 would put it right around GTX 1080/RX 5700 level.
A770 - 3.4x performance of A380 would indeed slot it between the RTX 3070/RX 6800.
Plus of course the full DX12 and encode/decode support.
IF my estimates are true (and if the cards are fairly priced), then these would at least be decent "budget" cards, assuming Intel doesn't crash and burn in the drivers department. If nothing else, they should be good for media boxes thanks to the enhanced video support.
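Those scaling estimates are just multiplication on a single data point, and a quick sketch makes that obvious. Note that the A380 baseline score below is a made-up placeholder, and the scaling factors are my own guesses from above, not anything Intel has published:

```python
# Rough projection of unreleased Arc cards from one A380 data point.
# NOTE: the baseline score is a hypothetical placeholder, and the scaling
# factors are my own guesses from the post above -- not Intel's numbers.
a380_score = 6000  # pretend synthetic benchmark score for the A380

scaling = {
    "A580": 1.6,  # ~GTX 1060 / R9 Fury territory
    "A750": 2.5,  # ~GTX 1080 / RX 5700 territory
    "A770": 3.4,  # ~RTX 3070 / RX 6800 territory
}

for model, factor in scaling.items():
    print(f"{model}: projected score {a380_score * factor:,.0f}")
```

Obviously GPU performance doesn't actually scale linearly with shader count, so take the whole exercise with the same grain of salt as the estimates themselves.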

BUT...

If you're holding out for their release in the hopes that one of them might be for you, keep in mind the following system requirements, as recently announced by Intel:
  • You must be using a 10th generation or newer Intel processor, or a Ryzen 5000-series or newer.
  • Your motherboard must have "Above 4G Decoding" enabled in the BIOS/UEFI setup.
  • Your motherboard must also have "Resizable BAR" (also called ReBAR or "Smart Access Memory/SAM") enabled.
  • CSM must be disabled in the motherboard setup. No legacy stuff to get in the way.
  • The card must be installed in a (native?) PCIe v3.0 or newer slot.
  • You must be running 64-bit Windows 10 build 20H2 or newer, or any 64-bit version of Windows 11, and it must have been installed in UEFI mode.
These are all just my best guesses, btw (except for the above which was explicitly stated by Intel). I don't have any kind of insider info on this stuff, I'm just extrapolating based on what we know/has been revealed so far.

--Patrick
 
A750 cards have been sighted in machines in YouTube videos, Intel has even published some no doubt hand-picked benchmarks which put it a bit faster (+16%) than my previous estimate. If true, the A750 would be around the level of the GTX 1080 Ti / Radeon 6700 XT. However, if the Fortnite benchmark is to be believed, it's more likely only around 9% faster than my previous estimate (assuming the variance isn't just due to immature drivers).

Still, if it pulls numbers anywhere close to the RTX 3060 like Intel claims, AND if it can do so while pulling right around 200W (compared to the 3060's 175W), then at least it'll be worth considering. If, however, it pulls 230W or has an MSRP much more than US$400...well, then it's all just been a big waste of time.
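For what it's worth, that power argument boils down to a simple performance-per-watt ratio. A minimal sketch, assuming (per Intel's unverified claim) that the A750 matches the RTX 3060's frame rates; the fps figure is purely illustrative:

```python
# Performance-per-watt comparison at an assumed equal frame rate.
# The fps value is an illustrative placeholder; the wattages are the
# board-power figures discussed above.
fps = 100.0

board_power = {
    "RTX 3060": 175,           # watts
    "A750 (best case)": 200,   # watts
    "A750 (worst case)": 230,  # watts
}

for card, watts in board_power.items():
    print(f"{card}: {fps / watts:.3f} fps/W")
```

Even in the best case, the A750 would be roughly 12% less efficient, so pricing has to do the heavy lifting.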

--Patrick
 
...until Intel comes out with a PCIe 4.0-enabled Optane controller, we're probably not going to see a significant advantage to 4.0 except in synthetic benchmarks.
Well, so much for that idea.
So...the absolute highest performing persistent storage on the market for gaming or other random-intensive workloads (seriously, at 3-6x the speed, it's not even close) is being discontinued, and they are writing off all the chips they have in storage as a loss (rather than building them into actual drives that someone could buy)...all [in my opinion] because people didn't want to pay the inflated prices they were charging for the drives (5x or higher compared to "regular" flash drives). Yes, they could make money by making them into drives and selling them, but they wouldn't make as much money as they were hoping, so I guess it's better to can the whole thing and just throw it all away, darn.

Oh sure, there is the promise of CXL on the horizon, but this could've been here NOW and it would mean better compatibility with "legacy" (i.e., pre-2024*) machines by just dropping a card into a machine rather than requiring a complete redesign of the motherboard, processor, RAM, etc., etc.

--Patrick
*Yes, I said "2024." CXL isn't supposed to really start getting popular until early 2024, and that's only in server deployments. No idea how long it'll take to trickle down to client (i.e., gaming) computers.
 
I have a question: is this basically, "you won't buy, so we will just throw it all away"?
 
Not entirely. There is the very real chance that this is being done because they're having a bad year and their Alchemist product is running into tons of VERY bad delays (more on that later). They plan to go all-in on CXL because, aside from requiring you to build a brand new computer in 2025 or so, it actually IS significantly better (though more expensive--right now, at least; maybe not so much in 3yrs). So they just don't have the capacity to care about Optane right now, and are shelving/abandoning it.
...but yeah, it really feels like they're throwing out an entire buffet rather than donate any of it to feed the homeless or something.

--Patrick
 
their Alchemist product is running into tons of VERY bad delays (more on that later)
It's later.
Much of this is based on rumor (I put this off for a week hoping more facts would come to light), but Alchemist does not appear to be doing very well. Intel likes making money, so they wouldn't delay Alchemist for S&G, therefore something must be up.
  • Alchemist performance suffers greatly once the frame rate goes above 90fps. Rumor is this is due to some kind of hardware issue that Intel has tried to mitigate via drivers but with only limited success.
  • DX12 and Vulkan performance is okay, but DX11 (and by extension DX10 and DX9) performance is much, much slower, meaning older (i.e., most pre-2016) games are going to noticeably underperform.
  • Igor's Lab reports that one of the board partners (companies such as ASRock, ASUS, Sapphire, EVGA, etc--they don't say which one) has decided to abort making Alchemist cards entirely due to unspecified "quality concerns."
  • There are rumblings and leaked slides and emails and other internal stuff which suggests Intel, while not exactly panicking about Alchemist, is definitely unhappy/angry/worried/etc. about the whole thing, and may go so far as to cancel any/all future discrete GPU development (Battlemage, Celestial, Druid, etc).
  • In fact, there are even rumors that Intel will just hang up the discrete GPU business entirely, and cancel the 5xx/7xx series GPUs altogether. And before you say, "That couldn't possibly happen!" remember that we are talking about the company that just took a write-down on over $550 million worth of unsold Optane chip inventory that they're just going to sit on/scrap/set on fire/I dunno what's gonna happen to all of it.
So...yeah. Is it canceled? Is it still coming? Is it any good? We still don't know for sure, and if the following video is any indication, we may not find out until Xmas 2022...which incidentally is also when NVIDIA/AMD are supposed to start releasing their next-generation GeForce RTX 4xxx/Radeon 7xxx cards.
Uh-oh.



At this point, the only real jewel in Alchemist's crown is its HW AV1 encoder, but if it can't game, then I don't know if that's going to be enough to get people to buy them (unless Intel decides to just market them as transcoding coprocessors).

--Patrick
 
A bit of an end-of-September update, now that there have (finally) been some official announcements:

Sep 27 - AMD's "Zen 4" 7000-series CPUs are available for purchase. The prices are reasonable compared to the 5000-series, and performance of the 7950X is up about 25% over that of the 5950X, though it also pulls around 60% more power, on average. Curiously, it also supports AVX-512, which is an instruction set that Intel introduced, but then abandoned when they introduced their 12th gen chips. If you're not doing scientific computing, this probably doesn't matter...unless you do a lot of emulation (especially console emulation), where it can make a very large difference, indeed. Upgrading to the 7000 series will require a new AM5 board, which will also require upgrading to DDR5 as this new CPU does not support DDR4.

Oct 12 - NVIDIA's official launch date for their 4xxx-series "Ada Lovelace" cards. They are very large (~3+ slots wide), pull a lot of power, generate a lot of frames, and pissed EVGA off enough that EVGA decided to stop making GPUs entirely, which was a big deal since they only ever made NVIDIA cards. NVIDIA claims they bring about 2x the speed of their 3xxx-series counterparts (4x the speed when ray tracing is enabled) at around 50% higher cost than those 3xxx-series counterparts. Most if not all of the important features have been upgraded (new tensor cores, new CUDA cores, etc) and they have also added hardware AV1 encoding support, which takes away that one thing Intel had that nobody else did.

Also Oct 12 - Intel FINALLY has an official launch date for ARC Alchemist. The top-of-the-line 16GB A770 will retail for US$349 with the other cards going for, well, less than that. Performance of the A770 is supposed to be on par with NVIDIA's RTX 3060, which currently retails for around US$375. I personally may pick one up just to have one for its spot in history, and so I can physically touch it to prove to myself that it actually exists.
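Taking those MSRPs at face value, the A770's value pitch is easy to quantify, with the big caveat that the performance-parity claim is Intel's own, not an independent result:

```python
# Price gap between the A770 and the RTX 3060, using the US prices quoted
# above and assuming (per Intel's claim) roughly equal performance.
a770_price = 349.0
rtx3060_price = 375.0

savings = rtx3060_price - a770_price
discount_pct = savings / rtx3060_price * 100

print(f"A770 undercuts the RTX 3060 by ${savings:.0f} (~{discount_pct:.0f}%)")
```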

Oct 20 - Intel releases their 13th gen "Raptor Lake" CPUs, which build on the 12th-gen's transition to the hybrid core design (separate performance and efficiency cores). All the chips support both DDR4 and DDR5 (AMD's 7000-series only supports DDR5) and some of them will even support ECC memory IF you pair them with the right motherboard. This technically isn't a new thing for them, but it was usually something they only did with the lowest-end chips (the i3's) and the Xeons, not the mainstream ones in the middle (i5, i7, i9).

Nov 3 - AMD's release date for their 7xxx-series "RDNA3" cards. No dependable data yet, only claims of 20-25% better performance-per-watt compared to NVIDIA's 4xxx offerings and the fact that they will be chiplet based rather than the usual solid, monolithic die. No mention of AV1 support, framebuffer (VRAM) size, output options, what type of power connectors they'll have (8-pin or the new ATX 3.0 12-pin), whether they'll be PCIe 4/5 at 8x/16x, nothing. So far AMD is doing a good job of containing the leaks, we'll see what things are like in mid-October once the boards are actually being manufactured.

Merry Techmas, everyone!

--Patrick
 
A770 reviews are in. Overall verdict is...actually not terrible? But don't buy if you want consistent performance across multiple games.

Gaming numbers are too inconsistent to really recommend the card unless you know for a fact that the card will perform well in your game of choice, in which case the card does okay, matching or beating the performance of NVIDIA's RTX3060 or AMD's RX6700. Curiously, its relative performance gets better as resolution increases (i.e., it pulls further ahead as screen size gets larger). HOWEVER there is no clear evidence as to whether its hit-or-miss performance is something that will even out as drivers mature, so for now the buying advice is still "NVIDIA for performance, AMD for value," but I guess that now comes with the addition of "Intel later. Maybe."

--Patrick
 
Maybe not the best place to put some pre-builds, but what do you guys think of these:







I got about 3 hours before Black Friday ends.

Any of them look worth it?

For prices in USD, just divide by 4.7.
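(That divide-by-4.7 rule of thumb is just a fixed-rate conversion; here's a tiny helper, assuming ~4.7 RON/USD, which was the approximate exchange rate at the time -- rates drift, so treat results as ballpark:)

```python
# Quick Romanian leu -> US dollar conversion at an assumed fixed rate.
RON_PER_USD = 4.7  # approximate rate mentioned above; not a live quote

def lei_to_usd(lei: float) -> float:
    """Convert a price in lei to dollars at the assumed rate."""
    return lei / RON_PER_USD

print(f"4700 lei is about ${lei_to_usd(4700):.2f}")
```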
 

GasBandit

Staff member
Never get Acer, and stay away from Lenovo if possible.

The ASUS ones look fairly decent... I'd have liked to see 32 GB of RAM on those, and the hard drive space is also a little on the low side, but that can always be upgraded later.

Prices seem to be about what I would expect... which probably means they're low.

But I'm worried what the shipping will be like from Romania.
 
Is the GTX1660TI good enough for most games nowadays?

I decided a while back that video cards aren't worth buying expensive, and that it's better to just replace them every 2-3 years, after a Lego Star Wars game (II, I think) forced an upgrade because it required Shader Model 2.0 or something like that.


And yeah, RAM and drive space I can upgrade later (I was thinking I might just use the HDD I have right now, maybe even the SSD; they're both fairly new)
 
Hmm... would I be better off buying something similar without a graphics card, and just using my GTX0150 until low-cost raytracing/VR/4K cards come out?

At this point I'm just looking to play some Diablo 2 Resurrected...

And I haven't had that great an experience with ASUS either. My last ASUS card was replaced 3 times, and the replacements all had issues (though I do guess them not being produced anymore had something to do with it; they were probably already-used cards, etc.)

Also, i just noticed the last one has a PS/2... WHAT?!

EDIT: All the ASUS ones have it... WHAT YEAR IS THIS ?
 
Based solely on specs, I would limit choices to the 5800X and 12700F models and discard the rest.
Of the remaining 4, only two come with a 3060 Ti (the GTX 1660Ti is essentially an RTX 2060 without the R).
Of those two, the Lenovo has better specs (Bluetooth 5.2, WiFi 6, 1TB SSD, slightly faster CPU performance) as well as being 200 Lei less expensive, but I have no real experience with the longevity/reliability of the brand, and neither listing shows the interior view needed to make any guesses as to ease of later expansion. The specs on the Lenovo say it only goes up to 32GB, but it should be able to go up to 64GB even with only two RAM slots; it would just mean buying 32GB modules, which appear to run anywhere from 1000 to 1500 Lei for a pair of decent DDR4-3200 ones.

--Patrick
 