> What do you have under the hood right now?

A 2600. It's still going strong, but the raise has given me the urge to reward myself with an upgrade.

—Patrick
> A 2600. It's still going strong, but the raise has given me the urge to reward myself with an upgrade.

*cries in i5-4570 in his gaming desktop*
> If I just upgrade the CPU, it's still saddled with a B450 motherboard and an RX 580 GPU.

*cries in GTX760 in his gaming desktop*
I know this feeling so well. Not an hour ago, I was on the Dell website, wondering if I should buy myself a new PC for Christmas with my Dell credit. Looked at everything, sighed, and thought "what's the point? I don't really game much, and my current build is just fine for the gaming I do."

Maybe this just drives home how much I'm really not a gamer anymore by any decent definition of the word. Dang.
> Probably. But is it worth all the effort to try and lower them just a few more degrees if you’re not actually going to be redlining it all the time?

If I can get it further away from 90C at full load just by adding a couple of extra thermal cycles per day for a week or so, then yes. It'll give me reassurance that my current thermal solution will be good enough for the new application.

—Patrick
> just by adding a couple of extra thermal cycles per day for a week or so

I thought you meant you were considering the implementation of more exotic cooling solutions, not merely musing what might happen once your paste was done being cured. My bad.
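For anyone wanting to sanity-check their own load temperatures the same way, here's a minimal logging sketch. It assumes Linux with the third-party `psutil` package; which sensors show up (and their labels) varies by motherboard, so treat the whole thing as a rough starting point rather than a calibrated tool.

```python
# Minimal CPU-temperature watcher: sample every sensor psutil can see while
# you run a load test, then report the hottest reading. Assumes Linux +
# psutil; sensor availability and names vary by hardware.
import time

def hottest(samples):
    """Peak temperature (deg C) across all samples, or None if no readings."""
    return max(samples) if samples else None

def watch(seconds=60, interval=2.0):
    """Poll all temperature sensors for `seconds`, return the peak seen."""
    import psutil  # pip install psutil; imported lazily so hottest() stays portable
    samples = []
    for _ in range(max(1, int(seconds / interval))):
        for entries in psutil.sensors_temperatures().values():  # {} if unsupported
            samples.extend(e.current for e in entries if e.current is not None)
        time.sleep(interval)
    return hottest(samples)

# Usage (while a stress test runs in another window):
#   watch(seconds=300)  # returns the peak deg C over a 5-minute window
```

Run it alongside whatever stress test you prefer and compare the peak against your 90C ceiling before and after the extra thermal cycles.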
> [I was going with] A "Tatlow" Xeon build.

Well, after some research, I decided not to. Sort of.
> the best values in the GPU market these days are either the 8GB AMD "Polaris" (RX 4x0/5x0) or NVIDIA 1060 6GB cards.

And now they are also essentially the new minimum.
> As a sort of addendum to the above, here is a chart assembled by someone who decided to gather together multiple years' worth of reviews from TomsHardware:
>
> View attachment 40842
>
> Handy for when you're shopping for a card and want to know how it compares to last year's/the year before's/etc. There's no hard data yet on RDNA3, Xe, or the 40-series, so of course none of those are present. The Y axis is performance relative to a GeForce RTX 3090.
>
> Also a reminder that if you are building new, published game minimum requirements suggest you should probably not look at any GPU older than Pascal or GCN 4/5 (or Gen9 "HD 6xx" from Intel, if you want to go that way). Coincidentally, all three of these graphics technologies were introduced in 2016, so it looks like the industry has designated that year's products as the "anchor" for the present time.
>
> I wish their relative wattages had been included as well, but I suppose that's already a ridiculous amount of work.
>
> --Patrick

Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here
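The chart's normalization is easy to reproduce if you want to slot a card of your own into it: divide each card's benchmark score by the baseline card's. A toy sketch (every number below is a made-up placeholder, not real benchmark data):

```python
# Normalize GPU benchmark scores against a baseline card, the way the
# chart's Y axis expresses everything relative to the RTX 3090.
# All scores here are hypothetical placeholders, not measured results.
scores = {
    "RTX 3090": 100.0,
    "GTX 1060 6GB": 25.0,
    "RX 580": 24.0,
    "GTX 860M": 8.0,
}

def relative(scores, baseline="RTX 3090"):
    """Map each card to its score as a fraction of the baseline's score."""
    base = scores[baseline]
    return {card: s / base for card, s in scores.items()}

rel = relative(scores)
print(rel["RX 580"])  # 0.24 with these placeholder numbers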
> Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here

It was a good value for its time, but on this chart it would be roughly where the RX 550 is.
> Wow, this chart really puts into perspective how much I need to upgrade. My poor 860M didn't even make it on here

Tell me about it. Roomie came into some money in Summer/Fall of 2015 and we (mostly me) built a fresh new system for him to replace the old Athlon 64 we built him in 2008. It's a Core i7-5775C with a GeForce GTX 960 4GB and a 400GB Intel 750 SSD, which was absolutely AH-MAY-ZING for 2015, but now "game compatibility checker" websites laugh at us like we've brought a Prius to a drag race.
Do y'all think the upcoming Intel ARC graphic cards will be worth looking into/early adopting?
> Until I start seeing benchmarks for the ARC (first one will be called "Alchemist") up on videocardbenchmark.net, I'm going to take their claims that it will be similar to a 3070 with the world's hugest grain of salt.

The first benchmarks for Intel's Arc lineup are starting to hit the 'net, for the A380 model at least. The A380 appears to be an A370M (a laptop chip) with slightly better specs, and it seems to score right around the level of the GTX 960/R9 390X, which puts it on par with cards that were released in 2015. It does have the advantage of full DX12U(ltimate) support and better video encode/decode, but those are about the only things really going for it.
> ...until Intel comes out with a PCIe 4.0-enabled Optane controller, we're probably not going to see a significant advantage to 4.0 except in synthetic benchmarks.

Well, so much for that idea.
> Well, so much for that idea.

I have a question, so is this basically, "you won't buy, so we will just throw it all away"?
Why the end of Optane is bad news for the entire IT world
The biggest new idea in computing for half a century was just scrapped
www.theregister.com

So...the absolute highest-performing persistent storage on the market for gaming or other random-intensive workloads (seriously, at 3-6x the speed, it's not even close) is being discontinued, and they are writing all the chips they have in storage off as a loss (rather than building them into actual drives that someone could buy)...all [in my opinion] because people didn't want to pay the inflated prices they were charging for the drives (5x or higher compared to "regular" flash drives). Yes, they could make money by building them into drives and selling them, but they wouldn't make as much money as they were hoping, so I guess it's better to can the whole thing and just throw it all away. Darn.
Oh sure, there is the promise of CXL on the horizon, but this could've been here NOW and it would mean better compatibility with "legacy" (i.e., pre-2024*) machines by just dropping a card into a machine rather than requiring a complete redesign of the motherboard, processor, RAM, etc., etc.
--Patrick
*Yes, I said "2024." CXL isn't supposed to really start getting popular until early 2024, and that's only in server deployments. No idea how long it'll take to trickle down to client (i.e., gaming) computers.
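If you want a feel for the access pattern where Optane's low latency gave it that 3-6x edge, here's a toy 4K random-read timer (an illustration, not a rigorous benchmark; reads here go through the OS page cache, which will flatter any drive, and it assumes a Unix-like OS for `os.pread`):

```python
# Toy 4K random-read timer. Illustrates the workload (small blocks at
# scattered offsets) where Optane's low latency mattered most.
# NOT a rigorous benchmark: reads may be served from the page cache.
import os
import random
import time

BLOCK = 4096  # 4 KiB, the classic random-I/O block size

def random_offsets(file_size, n=1024, block=BLOCK, seed=0):
    """n block-aligned random offsets that fit inside the file."""
    rng = random.Random(seed)
    last = max(0, file_size // block - 1)
    return [rng.randint(0, last) * block for _ in range(n)]

def time_random_reads(path, n=1024):
    """Seconds taken to perform n scattered 4 KiB reads from `path`."""
    offsets = random_offsets(os.path.getsize(path), n)
    fd = os.open(path, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for off in offsets:
            os.pread(fd, BLOCK, off)
        return time.perf_counter() - start
    finally:
        os.close(fd)
```

Point it at a large file on each drive you want to compare; for honest numbers, use a real tool like fio with direct I/O so the page cache stays out of the picture.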
> I have a question, so is this basically, "you won't buy, so we will just throw it all away"

Not entirely. There is the very real chance that this is being done because they're having a bad year, their Alchemist product is running into tons of VERY bad delays (more on that later), and they plan to go all-in on CXL because, aside from requiring you to build a brand-new computer in 2025 or so, it actually IS significantly better (though more expensive--right now, maybe not so much in 3yrs). So they just don't have the capacity to care about Optane right now, and are shelving/abandoning it instead.
> their Alchemist product is running into tons of VERY bad delays (more on that later)

It's later.
> Is the GTX1660TI good enough for most games nowadays?

As long as you're not doing raytracing, VR, or resolutions higher than 1440p, it's probably still 60fps capable.