Question about how computer components are priced to move.
March 27, 2023 10:48 AM

My computer died, forcing me to finally stop buying used or refurbished PCs. But I also didn't want to buy one of the new boxes where nothing can be changed out and if it breaks you just throw it away and get another one. So I decided to have a PC built.

Among the many questions I have, one involves the graphics card. I can't afford $1,100 graphics cards. But I was told that the GeForce 440 I got from Free Geek for $25, more than five years ago, also won't cut it with the new processor I'm getting.

I shopped around and found a card I could afford that was at the 50% performance level, i.e., its users considered it to be about average or slightly above average. It ranked 111th of 698 cards evaluated. The card is a Radeon RX 580. But CPUBenchmark shows that this card came on the scene in 2017. I just bought it brand new today at Best Buy. Why would Best Buy be selling any 6-year-old graphics card when all of the processors are 12th or 13th generation?

And about that processor, an i7-12700KF. Originally, the builder recommended a 13th gen, but discovered the 12th was on sale at a $300 discount! It's only about 6 months old and there's practically no difference in clock speed. Should I just not look a gift horse in the mouth?
posted by CollectiveMind to Computers & Internet (11 answers total)
Sorry ... not $300 discount but $130 discount. Threes stuck in my head ...
posted by CollectiveMind at 10:51 AM on March 27, 2023


The graphics card comparison at Logical Increments is a great resource. The RX 580 is a bit dated and is not what I'd put in a new build.

Why is Best Buy selling it? Partly because they're Best Buy, who (along with Think Geek) are notorious for fleecing customers on overpriced components and services.

There was, for quite a while, a massive shortage of graphics cards, leading to super-high prices, driven mostly by crypto-bros and partly by the semiconductor shortage. Both of those issues have, one way or another, mostly sorted themselves out now, and graphics cards are back to a more reasonable level, both in price and availability.

Clock speed itself means a lot less these days. The reasons are complicated, but the clock-speed race is basically over; it's about efficiency now: more cores and more threads for less thermal load. Especially if you're trying to build on a budget, I wouldn't sweat a 12th vs. 13th-gen i7. For typical use, the price difference, spent on RAM, faster storage, or a better GPU, will get you far better performance gains.
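As a rough illustration (a minimal Python sketch using only the standard library, not tied to any particular chip), the same CPU-bound job split across worker processes finishes in less wall-clock time as the core count goes up, which is where modern CPUs make most of their gains:

    # Minimal sketch: run the same CPU-bound work on 1 worker vs. all cores.
    # On a multi-core CPU the wall-clock time drops roughly with core count,
    # which matters far more than a few hundred MHz of extra clock speed.
    import multiprocessing as mp
    import time

    def burn(n):
        # Simple CPU-bound loop to keep one core busy.
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        jobs = [5_000_000] * 8
        for workers in (1, mp.cpu_count()):
            start = time.perf_counter()
            with mp.Pool(workers) as pool:
                pool.map(burn, jobs)
            print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")

(The exact numbers will vary by machine; the point is just that the scaling comes from cores, not clock.)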

Do you have the option of returning the RX 580? Have you paid for anything else? What's your total budget? If you're getting the rest of the build from Best Buy / Think Geek, you can almost certainly do better for less elsewhere.
posted by xedrik at 11:25 AM on March 27, 2023


(xedrik: OP mentioned Free Geek, not Think Geek. The former is a computer recycler/reuse vendor)
posted by scruss at 12:11 PM on March 27, 2023


Have you tried Newegg's PC builder? It shows which parts are compatible with other parts in your build.
I've watched my teenage son build 3 PCs with parts chosen this way, and he has never had problems.
posted by OHenryPacey at 1:04 PM on March 27, 2023


scruss, yeah, they mentioned getting their old card from Free Geek, but the new one from Best Buy, and that's where my Think Geek concern kicked in.
posted by xedrik at 1:07 PM on March 27, 2023


Old GPUs are still being sold because the GPU manufacturers and their board partners are still making them. They're doing this because new GPUs are way overpriced for what you get relative to those older generations. Things are calming slightly, what with the crypto crash, but GPU prices are still stupid high for the most part.

In general, the current budget GPUs (which are themselves still based on tech that's a few years old now) are cut down in ways that often make the high-end parts of older generations a better choice, depending on what you want to do with them. Those older cards won't support raytracing or offload encoding and decoding of the very latest video codecs, but for general rasterized gaming performance they're still fine, especially at 1080p. The question is less whether they're useful and more how they're priced relative to the performance you get.
posted by wierdo at 1:30 PM on March 27, 2023


I'd have no problem buying a used computer. Some of the most reliable gear I've had has been used. I write on a Thinkpad that runs Windows XP. If you don't need a lot of power, just about any machine will do. I do a lot of programming and computer design stuff, and the computer I use at work is a lot older than five years. The one I'm typing on is very powerful, and again, a lot older than five years. Computers are so fast nowadays that most people won't come close to using all the horsepower they have.
I had to get a video card because I got a monitor my onboard video didn't support. Until then I was happy with the onboard stuff. The new card barely fits in the case, has 8 gigs of RAM, and is massive overkill for my rather high-end requirements. Fortunately it was very cheap, for reasons that escape me, and I have gamers in the house who love it.
I have never heard of a ... no, I have. I once had an AMD processor and it wouldn't work with an AST VESA Local Bus video card. That was in 1995. I have not, since then, heard of a video card that doesn't work with a specific processor. I'm not saying that's impossible, but I'd be inclined to put everything together and try it. Failing that, look around on the net and see if that's a real problem.
It's hard to comment on what will be acceptable when you haven't stated what you need it for, but if it's to watch videos and surf the web, pretty much anything will do.
posted by AugustusCrunch at 6:31 PM on March 27, 2023


OHenryPacey, in Best Buy, the Geek Squad consultant pulled up NewEgg and CPUBenchmark, showed me components, and made suggestions for building the system I want, which will focus heavily on video and audio editing, with no gaming. Geek Squad's prices were comparable. After having two brand-new WD_Blue SATA SSDs fail within a year (the second time with ALL of my data, since I was assured it couldn't happen twice), and them not knowing what in the old PC caused the failures, this is a path I haven't tried, and I'm getting the best I can afford. I may not need all of the power or space or speed. It's certainly more than I've ever had. But I have to try to give myself the best I can get. If this fails, through no fault of my own, I don't know what I'll do.
posted by CollectiveMind at 7:52 PM on March 27, 2023


Why would Best Buy be selling any 6-year-old graphics card when all of the processors are 12th or 13th generation?

I once saw a store selling 6-year-old electronics (think stuff like iPods) at the original MSRP when they should by now be priced at perhaps 25% of the original MSRP... and the worst part is, I'm quite sure they manage to sell some of them to customers who aren't tech savvy.

I was told, also will not cut it with the new processor I'm getting.

There seems to be a contradiction: you say you "can't afford" a top-of-the-line graphics card, but then why are you buying a top-of-the-line CPU, the i7-12700KF? A budget builder would go for, say, a mid-range i5-12400F at half the price rather than an i7. Going for a K-series part like the i7-12700KF (which enables overclocking) would then push you toward a much more expensive Z690 motherboard to take advantage of that, when you would otherwise use an H670 or even a B660. And if you were planning to buy the mid/low-end motherboard anyway, why go for an overclockable K-series CPU in the first place?

What do you plan to use your PC for? I play League of Legends and Marvel Snap, and those run perfectly fine on integrated graphics. The i5-12500, for example, has a UHD 770 GPU on board that is more than fine for light gaming and general computing / video playback.

You even mention in a later comment "no gaming" so I suspect you don't even need a discrete graphics card in the first place!
posted by xdvesper at 11:00 PM on March 27, 2023


The truth is quite complicated.

1) Not all advances are toward more performance (though most are); some are for reduced cost, reduced power consumption, and so on. A "budget card" today has the performance of a mainstream card from a few years ago, and/or can be built far more cheaply and consume less power for the same performance, thus also requiring less cooling (which would ALSO let it be built more cheaply).

2) The RX 580 did come out in 2017, but it was a mainstream card then. It compares quite well to a GeForce GTX 1060, which came out in 2016. It's now considered a budget card because of all the newer cards that have succeeded it.

3) With that said, your GeForce 440 was an "entry-level" card even when it was first released in 2011. I am quite certain even the onboard graphics in Intel 11th/12th-gen CPUs can outmuscle your 440. :D

4) The RX 580 is of some use, given the use case you've specified: "heavily on video and audio editing and no gaming". Those tasks *can* be GPU-accelerated, depending on which editing software you are using and how it's configured. I would max out the RAM (32 or 64 GB) and make sure your editing software can actually use the GPU (see the sketch after this list). If you need to save money, you can go without the card, but your editing process may be a little slower.

5) With that said, how much are you paying for the RX 580, when an RX 6600, which is faster and came out in 2021, can be had for $250 or a little less?
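As a quick sanity check that the GPU is actually visible for acceleration (a minimal Python sketch; it assumes the optional pyopencl package is installed, and OpenCL is only one of several APIs editing software may use on AMD cards), list the devices the system exposes:

    # Minimal sketch: list the OpenCL platforms and devices the system exposes,
    # so you can confirm the card shows up for GPU-accelerated effects and
    # encoders in editing software that uses OpenCL.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            mem_mib = device.global_mem_size // (1024 * 1024)
            print(f"{platform.name}: {device.name} ({mem_mib} MiB)")

If the card doesn't show up there, check the driver install before blaming the editing software.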
posted by kschang at 1:50 AM on March 28, 2023


That RX 6600 will be cooler and quieter than the RX 580, with more ability to push pixels. There was a leap down in transistor size between the RX 580's generation and its successor, and the RX 6600 is a second generation more advanced again: cooler, quieter, and more powerful.

The on-board graphics of 12th-gen Intel Core chips would let you skip a discrete GPU entirely, for even less power use and heat.
posted by k3ninho at 12:07 PM on March 31, 2023

