
[Discussion] Has anyone noticed the whole conversation about hardware in recent months has moved towards low end equipment?

It seems like not long ago, basically everything was 1080Ti this or Titan that. Every rig anyone built always had the most expensive cards in it and the cheaper stuff was just not good enough as it couldn't run 1080P Ultra 144fps.

Now I see a lot of reviews with fairly praiseworthy views of cheaper equipment, like a 1050 Ti and a Ryzen 3 or i3 quad-core.

I remember two or so years ago my graphics card died and I needed something, so I bought a 2GB GTX 960 when it was brand new, and the reviews it got were so bad they basically implied it was already outdated. Yet at the time it ran 1080p High just fine, and now, two years later, cards like the 1050 Ti, which are no more powerful, are being praised for being just fine for 1080p.

76% Upvoted

It's probably because of crypto mining. Not many people are buying high-end or even mid-range cards right now; they're just holding on to their older high-end cards and waiting for prices to drop back down to sane levels. So if you're a reviewer and need views, you review and praise the low-end hardware budget gamers are actually buying. Right now I'd have no problem recommending one of AMD's APUs, but I'd tell anyone to wait before buying an RX 580 or Nvidia 1060 or higher. Also, if you're not buying a high-end or mid-range card, why bother buying a high-end CPU when you can just wait it out and upgrade everything together as new CPUs/GPUs come out this year?

6 points · 4 months ago

Definitely this, but the silver lining of the shit sandwich of GPU prices is reviewers (and consumers) taking a bit more care in comparing available options rather than just assuming the lower end is all garbage.

This is just my perception, but it also seems like we get fewer games coming out that completely hammer GPUs. Stuff like The Witcher 3 and FFXV push the boundaries, sure, but they run fine on mid-range hardware if you don't mind turning off some of the bleeding-edge settings. A lot of PC devs used to hold it as a point of pride that they released a new game that melted GPUs, even if that alienated people with mid- or low-end machines.

Yeah, that last part hit me hard. I have a $600 Ryzen 5 1600 build just sitting there, waiting for a graphics card to put in. I was naive enough to believe that prices would have gone down by now, but man was I mistaken.

4 points · 4 months ago

This is a big reason why I'm not even bothering with building a computer. What's the point when prices are so insane? I just hope these increased prices don't become normalized (like phone prices have) and stick around after crypto mining is no longer having a huge impact on GPUs.

Thing is, most of these GPU companies will find a sweet spot and prices won't return to normal. They won't keep them as high as they are now, but prices will stay elevated. It will be blamed on Apple and Samsung buying up the RAM that GPUs need, forcing GPU makers to pay more to "compete" for supply.

It will be blamed on Apple and Samsung buying up the RAM that GPUs need, forcing GPU makers to pay more to "compete" for supply.

Sure... but that's true, and you're saying it like it's dishonest or something.

It isn't really about the "GPU companies" finding a sweet spot. Consumers were, and are, willing to pay the prices.

How long have you been waiting? At this point, if I were you, I'd spend as much time watching eBay auctions for underpriced items and/or go with a GTX 970 instead of the 10x0 series. That's your best bet.

Yeah, that's what I've been doing. It's just hard biting the bullet sometimes and overpaying. But I guess it's better than just letting the build sit there.

As someone on a very tight budget, I bought a R7 270 mid last year for $60 USD.

Really not a great card, but enough to let me play Wolfenstein and Doom with my R3 1200 at decent settings.

My point is, you should at least get a minimally functional card for $100 or so; it's worth at least being able to run the system and games at not-shit levels.

That’s really interesting. I wonder if the reviewers are basing what they write on feedback from the audience, or just seeing the landscape and then choosing to review lower-end stuff.

39 points · 4 months ago · edited 4 months ago

I think you're exaggerating the number of people who built high-end systems even before the mining surge. Budget and mid-range cards are always way more popular, but sadly the mid-range doesn't exist anymore, which leaves budget cards as the only reasonable purchase.

My favorite example of this: how many review sites test GPUs at 4K? I would say many. 4K has enormous "mindshare" in the PC gaming industry.

How many people own 4K displays, according to the Steam Hardware Survey? 0.49%.

It's not just GTX 1080 Ti or Vega 64 reviews...but GTX 1060 reviews. A majority of the reviews on Google's first page of results (5 of the 8) report GTX 1060 results at 4K. Why? It's gotta be a fraction of a fraction of a fraction of users.

And testing a new resolution isn't easy: it's an entire slate of new benchmarks to be run and written about.

There are many uses for 4k displays outside of gaming, so it does not seem all that far-fetched that someone could pair a mid-range GPU with a 4k display (especially if their budget isn't the largest and their non-gaming workloads would get a bigger benefit by spending money elsewhere).

4 points · 4 months ago

That's me. Using it for programming to get crisper fonts, paired with a GTX 760 from my old rig because I can't afford current cards. I'd go for an RX 580 once it's in range of €200, because Nvidia and Linux/Wayland don't pair well anymore (and they're dicks for requiring signed drivers, which makes it impossible for people to add Wayland support without Nvidia).

Last I heard, GNOME is looking to support EGLStreams and to get the Nvidia driver working with XWayland, so you might be able to use your 760 with Wayland later this year.

1 point · 4 months ago

That's great news, thank you for your comment.

The reason to test cards like a GTX 1060 at 4K, even though you may not think them capable of running it, is to serve as a comparison point between last-gen cards benched at 4K (e.g. the GTX 980) and the higher-end cards.

Plus a 1060 can play plenty of stuff at 4K. Especially if you target 30 FPS.

4 points · 4 months ago

30 fps

But why

Because it's good enough to play a game?

5 points · 4 months ago

1440p at 60FPS+ would be a much nicer experience, though

Comment deleted · 4 months ago · (2 children)

This is all nonsense if nobody specifies a game and what settings they're talking about.

4K 30 would be 1080p 120 data-wise, no?
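For what it's worth, the raw pixel-throughput arithmetic does come out equal. A quick sketch (this only counts pixels rendered per second; it ignores per-frame CPU cost and the fact that GPU load doesn't scale perfectly linearly with resolution):

```python
# Pixel throughput (pixels rendered per second) for two display configs.
# This is a naive model: real GPU load also depends on per-frame work
# that doesn't scale with resolution.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

uhd_30 = pixels_per_second(3840, 2160, 30)    # 4K at 30 FPS
fhd_120 = pixels_per_second(1920, 1080, 120)  # 1080p at 120 FPS

print(uhd_30, fhd_120, uhd_30 == fhd_120)  # 248832000 248832000 True
```

So yes, purely by pixel count they are identical workloads, though in practice 4K stresses VRAM and memory bandwidth differently than a high frame rate does.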

serve as a comparison point between last-gen cards benched at 4K (e.g. the GTX 980) and the higher-end cards

You can get the same "generational performance jump" from 1080p and 1440p numbers. Very few games are CPU-dependent even at 1080p, as these reviews show.

Scaling from the GTX 1060 to the GTX 1080 looks the same whether it's 1080p or 4K (about 2x as fast). Same for the GTX 1060 versus the GTX 980 (they trade blows).
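The argument here is that the ratio between two cards' frame rates is roughly the same at any resolution, so one resolution is enough to show the generational jump. A sketch with made-up FPS numbers (these are illustrative values, not real benchmarks):

```python
# Hypothetical FPS figures (NOT real benchmark data) to illustrate the
# claim that relative scaling between GPUs is roughly resolution-independent.

fps = {
    "GTX 1060": {"1080p": 80,  "1440p": 55,  "4K": 26},
    "GTX 1080": {"1080p": 160, "1440p": 110, "4K": 52},
}

for res in ("1080p", "1440p", "4K"):
    ratio = fps["GTX 1080"][res] / fps["GTX 1060"][res]
    print(f"{res}: GTX 1080 is {ratio:.1f}x the GTX 1060")  # ~2.0x at each resolution
```

If the ratio really is flat across resolutions, 4K numbers for a mid-range card add benchmarking work without adding information, which is the point being made above.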

Plus a 1060 can play plenty of stuff at 4K. Especially if you target 30 FPS.

I think it depends on the age of the game and your quality settings. A lot of GPUs can run 4K if you reduce settings, rendering resolution, and FPS targets. That seems more relevant to a tweaking guide than full-blown reviews (that aren't going to be bothering with multiple game settings, usually).

Especially if you target 30 FPS

makes sense in, like... Civ 5. Other than that, why would you?

You’d be surprised what can run decently at 4K with slightly lower settings.

1 point · 4 months ago

At least not Civ 5, because the HUD and some icons don't scale, unfortunately. Otherwise it runs fine.

I remember running Civ V fine at 4K. I may have just been squinting though. I think there should be a UI scale setting somewhere in the menu.

One of the disadvantages of 4K, unfortunately, is that many devs don't have scaling UI. It's so strange, because older games have no problem scaling UI, and they specifically implement downwards scaling, but it's like they didn't even think of it.

Just to play devil's advocate, by testing everything at 4K, viewers are able to gauge when hardware in a particular price range can run most games at 4k, and that might then prompt them to get a 4K screen at that point.

By testing and showing that average hardware can't run stuff at 4K, it lets consumers know that it's fine to wait. No point getting a 4K monitor if you can't actually do the things you want at 4K.

It also allows for comparative testing where you know for sure that there's no cpu bottleneck.

You understand that usually benchmarks are with the highest settings and essentially look to tax the gpu and cpu, right?

14 points · 4 months ago · edited 4 months ago

so I bought a 2GB GTX 960 when it was brand new, and the reviews it got were so bad they basically implied it was already outdated. Yet at the time it ran 1080p High just fine, and now, two years later, cards like the 1050 Ti, which are no more powerful, are being praised for being just fine for 1080p.

The 960 was criticized because people thought it wouldn't be a lot slower than the 970, but it was.

The 1050 Ti is a 75W card that doesn't need a 6-pin power connector, so its performance was expected.

The top post right now is for a Raspberry Pi. Things are desperate, yo.

As well as the GPU situation, RAM prices have contributed to make this into a monumentally shit time for PC building. Last I checked, 16GB of okay-grade RAM was up to £180 on my side of the pond, and that really is an absurd amount.

For myself, and a lot of us with higher-end needs, we've decided simply to hold off upgrading until things improve. For folks less in need, it makes a lot of sense to lower expectations slightly and make do with lower-grade GPUs and less RAM for now.

I paid about half that for 16GB of 1600MHz DDR3 in 2012.

And from what I've heard the prices for DDR4 weren't that dissimilar a year or two ago.

I swear I remember 16GB of DDR3 going as low as 80 CAD; it was a weird thing to see. As much as I wanted it, I had 12GB of RAM at the time.

My Sandy Bridge PC was becoming unstable (its USB 3.0 ports had become unreliable too), and I've been messing with VMware, so in essence I was forced to buy right after Boxing Day. I could likely have gotten a 1700 instead of a 1600X. Really unhappy about paying $240 CAD for 16GB of 2666MHz RAM. I was seriously thinking about just picking up a used workstation/server.

5 points · 4 months ago

I paid $50 for 16GB of DDR4 in July 2016. Seems hard to believe.

Yep, my dad too... the RAM light went on this year and we were crapping our pants, but luckily it was just his budget ASUS mobo that went out, and his now-$200 sticks are still OK. Put them on a newer TUF 270 and all is perfect.

And I paid C$99 for 8GB 2666 MHz in September 2017.

Just to put things into perspective.

I paid $120 for 32GB (4x8GB) quad channel 1600MHz DDR3 in 2012.

In April of 2016 I bought 16GB of DDR3 1600 to upgrade the RAM in my laptop and move the 8GB from my laptop to a ZOTAC ZBox I'd gotten. It cost me $44 for two 8GB SODIMMs. The exact same memory would cost me $138 if I ordered it from Amazon today.
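Putting the figures quoted in this thread on a per-GB basis makes the swing obvious. A rough sketch (prices copied from the comments above; USD, CAD, and GBP are left unconverted, so treat it as an order-of-magnitude illustration, not an exact index):

```python
# Cost per GB for RAM purchases mentioned in this thread.
# Currencies (USD, CAD, GBP) are NOT converted, so these numbers only
# illustrate the rough size of the price swing, not an exact comparison.

purchases = [
    ("32GB DDR3 1600, 2012 (USD)",           32, 120.0),
    ("16GB DDR3 SODIMMs, April 2016 (USD)",  16,  44.0),
    ("16GB DDR4, July 2016 (USD)",           16,  50.0),
    ("8GB 2666MHz, Sept 2017 (CAD)",          8,  99.0),
    ("16GB 2666MHz, early 2018 (CAD)",       16, 240.0),
    ("16GB DDR4, early 2018 (GBP)",          16, 180.0),
]

for label, gb, price in purchases:
    print(f"{label:38s} {price / gb:6.2f} per GB")
```

Even ignoring exchange rates, that is a jump from roughly 3 per GB in 2016 to 11 or more per GB in early 2018.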

I bought 64GB of RAM back in 2012 for around the price he paid. Crazy that 6 years later, my old dinosaur is still rocking it. sigh

Being able to get an SSD with four times the capacity for the same price I paid for my 850 EVO in 2016 seems like the standard pace for technology to move at.

That's the way RAM prices should have moved, but we're stuck because one manufacturer has almost all the fabs. South Korea really needs some fucking sanctions on them so that they break up Samsung. It's getting to the point where an import ban threat is the only thing that would solve this bullshit. Remember that Samsung is a state-sanctioned monopoly.

8 points · 4 months ago · edited 4 months ago

This is not at all surprising, seeing how prices have skyrocketed.

The GTX 960 wasn't a very good deal when you looked at its competitors.

Now the 1050 Ti is the only option for most, as cards are crazy expensive.

Even the 1050 Ti's price is inflated a lot compared to other budget cards. It's just not as bad as the 1060/580.

Yeah, the AMD price equivalent was noticeably better. Perhaps he was criticized for buying the inferior option?

You heard about lots of people buying the 1080 Ti because back then it was new, and there are always people waiting to buy the newest top-of-the-line GPU.

I mean, the 1050 Ti is more suitable for running FHD than your 960/2GB, due to its 4GB of VRAM.

But nonetheless, you are right. At release time, the 1060/6GB and RX 580 were the go-to cards for FHD gaming. But now that they've reached price levels beyond any reason, gamers who need the upgrade are probably looking for filler hardware until prices normalize, or are just staying on their current GPUs if they have a 970/980 or R9 390(X), since only power consumption would go down with an upgrade.

Original Poster3 points · 4 months ago

The 960 is gone now, but my point is mostly that back then it wasn't that bad either.

I think a 1050 Ti is more than capable of 1080p. But it is a shame prices have gone so high; in an era of cheap quad-core processors, overall system prices have actually gone up.

As a low-end consumer myself, it's honestly pretty nice. I always get pretty disconcerted when all the posts are about 2000 dollar builds when mine was barely 600.

Used hardware is really popular on YouTube as well.

2 points · 4 months ago

Considering you could build a killer gaming rig for $1,000 in the summer of 2016, and that today that will only get you a killer case, mobo, PSU, and RAM before you even get to the GPU, it makes sense.

There is not one game out there that the new quad core i3 can't handle.
I will soon be building a system for a friend around it, and for the time being, he will be keeping his GTX 670 until prices settle down.

The Ryzen 2200G made the low end interesting.

One other minor reason may be that the Xbox and PlayStation are starting to get a little long in the tooth, and a budget PC can now beat them graphics-wise pretty easily. Since most games are designed to run well on the PS4 and Xbox at 30 frames per second, building a PC that can run the same game at higher settings at 60 frames or higher is enough for most people to be happy with.

I think it’s pretty obvious why.

Original Poster1 point · 4 months ago

I understand why. It's just curious to me that a year or so ago anything other than a high-end card didn't provide a good experience, but now suddenly budget cards are perfectly playable.

PCMR elitism has been thwarted by ridiculous prices. I noticed the same thing too: 60Hz used to be "unplayable". I hate to see the market like this, but in a way I am secretly enjoying all the smugness of the PCMR mindset getting buttfukkd.

I remember when a guy could hardly even give a 780 Ti away; now those things are selling for over 200 bucks on eBay.

I don't recall 60 being unplayable. It was always "go 144", but if you can't, go 60. 30 is for the console peasants.

Increases in RAM prices, and especially GPU prices, have pushed things to the lower end, because once again we're in "it costs $2k+ to build a good computer" territory.

Oh, I've seen plenty of PCMR posters mention that they could "never go back to 60Hz".

I've tried the high frame rate thing and it just wasn't a big deal to me. I'm at 4k60 with a Titan Xp and am more than happy with it.

Thank you for saying this. I'm a photographer and eyeing the upcoming LG 5K ultrawide, but it's 60Hz, so I'm a bit worried about missing something.

Budget cards have always been perfectly playable, but now people are just desperate to get hardware that can game. Anything high end is so stupidly expensive and hard to find that the low end is the only option right now for a lot of people.

Hopefully it’s a good thing for the community. There was a period where all the recommendations were an i5+1060 at minimum, and everything below that was rarely discussed.

I get that it’s an enthusiast community, but I feel like people are still way overestimating what they need.

Ultra graphics settings, 1440p, and 144Hz are all nice, but you can eke out so much more value if you're willing to turn down your settings.

I play on a 42” TV in my living room with a really great wireless keyboard and mouse. Honestly, from 12 ft away even 720p is okay, and it allows me to run everything on Very High, even 800+ mods in Skyrim on a 1060 and an old Ivy Bridge i5.

i5+1060 at minimum

This was the sweet spot for the longest time. The i5 was capable, with i7s often not fully utilized by games, and the 1060 was a large step above the 1050 Ti and the best bang for the buck, just like the 970 before it.

Makes perfect sense, and to be honest, still my go to recommendation.

I built my current system in 2014 (i5 4690K, GTX 970, 8GB DDR3) for around €1k, and I was hoping to upgrade with Volta/Ampere, but seeing how prices are now and how things look going forward, the GPU alone would cost me almost as much as my whole system did.

I really would like to do a proper upgrade, something like an i7 9700K, GTX 2080, and 16GB DDR4, but I'm afraid that kind of setup will cost €2k+.

When I got my 980, it was the best there was and it set me back some €550. I thought it was hugely expensive, and I only got it because I returned a faulty 970, got my money back, and paid a bit more to step up to the 980.

Nowadays, the best there is is a 1080 Ti at the end of its lifecycle, and it's more than twice as expensive. No wonder people don't talk about high-end builds as much. Whatever the reason, high-end gaming cards are now more expensive than a Titan was, and there was never a huge market for (or interest in) Titan builds.

The 960 got good reviews, what on earth are you talking about?

Example review


1 point · 4 months ago

Because people who wanted a 1080 Ti already bought one. I, for one, bought two.

No, not really noticing this at all. 1080 Tis and the like were always niche, super-enthusiast-level products. That hasn't changed. Most people still run more modest setups, just like before.

And a 960 was never good for 1080p/High settings if you wanted 60fps in everything, just like a 1050 Ti isn't today. Again, nothing has changed there. A 290/970/480/1060 is still the general "floor" for 1080p/60fps gaming at higher settings.

I think we'll be right back to talking about higher-end GPUs as soon as Nvidia's new line gets announced. There just hasn't been much news about high-end GPUs since last summer, and we all know Vega was mostly a bust.

Confirmation bias.

Whining about GPU prices is understandable.

But these babies whining about RAM prices are just fucking obnoxious and oblivious. I've spent more for mere megabytes of RAM than these whiners will ever pay for maxed-out gigabytes.

/r/hardware is a technology subreddit for computer hardware news, reviews and discussion.
