For PC gaming news and discussion.
PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let’s Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates.
(Submissions should be from the original source if possible, unless the original is paywalled or non-English.
If the title is clickbait or lacks context you may lightly edit the title.)
It’s like how banks figured out there was more money in catering to the super rich and just shit all over the rest of us peasants: GPU manufacturers that got big because of gamers have now turned their backs on us to cater to the insane “AI” agenda.
Also, friendly advice, unless you need CUDA cores and you have to upgrade, try avoiding Nvidia.
The good games don’t need a high end GPU.
Terraria minimum specs: “don’t worry bro”
Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t a GPU seller, and it’s the only game I’ve personally had poor performance in with my 3070 Ti at 1440p.
Clair Obscur runs like shit on my 3090 at 4K :(
Problem is preordering has been normalized, as has releasing games in pre-alpha state.
Anyone that preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.
plus, i have a 3060. and it’s still amazing.
don’t feel the need to upgrade at all.
me neither. best is a 1070. don’t play newer ‘demanding’ games, nor do i have a system ‘worthy’ of a better card anyway.
Happy with my used 4070 Ti Super I paid 150 for after trading in my 2080 Ti. There is nothing to use your GPU on unless you’re a kid and haven’t seen the pattern of every AAA game being literally the same game every year.
Yeah, my 2080 Ti can run everything sans ray-traced stuff perfectly, though I also haven’t had any issues with Indiana Jones or Doom: The Dark Ages.
Akschually, Doom DA needs ray tracing enabled at all times, and your card is in the first Nvidia gen that has it. While the 10xx and 20xx series haven’t shown much of a difference, and both are still okay for average gaming, there’s the planned divide card producers wanted. The “RTX ON” ad visuals were fancy at best (imho) while consuming too many resources, and now there’s the first game that doesn’t function without it, pushing consumers to either upgrade their hardware or miss out on big hits. Not the first time this has happened, but it gives a sense of why there was so much media noise about that technology in the beginning.
Data on GPU shipments and/or POS sales showing a decline would be much more reliable than a survey.
Surveys can suffer from reflecting what respondents want to say rather than what they actually do.
deleted by creator
That’s why it’s best to focus on absolute unit shipment numbers/POS.
If total units increased compared to the previous generation launch, then people are still buying GPUs.
deleted by creator
Shipment/POS don’t tell you anything about unfulfilled demand or “unrealized supply”.
They just measure how many units were shipped into the channel and sold at retail, respectively.
They’re still the best data points we have for understanding demand dynamics.
Gamers are also a notoriously dramatic demographic who often don’t follow through on what they say.
It really depends on whether they hired a professional cognitive psychologist to write the survey for them. I doubt they did…
I mean, as written the headline statement is always true.
I am horrified by some of the other takeaways, though:
I’m sure we’d all switch to room temperature fusion for power if we could, too, or use superconductors in our electronics.
That’s the problem with surveys, isn’t it? What does “latency being eliminated” mean? In principle, it means your streamed game responds as quickly as a local game, which is entirely achievable if your target is a 30 fps client on a handheld device versus streaming 60 fps gameplay from a much more powerful server. We can do that now.
But is that “latency free” if you’re comparing it to running something at 240 Hz on your gaming PC? With or without frame generation and upscaling? 120 Hz raw? 60 Hz on console?
The question isn’t whether you can get latency-free; the question is at what point in that chain the average survey-answering gamer starts believing the hype about “latency-free streaming”.
Which is irrelevant to me, because the real problem with cloud gaming has zero to do with latency.
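To put some very rough numbers on that chain (illustrative only; the network round trip below is an assumed figure, and encode/decode overhead is ignored):

```python
# Rough frame-time / latency budgets for the scenarios above (illustrative only).
# The 30 ms network round trip is an assumed example, not a measured value.

def frame_time_ms(fps: float) -> float:
    """Time per frame in milliseconds at a given frame/refresh rate."""
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps local -> {frame_time_ms(fps):5.1f} ms per frame")

assumed_network_rtt_ms = 30.0  # hypothetical round trip to a streaming server
streamed_60fps = frame_time_ms(60) + assumed_network_rtt_ms
print(f"streamed 60 fps + {assumed_network_rtt_ms:.0f} ms RTT -> ~{streamed_60fps:.1f} ms,"
      " in the same ballpark as a local 30 fps handheld (33.3 ms),"
      " but far behind a local 240 Hz desktop (4.2 ms)")
```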
That last one is especially horrifying. You don’t own games when you cloud game; you simply lease them. We all know what that’s done for the preservation of games. Not to mention encouraging the massive amounts of shovelware that we get flooded with.
I don’t know that cloud gaming moves shovelware in either direction, but it really sucks to see the percentage of people that don’t factor ownership into the process at all, at least on paper.
That’s also how it is with a game you purchased to play on your own PC, though. Unless you have it on physical media, your access could be revoked at any time.
Increasingly across many markets, companies are not targeting average or median consumers. They’re only chasing whales, the people who can pay the premium. They’ve decided that more mid tier customers aren’t worth it – just chase the top. It also means a lower need for customer support.
Is this a news story from 4 years ago?
I just paid $400 for a refurbished MSI Gaming Z Trio Radeon RX 6800. The most I’ve ever spent. I never want to spend that much again.
For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.
And still have your house burn down due to it just being a 2080 that has 9.8 jiggawats pushed into it.
There isn’t a single reason to get any of the 50 series imo; they don’t offer anything. And I say that as a 3D artist for games.
Edit: never mind, I remember some idiots got roped into 4K for gaming and are now paying the price like marketing wanted them to.
What’s wrong with 4k gaming? Just curious
You pay a ton more money for a screen whose PPI is too dense to matter, only to pay a ton more money for a PC that still runs it at a terrible framerate with lowered settings and fake frames.
4k is a pure scam.
Have you tried 4k? The difference is definitely noticeable unless you play on like a 20" screen
Yes, it’s pointless; what’s most noticeable is the low frame rate and the lowered graphics needed to make the game playable. High FPS is more noticeable and useful. Blind tests confirmed that, even the one LTT did.
You could argue 2K is solid, but even then the PPI is already so dense it doesn’t really matter.
Edit: then again, there is some research showing people perceive FPS and PPI differently, so it may be that 4K makes sense for some, while for others it’s really just overpriced 2K that no PC can run.
Not arguing FPS here lol, arguing 4K, which you can run at 144 Hz in a lot of games even without a 5090. You failed to mention whether you’ve tried 4K, which I assume you haven’t based on the switch to FPS instead of resolution.
Fair.
I play in 1080p so can’t comment on 4k but I can confirm fps doesn’t seem to affect me after 30fps. I don’t perceive a noticeable difference between 30, 60, 120fps. Haven’t played higher than that. I suspect 4k would probably look better to me than a higher fps though. But I’m happy with 30-60fps and 1080p so…
I went from 1080p 144 Hz to a 2K 100 Hz ultrawide. I stopped noticing the framerate difference pretty quickly, as the “mouse so smooth” effect wears off fast. But the huge ultrawide FOV is a massive plus. I don’t notice the resolution increase at all beyond lower frames and more text on screen in docs.
Laptop 4K is just 1080p with extra battery drain and worse performance.
4K is an outrageously high resolution.
If I was conspiratorial I would say that 4K was normalized as the next step above 1440p in order to create a demand for many generations of new graphics cards. Because it was introduced long before there was hardware able to use it without serious compromises. (I don’t actually think it’s a conspiracy though.)
For comparison, 1440p has 78% more pixels than 1080p. That’s quite a jump in pixel density and required performance.
4K has 125% more pixels than 1440p (300% more than 1080p). The step up is massive, and the additional performance required is as well.
Now there is a resolution that we are missing in between them. 3200x1800 is the natural next step above 1440p*. At 56% more pixels it would be a nice improvement, without an outrageous jump in performance. But it doesn’t exist outside of a few laptops for some reason.
*All these resolutions are multiples of 640x360. 720p is 2x, 1080p is 3x, 1440p is 4x, and 4K is 6x. 1800p is the missing 5x.
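A quick sanity check of that arithmetic (the 27-inch diagonal used for the PPI figures is just an example assumption):

```python
# Verify the pixel-count comparisons above, plus PPI at an example 27" diagonal.
from math import hypot

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

def pixels(res):
    w, h = res
    return w * h

def ppi(res, diagonal_inches=27.0):
    # Pixels per inch along the diagonal for an assumed 27" screen.
    w, h = res
    return hypot(w, h) / diagonal_inches

base_1080 = pixels(RESOLUTIONS["1080p"])
base_1440 = pixels(RESOLUTIONS["1440p"])

print(f"1440p vs 1080p: +{pixels(RESOLUTIONS['1440p']) / base_1080 - 1:.0%}")  # ~ +78%
print(f"1800p vs 1440p: +{pixels(RESOLUTIONS['1800p']) / base_1440 - 1:.0%}")  # ~ +56%
print(f"4K    vs 1440p: +{pixels(RESOLUTIONS['4K'])    / base_1440 - 1:.0%}")  # +125%
print(f"4K    vs 1080p: +{pixels(RESOLUTIONS['4K'])    / base_1080 - 1:.0%}")  # +300%

for name, res in RESOLUTIONS.items():
    print(f"{name}: {pixels(res):>9,} px, ~{ppi(res):.0f} PPI at 27 inches")
```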
Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.
“You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.
Does it really help gameplay on the average monitor? If it’s a fast-paced game, I’m not even paying attention to pixels.
It’s just kind of unnecessary. Gaming in 1440p on something the size of your average computer monitor, hell even just good ol’ 1080p HD, is more than sufficient. I mean, from 1080p to 4K, sure, there’s a difference, but from 1440p it’s a lot harder to tell. Nobody cares about your mud-puddle reflections cranking along in a game at 120 fps. At least not the normies.
Putting on my dinosaur hat for a second, I spent the first decade of my life gaming in 8/16 bit and 4 color CGA, and I’ve probably spent the last thirty years and god only knows how much money trying to replicate those experiences.
I mean, I play at 1440p and I think it’s fine… Well, it’s 3440x1440; the problem is I can still see the pixels, and my desk is quite deep. Do I NEED 4K? No. Would I prefer to have it? Hell yes, but not enough to spend the huge amounts of money that are damaging an already unrealistic market.
I bought a secondhand 3090 when the 40 series came out for £750. I really don’t need to upgrade. I can even run the bigger AI models locally as I have a huge amount of VRAM.
Games run great and look great. Why would I upgrade?
I’m waiting to see if Intel or AMD come out with something awesome over the next few years. I’m in no rush.
But then the Nvidia xx90 series has never been for the average consumer, and I don’t know what gave you that idea.
Fucking youtubers and crypto miners.
Crypto mining with GPUs is dead, the only relevant mining uses ASICs now, so it would be more accurate to say:
Fucking youtubers and AI.
Fuck I’m old.
If I keep playing the same games my current CPU and GPU will do me well for a long time
I’ve been waiting for a product that makes sense.
I’m still waiting. I can keep waiting
deleted by creator
Still rocking a GTX 1070 and I plan on using my Graphene OS Pixel 8 Pro till 2030 (only bought it (used ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔)
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.
I just picked up a used RX 6800 XT after doing some research and comparing prices.
The fact that a gpu this old can outperform or match most newer cards at a fraction of the price is insane, but I’m very happy with my purchase. Solid upgrade from my 1070 Ti
I have a 6700 XT and a 5700X, and my PC can do VR and play Star Citizen; those are the most demanding things I do on it. Why should I spend almost £1000 to get a 5070 or 9070 and an AM5 board + processor?
Well that depends on your definition of significant. Don’t get me wrong, the state of the GPU market is not consumer friendly, but even an RX 9070 provides over a 50% performance uplift over the RX 6800.
I’m in the same boat.
In general, there’s just no way I could ever justify buying an Nvidia card in terms of bang for the buck; it’s absolutely ridiculous.
I’ll fork over 4 digits for a gfx when salaries go up by a digit as well.
Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.
“When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and still aren’t. It’s basically part of the culture to stretch your GPU as long as it’ll go, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.
Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.
Nowadays the new cards are 10% faster for 15% more money.
I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.
I just finally upgraded from a 1080 Ti to a 5070 Ti. At high refresh-rate 1440p the 1080 Ti was definitely showing its age and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Microcenter.
5000 series is a pretty shitty value across the board, but I got a new job (and pay increase) and so it was the right time for me to upgrade after 8 years.
Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the costs in money or perf.
It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is there haven’t been reasonable prices for cards for like 8 years, and it’s only gotten worse recently. People who are “due” for an upgrade aren’t upgrading because it’s unaffordable.
If consoles can last 6-8 years per gen so can my PC.
That’s more than good enough for me.
I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
Those cards were like what though, $199?
That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.
Colour me surprised
Resumes gaming with a 1000-series card
Still rocking an EVGA 980 here.
Back when building my PC, I actually considered getting a 980 Ti. Luckily I did go with the GTX 1070
(they were both similarly priced)
Fuck Nvidia anyways. #teamred
I’ve been on Linux since 2018 (my PC is from 2016) and my next GPUs will always be AMD, unless Intel somehow manages to produce an on par GPU
Temu Nvidia is so much better, true. Please support the “underdog” billion dollar company.
I support the lesser evil option, yes. It’s not like I have many other choices now, do I? Thanks to fucking Nvidia.
I wish AMD had something like CUDA that my video rendering software used so I could stop using nvidia.
Fuck those guys too honestly. AMD is fueling this bullshit just as much as Nvidia.
Nvidia is one of the most evil companies out there, responsible for killing nearly all other GPU producers and destroying the market.
So is AMD with their availability of literally three video cards in stock for all of North America at launch. Which in turn just fuels the scalpers. Downvote this all you want guys, AMD is just as complicit in all of this, they’ve fuelled this bullshit just as much.
Nvidia is singlehandedly responsible for killing all competition but AMD. They destroyed all other GPU companies with the nastiest tactics to dominate the market; only AMD has been able to survive. You can’t blame AMD for chip shortages; that’s the aftershock of the COVID pandemic. Never has there been higher demand for chips, especially thanks to the rising EV market.
You can’t say AMD is as bad as Nvidia, as Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.
And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.
My new GPU was a Steam Deck.
Heck yes, it made more sense to buy a Steam Deck than upgrade my PC.
I had lost all interest in games for a while. The desktop just ended up with me tinkering in the homelab. The Steam Deck has been so great for falling in love with gaming again.
Still on a 1060 over here.
Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.
They are talking about skipping 1 or 2 generations, not taking 10 years off.
Hey, it’s not 2026 just yet!
Hey, I’m also on a 1060 still! Admittedly I hardly game anymore, although I am considering another Skyrim playthrough.
I’m running Linux for everything and my GTX 1070 is still chugging along trying to power my 1440p 144hz monitor ^^’
Well, I mostly just play strategy games and CS2 (which I do have to run on almost the lowest possible settings without FSR. I basically turn everything down to its lowest, except I keep AA on its lowest setting and dynamic shadows on so I don’t have a disadvantage, and get 110-180 FPS depending on the situation).
But I’m planning on buying a used Radeon 9070 XT and just dropping it into my current build (i7-6800K based lololol), and on eventually building a new system around it.
(A 750 W 80 Plus Platinum PSU should be able to handle a new 9070 XT.)