Reminder that Bethesda is owned by Microsoft, the company that insists it’s going to end support for Windows 10 in October and wants everyone to move to Windows 11, which doesn’t officially support perfectly functional but somewhat old CPUs. So of course they don’t care about GPUs too old to support ray tracing.
At some point it was going to happen, this is just earlier than many thought. The real question is “when is AMD going to have an answer to Nvidia when it comes to RT performance?”
Earlier than they thought?
How long did they think it would take before RT was a requirement? It was introduced with the GeForce 20 series more than six years ago.
For technology, six years is vintage.
The only people this should affect are people still using GTX 10 and 16 series cards. I dunno what’s happening with AMD/Radeon. Since ATI was purchased by AMD, the naming schemes have gotten more and more nonsensical, so I always have a hard time telling WTF generation a card is from by the model number.
In any case. Yeah, people using 5+ year old tech are going to be unable to play the latest AAA games. And?
Has there ever been a time when a 5+ year old system could reasonably play a modern AAA title without it being a slideshow?
I’m still hearing from people that they’re using Nvidia 10 series cards. I was expecting to hear 20 series instead of 10 series before something like this happened.
I have a 20 series card, albeit one of the higher tier ones, and I probably won’t be upgrading this year. I probably also won’t be playing any new AAA titles either.
It’s fine to have an older card, but nobody in that position should be expecting to play the latest and greatest games at reasonable framerates, if at all.
It is the way of things.
I am personally rather miffed that if you want any real performance from a GPU, you basically need to spend $800+. Even though some cards are listed as available for less, they almost never actually are, whether due to scalping or greed (which are kind of the same thing), or something else like idiotic tariffs. I don’t have nearly a grand I can burn every year to upgrade my GPU. The last GPU I bought was a 1060, and my current card was a gift. I haven’t had a budget for a decent GPU in many, many years.
When I upgrade, I’m likely going Intel Arc, because the value proposition makes sense to me. I can actually spend less than $600 and get a card with a reasonable level of performance.
The current Intel GPUs aren’t better than an RTX 2070, so that won’t be an upgrade if you’re on a higher tier 20 series card.
I just went up to a 4070 Ti from a 2080 Ti, because it was the only worthwhile upgrade. $660 used. So you don’t need to spend $800.
Yeah, the gifted card I’m using is a 2080 Ti. The friend who gifted it went from a dual 2080 Ti SLI setup to a 4090, IIRC. He kept one for his old system, so it’s still useful, but gave me the other since SLI is dead and he doesn’t need the extra card in a system he’s not frequently using.
11GB of memory is an odd choice, but it was a huge uplift from the 3GB I was using before then. My previous card was a super budget GTX 1060 3GB (made by Palit, I think).
I still have to play on modest settings for anything modern, but my only real challenge has been feeding it fresh air. My PC case puts the GPU on a riser with front-to-back airflow and very little space front-to-back or top-to-bottom. The card uses a side intake, which is fairly typical for GPUs, but that means it’s basically starved for air if I install it normally. For now, I’ve got it on a riser sitting on top of the system with the cover off, so the GPU is in open air. Not ideal, and I need to work on a better solution… but it works great otherwise.
Euh, no. The Intel Battlemage cards are way, way better than an RTX 2070. They even beat the 4060… for $250.
Intel Battlemage GPUs are really good cards if you don’t need pure, raw power because everything must be in 4K on ultra, etc.
Which is a good value, since that raw, pure power comes with an electricity bill I would not want to pay.
Yeah, I got the cards wrong. They’re around a 2080, which is around the same as a 4060. Still not much of an upgrade from an upper-end 20 series card, which to me is the 2070 and up.
I would actually want to see the actual performance differences, because the Intel cards have much faster memory bandwidth, which is what’s giving them their performance. Still, $250 for 4060 performance (which is way, way more) is one hell of a good deal in comparison.
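For what it’s worth, the bandwidth gap is easy to put rough numbers on: peak memory bandwidth is just the per-pin data rate times the bus width. A minimal sketch in Python, assuming the B580 runs 19 Gbps GDDR6 on a 192-bit bus and the 4060 runs 17 Gbps on a 128-bit bus (figures recalled from the spec sheets, worth double-checking against the vendors’ pages):

# Back-of-the-envelope peak memory bandwidth:
#   bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8
# Spec figures below are from memory of published spec sheets; verify
# them before relying on the comparison.

def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "Arc B580 (19 Gbps GDDR6, 192-bit)": (19.0, 192),
    "RTX 4060 (17 Gbps GDDR6, 128-bit)": (17.0, 128),
}

for name, (rate, width) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(rate, width):.0f} GB/s")

# Prints roughly 456 GB/s vs 272 GB/s -- about 1.7x on paper,
# though real-world performance also depends on cache and architecture.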
They make gaming more and more of an elitist hobby, and then get surprised when indie games with pixel graphics that can run even on potato devices become great successes.
“Somewhat old” CPUs are 8+ years old now? Windows 11 is crap, but I don’t think the hardware requirements are the reason.
Honestly? Yeah.
They’re still perfectly functional and capable of pretty much anything in a modern workload, spec depending… If they can run Windows 11 fine (and they should be able to if they can run 10), then the cutoff is arbitrary and will cause more systems to find their way to landfills sooner than they otherwise would have.
How many 8 year old computers still function halfway decently? Most of those people are probably due for an upgrade whether they know it or not.
Even if we go with this popular narrative, no one is actually going to immediately run to throw out their working Win10 PC when Microsoft cuts off updates. They’ll just continue to use it insecurely. Just like millions of people did and still do with Win7.
This is the issue with using a proprietary operating system in general. Eventually they’ll cut you off arbitrarily because there’s a profit motive to do so. Relying on them to keep your system updated and secured indefinitely is a naive prospect to begin with.