For PC gaming news and discussion.
PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let’s Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates.
(Submissions should be from the original source where possible, unless that source is paywalled or non-English.
If the title is clickbait or lacks context, you may lightly edit it.)
Given the history of bugs in the RDNA architecture, wouldn’t it be reasonable to assume these newer FSR features simply don’t run on the older architecture?
They are ML models, after all. FSR throughput doubled from RDNA 3 to RDNA 4, so I wouldn’t be surprised if these new models need more compute than previous cards can provide.
It seems silly to grab pitchforks right now over a rumour with so little substance.
Consider the history: FSR 4 was proven to work on RDNA 3 and RDNA 2 thanks to an unintentional commit by AMD, and the intrepid open-source community then implemented FSR 4 on those cards, giving them much-improved visual quality with only a minor loss in frame rate. Add that this rumour comes from a reliable source with a track record of accurate leaks, and I’m inclined to distrust this corporation and start sharpening my tines.
And don’t get me started on how AMD has dropped Linux support, another competitive advantage, leaving us to rely on open-source coders for FSR 4 capability, while Nvidia, the current godfather of fucking over gamers, has been improving its Linux driver support.
But hey, we all have our opinions based on our observations.