The most interesting aspect of Nvidia's new RTX 50 series is not the GPUs themselves, not even close; it's DLSS 4 upscaling that steals the spotlight with...
But DLSS is making devs lazy. I remember playing Spider-Man 2 on my RTX 3070 on High, ray tracing off, and DLSS both on and off. It looked noticeably worse than Spider-Man 2014 on a base PS4. Maybe it's just one game, but I was really not impressed that a 2-year-old game looked worse than a 10-year-old game.
I mean, it still works on the 20 series onwards, just with a bit more performance impact.
But yeah, it's starting to become normal to have to use it, even if it's just to get better anti-aliasing than the default vaseline options.
I don’t like it either; the ghosting and movement artifacts are still very obvious even above 200 fps, and once I notice them I can’t unsee them.