• 4 Posts
  • 41 Comments
Joined 2Y ago
Cake day: Jun 09, 2023

Then they shouldn’t have released a new architecture without dedicated AI accelerators as late as 2022, especially since they had working AI accelerators at that point - which they were only selling to data centers. FSR 4 can’t be ported back to older AMD architectures for the same reason that DLSS can’t physically work on anything older than an RTX 20 series card (which came out in 2018, by the way): you can only get so much AI acceleration out of general-purpose cores.

AMD’s GPU division is the poster child for short-sighted conservatism in the tech industry, and the results speak for themselves. What’s especially weird is that the dominant company is driving innovation (for now, at least) while the underdog has been trying to survive by brute-forcing raster performance above all else, like we’re in some upside-down world. Normally, it’s the other way around. AMD have finally (maybe) caught up to one of Nvidia’s technologies from March 2020, almost half a decade ago. Too bad they 1) are chasing a moving target and 2) have lost almost every other race in the GPU sphere as well, including the one for best raster performance. The fact that their upcoming generation openly copies Nvidia’s naming scheme is not a good sign - you don’t do that when things are going well.

Things might change in the future and I hope there will finally be some competition in the GPU sector again, but for now it’s not looking good, and the recent announcements haven’t changed anything. A vocal minority of PC gamers dismissing ray tracing, upscaling and frame generation wholesale reflects neither what developers are doing nor how buyers are behaving - and the fact that AMD is finally trying to score in all of these areas tells us that the cries of the fanboys were just that and not reflective of any reality. If the new generation of AMD GPUs ends up finally delivering decent ray tracing, upscaling and frame generation performance (which I hope it does, because fuck monopolies and those increasingly cringey leather jackets), I wonder if the same people will suddenly reverse course and embrace these technologies. Or maybe I should just stop worrying about fanboys.


Nvidia is active in more than just one sector, and love them or hate them, they are dominating in consumer graphics cards (because they are by far the best option there, with both competitors tripping over their own shoes at nearly every turn), professional graphics cards (ditto), automotive electronics (ditto) and AI accelerators (ditto). The company made a number of very correct and far-reaching bets on the future of GPU-accelerated computing a few decades ago, which are now all paying off big time. While I am critical of many if not most aspects of the current AI boom, I would not blame them for selling shovels during a gold rush. If there is one company in the world that got its business model around AI right, it’s them. Even if, say, the whole LLM bubble bursts tomorrow, they’ve made enough money to continue their dominance in other fields. A few of their other bets were correct too: building actual productive and long-lasting relationships with game developers, spending far more on decent drivers than anyone else, and correctly predicting, very early on, two industry trends that are now out in full force by making sure their silicon puts a heavy emphasis on both ray tracing and upscaling. They were earlier than AMD and Intel, invested more resources into these hardware features while also providing better software support - and crucially, they also encouraged developers to make use of these features, which is exactly the right approach. Yes, it would have been nicer of them to open source e.g. DLSS like AMD did with FSR, but the economic incentives for that approach unfortunately aren’t there.

The marketing claim that the 5070 can keep up with the 4090 is a bit misleading, but there’s a method to the madness: While the three (instead of just one) synthetic frames created by the GPU are not 100% equivalent to natively rendered frames, the frame interpolation is both far better than it has been in the past, from the looks of it (to the point that most people will probably not notice it), and has now reached a point - thanks to motion reprojection similar to tech previously found on VR headsets, but now with the screen edges being AI generated - where it has a positive impact on input latency instead of merely making games appear more fluid. Still, it would have been more honest to claim that the “low-end” (at $600 - thanks, scalpers!) model of the new lineup is more realistically half as fast as the previous flagship, but I guess they felt this wasn’t bombastic enough. Huang isn’t just an ass kisser, but also too boastful for his own good. The headlines wrote themselves though, which is likely why they were fine with bending the truth a little.
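
Back-of-the-envelope, the reasoning behind that “half as fast” guess looks something like this - every number here is an illustrative assumption, not a benchmark:

```python
# Rough sanity check of the "5070 keeps up with a 4090" claim.
# All values are assumptions for illustration, not measurements.

rendered_fps_4090 = 60    # assumed: frames the 4090 actually renders per second
fg_factor_4090 = 2        # 40 series frame generation: 1 rendered + 1 generated frame
fg_factor_5070 = 4        # 50 series multi-frame generation: 1 rendered + 3 generated frames

displayed_fps_4090 = rendered_fps_4090 * fg_factor_4090   # 120 frames shown on screen

# If the 5070 reaches the same displayed frame rate, the frames it
# actually renders work out to:
rendered_fps_5070 = displayed_fps_4090 / fg_factor_5070   # 30

print(rendered_fps_5070 / rendered_fps_4090)  # 0.5 -> roughly half the raw performance
```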

Yes, their prices are high, but if there’s one thing they learned during COVID, it’s that there are more than enough people willing and able to pay through the nose for anything that outputs an image. If you can sell the same number of units at $600 as at half that price, it makes no sense to sell them for less. Hell, it would even be legally dangerous for a company with this much market share.

I know this kind of upscaling and frame interpolation tech is unpopular with a vocal subset of the gaming community, but if there is one actually useful application of AI image generation, it’s using these approaches to make games run as well as they should. It’s not like overworked game developers can just magically materialize more frames otherwise - realistically, we would be back to frame rates in the low 20s like during the early Xbox 360 and PS3 era, rather than having everything run at 4K/120 natively. This tech is here to stay and downright needed to get around the diminishing returns that have been plaguing the games industry for a while, where every small advance in visual fidelity has to be paid for with a high cost in processing power. I know, YOU don’t need fancy graphics, but as expensive and increasingly unsustainable as they are, they have been a main draw of the industry for almost as long as it has existed. Developers have always tried to make their games look as impressive as they possibly could on the available hardware - hell, many have even created hardware specifically for the games they wanted to make (that’s one way to sum up much of the history of arcade cabinets). Upscaling and frame generation are perhaps a stepping stone towards finally cracking, once and for all, that elusive photorealism barrier developers have been chasing for decades.

The usual disclaimer before people accuse me of being a mindless corporate shill: I’m using AMD CPUs in most of my PCs, I’m currently planning two builds with AMD CPUs, and the Steam Deck shows just how great of an option even current AMD GPUs can be. I was on AMD GPUs for most of my gaming history until I made the switch to Nvidia when the PC version of GTA V came out, because back then it was Nvidia who were offering more VRAM at competitive prices - and I wanted to try out doing stuff with CUDA, which is how they have managed to hold me captive ever since. My current GPU is an RTX 2080 (which I got used for a pittance - they haven’t seen any money from me directly since I bought a new GTX 960 for GTA V). They can hype up the 50 series as much as they want with more or less misleading performance graphs; the ol’ 2080 is still doing more than fine at 1440p, so I won’t be upgrading for many years to come.


He’s an ass kisser, but the company is doing excellently under his watch and also treating its employees quite a lot better than most of Silicon Valley. Bad Linux drivers alone don’t make a company bad.



Have you used both or just one of them like you would a laptop touchpad? I think the tiny touchpad of the Lenovo can probably do this as well.


Eh, I kind of get it. I’ve had my Deck ever since it came out and used the touchpads maybe two or three times in total. They are pretty pointless in my eyes.


I’ve never gone below Quality (never felt the need to), that’s why I was asking. Since lower DLSS settings render the game at a lower resolution, you might have unknowingly picked a setting that breaks this particular puzzle - probably unknown to the developers as well.


I’m assuming you mean that you disable DLSS frame generation, since DLSS upscaling and ray reconstruction result in an improved “real” frame rate, which means that the actual input latency is lowered, despite the roughly 5% of input lag added per frame. Since you’re almost always guaranteed to gain far more than 5% in performance with DLSS, the benefits outweigh the overhead.
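
As a rough illustration of why the math works out in DLSS’s favour (the speedup and overhead figures are made up for the example):

```python
# Toy numbers showing why upscaling can lower input latency overall,
# even with a small per-frame cost for the DLSS pass itself.

native_frame_time_ms = 16.7    # assumed: ~60 fps when rendering at native resolution
upscaling_speedup = 1.4        # assumed: 40% higher frame rate in DLSS quality mode
dlss_overhead = 1.05           # assumed: ~5% extra latency per frame from upscaling

dlss_frame_time_ms = native_frame_time_ms / upscaling_speedup * dlss_overhead

print(f"native: {native_frame_time_ms:.1f} ms, DLSS: {dlss_frame_time_ms:.1f} ms")
# -> ~16.7 ms vs ~12.5 ms per frame: less time per frame means lower input latency
```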



I’ve only ever noticed slight shimmering on hair, but not movement artifacts. Maybe it’s less noticeable on high refresh rate monitors - or perhaps I’m blind to them, kind of like how, a few decades ago, I did not notice frame rates being in the single digits…

This hair shimmering is an issue even at native resolution though, simply due to the subpixel detail that is common in AAA titles now. The developers of Dragon Age: The Veilguard solved the problem by using several rendering passes just for the hair:

This technique involves splitting the hair into two distinct passes, first opaque, and then transparent. To split the hair up, we added an alpha cutoff to the render pass that composites the hair with the world and first renders the hair that is above the cutoff (>=1, opaque), and subsequently the hair that is lower than the cutoff (transparent).

Before these split passes are rendered, we render the depth of the transparent part of the hair. Mostly this is just the ends of the hair strands. This texture will be used as a spatial barrier between transparent pixels that are “under” and “on top” of the strand hair.

Source:

https://www.ea.com/technology/news/strand-hair-dragon-age-the-veilguard
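
If I’m reading that right, the pass ordering boils down to something like this - a very rough sketch in plain Python rather than their actual engine or shader code, with made-up fragment data:

```python
# Loose sketch of the split described in the EA article: transparent depth first,
# then the opaque hair, then the transparent hair. Not their real implementation.

ALPHA_CUTOFF = 1.0  # assumed: fragments at full coverage count as opaque

def split_hair_passes(hair_fragments):
    """Split hair fragments into an opaque and a transparent pass via an alpha cutoff."""
    opaque = [f for f in hair_fragments if f["alpha"] >= ALPHA_CUTOFF]
    transparent = [f for f in hair_fragments if f["alpha"] < ALPHA_CUTOFF]
    return opaque, transparent

def render_hair(hair_fragments, render_pass, render_depth):
    opaque, transparent = split_hair_passes(hair_fragments)

    # 1) Depth of the transparent part (mostly the strand tips) goes first and acts
    #    as the spatial barrier between pixels under and on top of the strand hair.
    render_depth(transparent)

    # 2) Opaque hair composites with the world like any other solid geometry.
    render_pass(opaque)

    # 3) Transparent hair renders last, tested against the depth from step 1.
    render_pass(transparent)
```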


Huh, I’ve never heard of that before. Do you have screenshots? Which game at which resolution and DLSS setting?



DLSS without frame generation is at least equivalent (sometimes superior) to a native image though. If you’ve only ever seen FSR or PSSR with your own eyes, you might underestimate just how good DLSS looks in comparison. XeSS is a close second in my opinion - a bit softer - and depending on the game’s art style, it can look rather pleasing, but the problem is that it’s relatively rarely implemented by game developers. It also comes with a small performance overhead on non-Intel cards.

Frame generation itself has issues though, namely latency, image stability and ghosting. At least the latter two are being addressed with DLSS 4, although it remains to be seen how well this will work in practice. They also claim, almost as a footnote, that while frame rates are up to eight times higher than before (half of that through upscaling, half through the three generated frames per real frame, up from one inserted frame on the previous generation - which might indicate that the raw processing power of the 5070 is about half that of the 4090), latency is “halved”, so maybe they are incorporating user input during synthetic frames to some degree, which would be an interesting advance. I’m speculating, though, based on the fluff from their press release:

https://www.nvidia.com/en-eu/geforce/news/dlss4-multi-frame-generation-ai-innovations/

Before anyone accuses me of being a shill for big N, I’m still on an old 2080 (which has DLSS upscaling and ray reconstruction, but not frame generation - you can combine it with AMD’s frame generation though, not that I’ve felt the need to do so) and will probably be using this card for a few more years, since it’s still performing very well at 1440p with the latest games. DLSS is one of the main reasons it’s holding up so well, more than six years after its introduction.




Just because you aren’t affected doesn’t mean the problem doesn’t exist.

https://www.windowslatest.com/2024/12/19/microsoft-confirms-windows-11-24h2-issue-is-breaking-games-yanks-update-for-more-pcs/

Reports of people having issues with various games crashing due to this update go back to August.


It’s not their fault, because they were blindsided by the update to Windows, just like everyone else. They are working on a patch, but this takes time.


There’s also Enderal, another total conversion based on Skyrim, with a completely original game world and mechanics. It’s free, yet better than many commercially released RPGs.


They switched from UE4 to UE5 during development, so make of that what you will.

Given that the core audience of the series is, and always has been, hardcore PC gamers, and considering that past entries both had high hardware requirements at release and pushed technical boundaries, they’d probably be better advised to prioritize visual fidelity over low hardware requirements. Not to mention, this is kind of expected of a remake, since one of the main points of playing any remake over the original - apart from accessibility - is up-to-date visuals.

If you are a gamer, have a desktop PC and are living in the developed world in any economic situation other than having to worry about every single buck, there’s very little reason not to be playing on a dedicated GPU. Powerful, yet cheap used cards that can run literally every game in existence are very easy to come by. Meanwhile, in the console space, the Xbox Series S is an incredible bargain - and then there’s the Steam Deck, which is one of the cheapest PCs of sorts that can run the majority of games.

None of this excuses a lack of optimization, of course, but we simply have no idea yet what technical state this game will be in at release. This is one area where the developers would be well-advised to break with series tradition.


[Due to a Windows 11 update causing issues.]

You can blame Ubisoft all you want, but 1) it’s not their fault and 2) this game isn’t considered terrible in any way.



Have you tried DS4Windows?

https://ds4-windows.com/

The fact of the matter is that XInput remains the standard for controllers on PC. Use any other type of controller and you will have to fiddle around. It’s the input equivalent of having an ultrawide display.



My conspiracy theory is that at least one developer has covertly uploaded rule 34 content to boost their game.



I would argue that China is not a global superpower. It lacks both the hard and the soft power for that status. Its conventional force projection capabilities are pitiful, and culturally, the increasingly nationalist and restrictive dictatorship is unable to exert more influence than many small nations.


I feel like I’ve read predictions like this one a million times since the 1990s…


I’ve seen some early demonstrations of this, like this one from three years ago that makes GTA V look photoreal:

https://www.youtube.com/watch?v=P1IcaBn3ej0

The potential of this tech is enormous. Imagine an alternative to RTX Remix that turns any game into a near photoreal experience. Genres like simulations and racing games would be ideal candidates at first - they tend to attempt photorealism instead of creative art styles anyway, primarily feature inanimate objects (avoiding most of the uncanny valley that way) and could be transformed with models trained on relatively limited data.


That’s how new GPU generations have been pushed for as long as GPUs have existed.

And no, GPUs are not being overengineered. You don’t want stagnation or the underwhelming-to-nonexistent generational jumps of other tech products, like smartphones, do you?


Interesting. This is good news for modders though, because this means that combining cells (similar to mods for previous Bethesda games) should be straightforward, even on current hardware.


Because everyone likes a redemption arc. For as much as Pitchfork is a dick, his studio has plenty of good games under its belt. Dismissing them entirely because of one game you didn’t like is rather foolish.




No, it can’t, because it is not even remotely as user friendly - and even if it were, the mere fact that its user experience is so different makes switching quite difficult for anyone but the most basic users (who need little more than a web browser).


I was about to ask if any of them were any good. The last NASCAR game I played was by Papyrus and featured cars consisting of huge squares (that could, rather impressively, fly off in a collision).


No, it’s not. Communist East Germany also had higher suicide rates during holidays - but it was a state secret there, because suicides couldn’t possibly be happening in the perfect “workers’ and peasants’ paradise”. They had the highest suicide rate in Europe, ahead of every capitalist country. Even performing research on the topic was forbidden.

The East German suicide rate of 31 per 100k population would be the fourth highest in the world today. Compare it to modern-day capitalist Germany (8.3), USA (14.5) or even South Korea (21.2).



You might end up like me one day, with a case that’s over 20 years old and has seen many hardware upgrades. I never removed the Athlon 64 sticker on mine…


If the case, power supply and storage are still okay, just reuse them and save a not insignificant amount of money.


A 2080 is still more powerful than base PS5 and Xbox Series X, so it’ll likely do fine for the rest of this console generation.

What’s the rest of your hardware like? I’m asking, because a slow CPU can rather drastically limit the performance of a GPU.


I bought an RTX 2080 last year for around 200 bucks. It’s still very competent at 1440p. There are only a few games that lock away the highest texture settings due to this card only having 8 GB of VRAM.


I would suggest looking into a used Nvidia card instead. Something like an RTX 2080 or 3080 (or one of their variants), depending on your budget. DLSS alone makes these vastly superior to anything AMD has on offer.


To be fair, the most powerful card on the market can be a bit bigger than normal. Complaining about the bulk of a high-end card is like bemoaning the poor fuel economy of a Bugatti Veyron.


You could read the article, but I’ll save you the trouble:

[T]he PC release will come enhanced with native 4K resolution at up to 144hz on compatible hardware, monitor support for both Ultrawide (21:9) and Super Ultrawide (32:9), HDR10 support, and full keyboard and mouse functionality. It will also feature NVIDIA DLSS 3.7 and AMD FSR 3.0, NVIDIA DLSS Frame Generation, adjustable draw distances, shadow quality settings and more.