• 4 Posts
  • 84 Comments
Joined 3Y ago
Cake day: Jun 09, 2023


The advantage of having every DLSS feature (except for frame generation) in a low-end card like this one can’t be overstated. You need every bit of extra frame rate and image quality you can get with a card in this class. DLSS is both the most widely supported and the best upscaling method - and even if you don’t like it for some reason, you can still use FSR or XeSS in games that support them instead.

Just make sure to get the 8 GB instead of the identically-named 6 GB version, because of course a much worse card has the exact same name. Not only does it have more memory, the memory is also faster (128-bit instead of a 96-bit bus), there are more CUDA cores (2560 instead of 2304), and the card runs at a much higher clock speed (1.78 GHz vs. 1.47 GHz). It needs a separate power connector, unlike the cut-down variant, but it’s still more power-efficient than OP’s current card.
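To put the bus-width difference in perspective, here’s a rough sketch of the theoretical memory bandwidth gap. The 14 Gbps GDDR6 data rate below is an assumption for illustration only, not a spec quoted above:

```python
# Theoretical peak memory bandwidth: bus width (bits) / 8 * effective data rate (Gbps).
# The 14 Gbps data rate is an assumed figure for illustration.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# 8 GB variant (128-bit bus) vs. 6 GB variant (96-bit bus), same assumed data rate:
print(bandwidth_gb_s(128, 14))  # 224.0 GB/s
print(bandwidth_gb_s(96, 14))   # 168.0 GB/s
```

Even at identical memory clocks, the wider bus alone buys a third more bandwidth, on top of the extra capacity.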


I remember trying the single player campaign when it came out in 2012. To say that it wasn’t good would have been an understatement. Everything about it was abysmal: Visuals, sound design, controls, level design, AI, weapon feel, etc. It was also ridiculously buggy. I’m talking 1, maybe 2/10 at best. This surprised me, given the fanfare surrounding the project back then and the high hopes I had. I also quite like flawed, quirky Indie and AA titles, but this project wasn’t it back then, despite being free.

I hope the final release is vastly improved, but honestly, the trailer reminds me too much of what it was like 14 years ago.

It also reminds me of the recently released Timesplitters fan game, which is even worse: one of the most incompetently designed and programmed games I’ve ever had the displeasure of experiencing. I’m not exaggerating.

Sorry for the outdated impressions and negativity. I really do hope Renegade X has come a long way.





Sure, but human-written shit still had that human touch. It could be unintentionally funny, it could be a mixed bag that reaches unexpected heights at times. AI writing is just the bland kind of bad, not the interesting kind of bad.


after chroma subsampling

Cool info, but I’m not sure about this part. Do you mean downsampling instead? Because chroma subsampling doesn’t make sense in this context.


Steam also has a built-in streaming feature that doesn’t require any additional software.


What emerging players? You can’t just whip up a competitive GPU in a jiffy, even if you have Intel money.

Also, unless they are from a different planet that has its own independent supply chain, they’d have to deal with the very same memory shortage and the very same foundries that are booked out for years.


Gaming laptops are notorious for dying from overheating. These things need to be meticulously maintained if you want to use them for their intended purpose for long.


Which is totally fine. Not every game has to support older hardware. Games are allowed to use “newer” tech.

Worth noting that I played Indy at 1600p/60 on an RTX 2080, which is a card from 2018 that I bought used for 200 bucks two years ago. This card can still run every single game out there and most of them extremely well, despite only having 8 GB of VRAM.

The whole debate is way overblown. This doesn’t mean that there aren’t games that could run a whole lot better, but overall, PC gamers with old hardware are still eating good.


This heavily depends on the game. Which game were you testing?

In my experience at least, small Indies and last-gen or earlier ports from console are fine, but games with frequent loading times and those designed for SSDs benefit from being installed to the internal storage.



That’s a single risk you (hopefully) willingly accepted when you decided to use an extremely niche browser.

Been there, done that, eventually discovered the error of my ways and returned to Firefox.


Rendering somewhere above 720p and upscaling to 1080p with FSR should work for 60 fps. The lower render resolution is likely enough to compensate for the lack of VRAM. This is all theoretical, of course.

How’s your CPU situation?


Bit late, but in theory, it should work just fine, even at 60 fps, given that the Steam Deck’s display resolution is only about half of 1080p.


Use supersampling. Either at the driver level (works with nearly all 3D games - enable the feature there, then select a higher than native resolution in-game) or directly in games that come with the feature (usually a resolution scaling option that goes beyond 100 percent). It’s very heavy on your GPU depending on the title, but the resulting image quality of turning several rendered pixels into one is sublime. Thin objects like power lines, as well as transparent textures like foliage, hair and chain-link fences benefit the most from this.

Always keep the limits of your hardware in mind though. Running a game at 2.75 or even four times the native resolution will have a serious impact on performance, even with last-gen stuff.

Emulators often have this feature as well, by the way - and here, it tends to hardly matter, since emulation is usually more CPU-bound (except with very tricky to emulate systems). Render resolution and output resolution are often separate. I’ve played old console games at 5K resolution, for example. Even ancient titles look magnificent like that.
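The downscale step described above - several rendered pixels averaged into one output pixel - can be sketched with a simple box filter. Real drivers use fancier filters, and the `downsample` helper below is purely illustrative, but the core idea is the same:

```python
# Minimal sketch of supersampling's downscale step: render at a multiple of the
# target resolution, then average each block of rendered pixels into one output
# pixel (a simple box filter). Illustrative only; drivers use better filters.

def downsample(image, factor):
    """Average each factor x factor block of grayscale pixels into one."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A 4x4 "render" downsampled 2x to a 2x2 output; jagged edges get smoothed out:
rendered = [[0, 100, 200, 200],
            [100, 0, 200, 200],
            [50, 50, 0, 0],
            [50, 50, 0, 0]]
print(downsample(rendered, 2))  # [[50.0, 200.0], [50.0, 0.0]]
```

This averaging is why thin, high-contrast details like power lines and chain-link fences benefit so much: sub-pixel information that would otherwise flicker in and out gets blended into the final pixel.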


Someone should compare it to the unofficial port (also known as the “Brazil project”, which has been out for a while now) and see which one runs more smoothly.


I vaguely recall playing one of the two about 20 years ago (looking at the screenshots, I think it was the second game). It was a bonus game on a CD of some computer or gaming magazine. Even two decades ago, and that close to its release, it already felt unbelievably dated and clunky. The PC port was also complete garbage, with lots of bugs, awful visuals even by PS1 port standards and poor controls.

If you’re nostalgic for these games, they might be worth revisiting (although you’re probably remembering them as more impressive than they actually were), but if you’re not, I doubt they are worth picking up, even with the improvements from GOG.

Just to compare these two to another dinosaur game from that era that received similarly poor reviews as the PC version of Dino Crisis, Trespasser was far more sophisticated and fun, in my opinion at least - and certainly a technical marvel by comparison. It’s not just that it’s fully 3D, with huge open areas (not possible on PS1, of course), but also the way it pioneered physics interaction. My favorite unscripted moment was a large bipedal dinosaur at the edge of the draw distance stumbling - possible thanks to the procedural animations - and bumping into the roof of a half-destroyed building, resulting in its collapse. That’s outrageous for 1998! I’ve only ever seen this happen once at this spot in the game, so it’s certainly not scripted.




Looking at the screenshots, I thought it was a port of a mid-gen PS4 game, but apparently, it’s a one-year-old former PS5 exclusive. Then again, this might explain the modest hardware requirements. You don’t often see the minimum GPU for a AAA open-world game being a GTX 1060 6 GB (a card from 2016) anymore. Perhaps it’ll run well on the Steam Deck, which is always appreciated. Reviews are solid enough that I might pick it up on sale.


Maxwell came out in 2014, Pascal in 2016 and Volta in 2017. Pretty long run for Maxwell in particular.

I was curious about the AMD side of things: They reduced support for Polaris (2016) and Vega (2017) in September 2023 already.


Thanks for the PSA, I had kind of forgotten about this wretched DRM.


It’s based on the Xbox 360/PS3 console port of the game. People figured this out pretty quickly, because a VTOL flying mission that these consoles couldn’t handle was missing from the remaster as well (it was later added back in with a patch). Colors are oversaturated; texture, object and lighting quality are down- or sidegraded (many assets aren’t worse on a purely technical level, just different without being better, for no reason, as if the outsourced Russian devs had a quota of changed assets to fill); and lots of smaller and larger physics interactions are gone, because they were never part of that old console port. The AI (quite a big selling factor of the original, on top of the graphics and physics) is simplified as well. The added sprinkles of ray-tracing features here and there, as well as some nicer water physics, do not make up for the many visual and gameplay deficiencies. On top of that, there are game-breaking glitches that weren’t part of the original.

The biggest overall problem I have with it is that it just doesn’t look and feel like Crysis anymore and instead has the look of a generic tropical Unity Engine survival game. The original had a very distinct visual identity: a muted, realistic look, but with enough intentional artistic flourishes to make it more than just a groundbreaking attempt at photorealism. You can clearly see this if you compare the original hilltop sunrise to the remaster. Crysis also had an almost future-milsim-like approach to its gameplay that is now a shell of its former self.

I will admit that to the casual player, many of these differences are minor to unnoticeable. If you haven’t spent far too much time with the original, you’re unlikely to catch the vast majority of them and might just notice how ridiculously saturated everything looks.

At the very least you can still buy the original on PC. On GOG, it’s easy to find, but on Steam, it’s hidden for some reason [insert speculation as to why here]: If you use Steam’s search function, only the remaster appears. You have to go to the store page of the stand-alone add-on Crysis Warhead (the only Crysis game that did not receive the remaster treatment, likely because it was never ported to console), which can be purchased in a bundle with the original as the Maximum Edition (this edition also does not appear in the search results): https://store.steampowered.com/sub/987/


I’m glad it’s based on the original game instead of the hideous remaster.



Are you seriously suggesting that in the age of rage-fueled campaigns against any game that even dares to show a non-male in a positive light or permits the player to change pronouns, user scores are reliable?

You yourself are bringing this culture war bullshit up as if it was valid criticism, which means I have a hard time taking anything you’re writing even remotely seriously.



I know I wrote perhaps a bit too much, so it’s understandable that you glossed over it, but the thing with 4x DLSS 4 frame generation compared to what you’ve tried on your 4090 is that it should result in lower perceived latency.


Can we stop with the fake frame nonsense? They aren’t any less real than other frames created by your computer. This is no different from the countless other shortcuts games have been using for decades.

Also, input latency isn’t “sacrificed” for this. There is about 10 ms of overhead with 4x DLSS 4 frame gen, which however is easily compensated for by the increase in frame rate.

The math is pretty simple: At 60 fps native, a new frame needs to be generated every 16.67 ms (1000 ms / 60). Leaving out latency from the rest of the hard- and software (since it varies a lot between input and output devices and even from game to game - not to mention there are many games where the graphics and e.g. physics frame rates differ), with three extra frames generated per “non-fake” frame, we are seeing a new frame on screen every 4.17 ms (assuming the display can output 240 Hz). The system still accepts input and visibly moves the viewport based on user input between “fake” frames using reprojection, a technique borrowed from VR (where older approaches already work exceptionally well in my experience, even at otherwise unplayably low frame rates - provided the game doesn’t freeze), which means we arrive at 14.17 ms of latency including the overhead, but four times the visual fluidity.

It’s even more striking at lower frame rates: Let’s assume a game is struggling to run at the desired settings and just about manages to achieve 30 fps (current example: Cyberpunk 2077 at RT Overdrive settings and 4K on a 5080). That’s one native frame every 33.33 ms. With three synthetic frames, we get one frame every 8.33 ms. Add 10 ms of input lag and we arrive at a total of 18.33 ms, close to the 16.67 ms input latency of native 60 fps. You cannot tell me that this wouldn’t feel significantly more fluid to the player. I’m pretty certain you would actually prefer it over native 60 fps in a blind test, since the screen gets refreshed 120 times per second.
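The arithmetic from both examples above can be condensed into a few lines. The flat 10 ms frame-generation overhead and the 4x multiplier are the same assumptions used in the text; real overhead varies by game and hardware:

```python
# Latency arithmetic for frame generation, assuming a flat 10 ms overhead
# and three generated frames per rendered frame (4x mode).

def frame_gen_latency_ms(native_fps: float, overhead_ms: float = 10.0,
                         multiplier: int = 4):
    """Return (displayed frame interval, total latency) in milliseconds."""
    native_frame_time = 1000 / native_fps          # time per "real" frame
    displayed_interval = native_frame_time / multiplier  # time per displayed frame
    return displayed_interval, displayed_interval + overhead_ms

print(frame_gen_latency_ms(60))  # ~ (4.17, 14.17) - the 60 fps example
print(frame_gen_latency_ms(30))  # ~ (8.33, 18.33) - the 30 fps example
```

Both results match the figures above: even starting from 30 fps, the total stays close to the 16.67 ms frame time of native 60 fps while the display refreshes four times as often.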

Keep in mind that the artifacts from previous generations of frame generation, like smearing and shimmering, are pretty much gone now, at least based on the footage I’ve seen, and frame pacing appears to be improved as well, so there really aren’t any downsides anymore.

Here’s the thing though: All of this remains optional. If you feel the need to be a purist about “real” and “fake” frames, nobody is stopping you from ignoring this setting in the options menu. Developers will however increasingly be using it, because it enables higher settings that were previously impossible to run on current hardware. No, that’s not laziness; it’s exploiting hardware and software capabilities, just like developers always have.

Obligatory disclaimer: My card is several generations behind (RTX 2080, which means I can’t use Nvidia’s frame gen at all, not even 2x, but I am benefiting from the new super resolution transformer and ray reconstruction) and I don’t plan on replacing it any time soon, since it’s more than powerful enough right now. I’ve been using a mix of Intel, AMD and Nvidia hardware for decades, depending on which suited my needs and budget at any given time, and I’ll continue to use this vendor-agnostic approach. My current favorite combination is AMD for the CPU and Nvidia for the GPU, since I think it’s the best of both worlds right now, but this might change by the time I’m making the next substantial upgrade to my hardware.


Just because a far simpler type of interpolation has existed before, this doesn’t mean that this type is “decades old”. You know that.



Speaking as one of the three people who didn’t hate the game: the rather wholesome atmosphere it ended up having was quite pleasant to experience. Exactly what I needed when it came out.






It would probably work in Adobe Reader. Surprisingly, it works in Firefox’s built-in PDF viewer, which has iffy compatibility with some more complex PDFs, but it might not work in the very different iOS build of this browser.