Maybe it would help if games consistently had some kind of really reliable auto-profiling mechanism that could run various “stress test” scenes at a variety of settings to find reasonable defaults for the hardware at hand…
…like most games from the early ’10s? A lot of them had a built-in benchmark that tested what your PC was capable of and then set things up for you.
Yeah, there are auto-calibration systems, but that’s why I’m emphasizing “reliably”. I’ve had some of them, for whatever reason, fail to ramp up quality settings on hardware from a decade later even though it can run the game smoothly, which is irritating. In fairness to the developers, they can’t test on future hardware, but I still don’t understand why it happens. Maybe there’s some set of hard-coded assumptions that falls apart somewhere down the line.
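For what it’s worth, here’s a minimal sketch of both halves of this idea (every name and number is made up for illustration, not any engine’s real API): the first part measures the actual hardware by running a canned stress scene at each preset and keeping the best one that holds the frame-time target; the second part shows the kind of hard-coded fallback that could explain the future-hardware problem, where any GPU not in a shipped lookup table drops to a conservative default no matter how fast it is.

```python
import random
import statistics

PRESETS = ["low", "medium", "high", "ultra"]   # ordered worst to best
TARGET_FRAME_MS = 16.7                          # ~60 fps

# Hypothetical engine hook, stubbed with simulated frame times so the
# sketch runs standalone; a real implementation would render a canned
# benchmark scene at `preset` and time each frame.
def run_stress_scene(preset: str, frames: int = 300) -> list[float]:
    base_cost = {"low": 6.0, "medium": 10.0, "high": 15.0, "ultra": 24.0}[preset]
    return [base_cost + random.uniform(0.0, 4.0) for _ in range(frames)]

def auto_profile() -> str:
    """Walk presets from best to worst and keep the first one whose
    95th-percentile frame time stays under the target."""
    for preset in reversed(PRESETS):
        p95 = statistics.quantiles(run_stress_scene(preset), n=20)[-1]
        if p95 <= TARGET_FRAME_MS:
            return preset
    return PRESETS[0]   # nothing held the target; settle for low

# The failure mode speculated about above: a lookup table of GPUs known
# at release, with a conservative default for everything else. A decade
# later, every new card lands in the "unknown" branch.
KNOWN_GPU_TIERS = {"GTX 680": "high", "HD 7970": "high", "GTX 550 Ti": "medium"}

def detect_preset_by_name(gpu_name: str) -> str:
    return KNOWN_GPU_TIERS.get(gpu_name, "medium")   # unknown -> stuck at medium

print(auto_profile())                      # measured: picks what the hardware can do
print(detect_preset_by_name("RTX 4080"))   # hard-coded: "medium", forever
```

Measuring on the actual hardware sidesteps the lookup-table trap entirely, which may be part of why those early-’10s built-in benchmarks aged better than name-based detection.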