PC gaming itself will hardly change, because AMD cards work just fucking fine. They’ve only ever been a little bit behind on the high end. They’ve routinely been the better value for money, and offered a much lower low end. If they don’t have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute.
Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to clone CUDA already, so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should’ve seen Nvidia chopped in half, long before this stupid bubble.
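To be concrete about what “clone CUDA” would mean in practice, here’s a minimal sketch (my own illustration, not from the original post): a stock CUDA SAXPY kernel. AMD’s existing HIP layer already mirrors this runtime API nearly symbol-for-symbol (cudaMalloc → hipMalloc, same `<<<>>>` launch syntax), which is the point — the barrier is licensing and ecosystem, not the code itself.

```cuda
// Plain CUDA SAXPY. hipify-perl can translate cuda* calls to hip* mechanically,
// and the kernel body itself needs no changes at all.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) y[i] = a * x[i] + y[i];              // y = a*x + y
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host buffers
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device buffers (hipMalloc / hipMemcpy on the AMD side)
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover n elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaDeviceSynchronize();

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

Compile with `nvcc` for Nvidia, or run it through `hipify-perl` and build with `hipcc` for AMD; in my experience the diff is little more than the prefixes.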
Meanwhile:
Cloud gaming isn’t real.
Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible and the difference in capabilities did not matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.
As desktop prices rose and video encoding sped up, people kept selling the idea that you’d buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well… nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines did in the AOL era, before adjusting for inflation. That console will do raytracing, except games don’t use it much, because it doesn’t actually look better than how hard we’ve cheated with rasterization. So what the fuck is a datacenter going to offer, with 50ms of lag and compression artifacts? Who expects it’s going to be cheaper, as we all juggle five subscriptions for streaming video?