Make the software work on ARM so people can do their day to day tasks, then the games will come.
macOS has had a decent push for games on Apple Silicon Macs recently. With how powerful the iGPU is, even a lowly MacBook Air can run modern games. The problem is just getting more developers to pay attention to anything outside of Windows (i.e., nothing has changed in the last 20 years). Proton is the only reason the Steam Deck is as good as it is. A lot of native Linux ports just straight up suck, and the Proton version runs better.
Power (heat). Modern (especially high-end) CPUs and GPUs can see some impressive reductions in power draw with not too much of a hit to performance.
My 4090, underclocked basically as far as it will go, draws about 200 watts under normal loads and 250-300 under pretty extreme loads, but the performance penalty is only about 15%.
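Back-of-the-envelope math on what that trade is worth, a minimal sketch assuming the 4090's stock 450 W board power and taking the ~200 W / ~15% figures above at face value:

```python
# Rough perf-per-watt comparison for the underclocked 4090 described above.
# Assumptions: ~450 W stock board power, stock performance normalized to 1.0.
stock_power = 450        # W (4090 stock board power, assumed)
uv_power = 200           # W, underclocked draw under normal loads
perf_penalty = 0.15      # ~15% performance loss

stock_eff = 1.0 / stock_power                 # perf per watt at stock
uv_eff = (1.0 - perf_penalty) / uv_power      # perf per watt underclocked

print(f"efficiency gain: {uv_eff / stock_eff:.2f}x")  # ~1.91x perf/W
```

So under those assumptions you give up ~15% performance for nearly double the efficiency, which is why power limiting is so popular on that card.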
Normally, when you want to save power in a desktop, you just throw a laptop CPU or GPU in there, because those are tuned for better performance per watt… or at least avoid Intel CPUs, because AMD is so much better in that regard.
High end graphics cards have become so expensive that people can’t afford gaming with good graphics
Not only that, but mid-range cards just haven't moved that much in terms of performance. The ultra high end used to be terrible value, only for people who wanted the best and didn't care about money. Now it almost makes sense from a performance-per-dollar standpoint to go ultra high end: at launch the 4090 offered almost twice the performance of the 4080 but cost only about 1.5x as much. And somehow the value gets worse the further down the stack you go.
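Plugging the quoted launch ratios into the value comparison (taking the "~2x performance at ~1.5x price" claim above as given, not real benchmark data):

```python
# Performance-per-dollar using the launch ratios quoted above:
# 4090 ≈ 2x the 4080's performance at ≈ 1.5x its price.
perf_ratio = 2.0    # 4090 performance relative to 4080 (assumed from the comment)
price_ratio = 1.5   # 4090 price relative to 4080 (assumed from the comment)

value_ratio = perf_ratio / price_ratio
print(f"4090 perf/$ relative to 4080: {value_ratio:.2f}x")  # ~1.33x
```

By those numbers the halo card is about a third better value per dollar than the card below it, which is exactly backwards from how GPU stacks historically priced out.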
Meanwhile, mid-range cards like the 4060 and 7600 (which used to be some of the best values) are barely outperforming their predecessors.
I wonder if I need a separate burner for each suspicious app.
That’s going pretty far overboard. Just use an app like Island to forcibly isolate the app and stop it from running when you don’t want it.
This is a pretty good answer. https://android.stackexchange.com/questions/241281/how-exactly-do-apps-not-running-in-the-background-receive-notifications
Up until Android 14, I think. Android 13 definitely does not support it.
Unless an ADB trick counts: https://tasker.joaoapps.com/userguide/en/help/ah_secure_setting_grant.html
I thought they still made games for the Xbox One (the non-Series X/S console), but apparently they stopped in 2023.
Well, either way, I felt the same way about the original one. I got mine in 2018 and I’ve played less than 100 hours of games on it, and never actually bought any games for it (I got it second-hand with about 6 games).
In general I think AAA games just aren’t worthwhile, let alone AAA exclusives. There are a lot of great indie games, but finding them means wading through the sludge of mediocre ones.
Is it really enshittification if it was like this from essentially the start?
Outside of literally the very first few Android phones, when there weren’t even apps for the platform, they came bundled with all sorts of shit. I remember my HTC from 2010 drove me insane with the bundled junkware, and that was only about two years after the first Android phone. We’re at more or less steady-state enshittification (it varies by phone/brand).
Like, we can obviously still follow news and whatnot
I stopped following the news first, then largely lost interest in new games after that. Since TotalBiscuit passed I haven’t sought out any video game news or reviews. If there’s something I’m interested in I might skim through a review, but that’s the most I do.
Original MSRP of the A770 was $330, so that is a big improvement. I assume Intel is sticking with a reasonable launch MSRP to set expectations right.
https://www.tomshardware.com/news/intel-arc-a750-a770-full-pricing-revealed
It would be incredibly stupid for Intel to abandon the dGPU market after spending all this money on it. As long as Battlemage turns out alright (basically its only goal), I doubt it will go away.
They cut the die size nearly in half, so they’re no longer blowing a fuck ton of money on a $200 GPU. As long as utilization of the silicon goes up, it should be fine.
Blu-rays still make up the majority of physical sales.
I actually really dislike DLSS and FSR. To me the upscaling is pretty noticeable (though not the end of the world), but the artifacts it causes drive me insane. I haven’t tested Onion Ring with it, but in FH5, for example, I get all sorts of ghost images on my screen, and they go away as soon as I turn off DLSS.
Also, weirdly, I got worse performance on my laptop. I’m assuming that’s because I’m 100% CPU-limited and there’s a bit of CPU overhead to running DLSS.
Intel’s CEO says ‘large’ integrated GPUs are the way forward.
You didn’t even have to click on the article; it was in the preview text. And that’s exactly what Intel has been doing with their Core Ultra 100 and 200 series CPUs (that’s what they’re called, right?). The Arc 140V in Lunar Lake, while not cleanly beating AMD’s 890M, puts up a pretty good fight. And that’s in the severely power-constrained Lunar Lake CPUs, with Arc’s horribly unoptimized silicon and drivers. https://www.youtube.com/watch?v=eg74aUQGdSg
If Intel can figure out how to slim down the Battlemage silicon to make it more efficient (in both area and power), then they could offer some actual competition for AMD.
plus
“One of the things about Horse Armour that you have to remember is Bethesda, I believe, was the very first company to do downloadable content expansions,” Nesmith told us. “Nobody had done that before for the platforms. We literally pioneered that. And so Bethesda didn’t know what the hell it was doing at the time. We didn’t know!”
PCIe tunneling.