
I mean, TES VI could be a rickroll mp4 and still sell millions of copies. There’s a megaton of nostalgia, and gamers are demonstrably… not the smartest shoppers, in aggregate.
Starfield and FO76 are not commercial failures, even if they aren’t hits either.
Point being, BGS is not short on time. I posit they have at least one “freebie” no matter what, or maybe a few more mediocre releases that will still sell big.

Actually this makes perfect sense.
Starfield is… trying to be part Mass Effect with big-budget cutscenes, but it has less charisma than Wrex has in his toe.
I’d argue it’s a bad “Bethesda wandering RPG,” without the quirky, charming side areas Oblivion or even Fallout 76 have.
But it’s an alright No Man’s Sky-like.
You want some crafting? Looting? A vast amount of chill exploration area? Reasonable “I’m in space” fidelity and tasks to tickle your brain? Starfield’s got it in droves. BGS games scratched this NMS kind of “looting exploration sandbox” itch for some, when there was no big-budget alternative back then, and I think Starfield leans into it more.
Hence my hypothesis is that gamers who love No Man’s Sky like Starfield, those who are looking more for “Mass Effect 2” loathe Starfield. And you and @[email protected] seem to be further datapoints supporting my observations.
The problem is that, for most of us internet dwellers, the expectation for Starfield was “Skyrim but Mass Effect.” And it’s kind of Bethesda’s fault for setting that expectation instead of leaning into Starfield’s real niche (and for wasting cash on what BGS isn’t very good at).

There are some very efficient games using UE5, like Satisfactory.
On the contrary, I’m afraid of custom engine games. Even if they ultimately turn out okay, the dev hell required to get them there often sinks the game. See: ME: Andromeda, Cyberpunk 2077. And Distant Worlds 2 (even though it wasn’t technically fully custom).
IMO the best path is choosing the game engine for your niche. As an example, Cryengine was practically made for KCD2’s European forests and medieval towns. Larian’s Divinity engine is literally made for a D&D-type game like BG3.

Jank is alright, as pointed out:
“This is why I love PC gaming,” Zukowski says. “There’s just more acceptance of jank.” Whereas consoles, with their stricter approvals and more cumbersome patch pipelines, have “so much cert to go through.”
But:
“Big publishers charging $70 or $80 might find PC gamers less tolerant of jank.”
That, and mtx spam and being boring.
Like, I can look past a janky storefront if the game is sublimely written. I might run some anticheat. But AAAAs seem hell bent on achieving that miserable trifecta, and charging for it. I think the combination is more poisonous than the individual ingredients.
Yeah. Motorsport should have been right up my alley but… what the heck are they doing with MP?
Last I played, the only half-usable race was the mixed-class one, as it was medium length instead of short. That meant soft tires weren’t the uber-end-all meta, the start wasn’t such an apocalypse, and one troll knocking you off didn’t end your whole race (like it does when the race is only 3 laps). And you actually had time to pass.
I think they did away with it, and with no reason to touch the SP campaign with the stupid AI… I just quit? Kinda with the feeling you get after mediocre fast food. “Why did I eat that?”
Which sucks, as some of the cars are so much fun. I love the can am monsters, the ancient Le Mans cars, the quirky supercompacts and such, all sharing a circuit. What a waste.
Horizon 5 rallying feels great, but only on long-travel suspensions that don’t bounce over the road like a cartoon.
Try the RJ Anderson #37 Pro2 truck, give it 4WD, soften the suspension/tires, fatten the rear tires and take it on that downhill mountain course. It’s utter bliss. I also like the Ford Ranger T6 on flatter courses, and the Rally Fighter for RWD fun.
…But yeah, FH5 is too arcadey. Cross country is just miserable outside of the slowest class. The campaign is so sycophantic and stupid, and MP matchmaking racing is utterly broken. I’ll probably skip 6 too.
Also, if y’all are interested, run local models!
It’s not theoretical.
The cost of hybrid inference is very low; you can squeeze Qwen 35B onto a 16GB RAM machine as long as it has some GPU. Check out ik_llama.cpp and ubergarm’s quants in particular:
https://huggingface.co/ubergarm/models#repos
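For anyone curious what “hybrid inference” looks like in practice, here’s a rough sketch of serving a quantized MoE model with ik_llama.cpp, splitting it between GPU and system RAM. The model filename and tensor-override pattern are illustrative, not exact; check the ik_llama.cpp docs and each quant’s model card for the recommended flags.

```shell
# Hypothetical sketch, not a tested command line.
# -ngl 99 offloads all the layers that fit to the GPU;
# -ot "exps=CPU" overrides placement so the large MoE expert
#   tensors stay in system RAM (this is what keeps VRAM use low);
# -c sets the context length.
./llama-server \
  -m Qwen3-30B-A3B-IQ4_K.gguf \
  -ngl 99 \
  -ot "exps=CPU" \
  -c 16384
```

The trick is that in MoE models only a few experts are active per token, so parking the expert weights in ordinary RAM costs far less speed than you’d expect.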
But if you aren’t willing to even try, I think that’s another bad omen for local models. Like the Fediverse, it won’t be served to you on a silver platter, you gotta go out and find it.
…Without cash, though?
We’ve had an obvious, somewhat proven path to uber fast local inference (bitnet), but no one has taken it. No one is willing to roll the dice with a few multi-million dollar training runs, apparently, and this is true of dozens of other incredible papers.
It seems like organization around local model tinkering is hanging by a thread, too. Per usual, client business will barely lift a finger to support it.
So while I’m a local acolyte, through and through, I’m a bit… disillusioned. It doesn’t feel like anyone is coming to save us.

That’s too far.
MSAA, SMAA, temporal AA, heaven forbid FXAA, they all suck. So does resolution scaling; if you can’t run native, you take a massive hit.
I don’t want to go back to that world of juggling between them or suffering with AA off or hacked in. It’s easy to say “oh, just code it better,” but all these solutions are inefficient on modern hardware; go back, and you leave performance on the table.
DLSS/XeSS/FSR4 and Unreal’s scaling are very convenient solutions. They antialias beautifully and scale to your monitor wonderfully. They’re not universal, but they look fantastic as long as the hardware supports them, the base resolution is reasonable, and performance is alright.
Now, frame gen is too much of a mixed bag, and DLSS5 as demoed is obviously too far.

FYI y’all should check out Age of Wonders 4/Planetfall too.
It’s not the same, but it’s adjacent. Most of the game is moving your heroes around the map to explore, clear ruins, fight in hell and such, and the point of the civ-like part of the game is essentially to supply said heroes with units for the turn-based clashes.

Quite Mass Effect Andromeda-ish.
…Which is an underrated game, IMO. Yes, the main quests and characters have the charisma of sticks in mud, but it has some neat side quests, kinda like a BGS RPG. I liked the Ryder-SAM dynamics, and counter to the “my face is tired” meme, the animations, graphics, combat and everything are all great. Mechanically, it feels infinitely better than Starfield.
So yeah, I’ll take more of that. Even if it doesn’t have ME trilogy-like characters, which is kinda an unreasonable expectation.
What I do not want is another Starfield.

Yeah, the ASICs in newer TVs are crazy powerful, and crazy good at it. They’re nothing like what you’d find in a phone or even a PC, and even a one-generation jump for our Sony TVs was an improvement.
That’s what I was trying to emphasize. I think interpolation on old TVs, and maybe early versions of SVP, left a bad taste in people’s mouths. Kind of like fake HDR.
…But I also think there’s a lot more sentiment against any kind of “processing” since the rise of AI slop.
As an example I often cite, there was this old TV show I helped touch up for a “fan” release, a long time ago. One small component in a very long pipeline was a GAN upscaler… It worked fine. The original TV release was broken as hell, and people loved the improvement.
Fast forward many years later, and I mention this was used in the “remaster” still floating around, and the same subreddit goes ballistic. They literally did not believe me, or cooed about the “flaws” of the original, or called it slop and against the rules and wanted me banned.
And I suspect frame interpolation and resolution scaling in other contexts get tossed in that same bucket. Not that I blame anyone. AI does suck.

I’d like an oled, but with the prices, I really have no need for it for gaming and the TV I have is fine for normal watching.
That is entirely fair. Electronics are all crazy expensive, really.
Yeah, LCDs went from bad to “mixed” and stayed that way for a long time. Granted, some things like absolute sharpness are not great on a CRT, but still.

I have. I A/B test it all the time. I pause and pixel peep.
And I don’t watch any sports, nor any marvel movies.
“huh, I guess the lag on my flat screen isn’t too bad for gaming”
I’ve had CRTs. And I have one of those “zero latency” overclocked LCD monitors with no internal scaler. As much as I like them, they feel sluggish compared to something newer.
Yeah sorry I’m not into high def TV myself.
In that case, I suspect you haven’t tried it on more modern displays, or when it’s baked into transcoded footage with one of the better filters.
Yes, it looks awful and artifacty processed by older LCDs. But it looks really good these days.

Agreed.
The effect is waaaaay too strong in those screenshots, but a more subtle version would be alright.
And yes. It’s definitely “sexifying” the woman in the shot. Transformer img2img models are notorious for this.

I could speculate why. Could be that it’s (unfortunately) mostly male Tech Bros developing them? Or it could be that a massive fraction of the dataset is sexualized photos of women scraped from social media. But TBH, while I don’t know why this is the case, pretty much all diffusion models tend to “Instagram” women more than men.

Depends what you define as “AAA”
Baldur’s Gate 3, for instance, has no nonsense, and every word out of the director’s mouth is “we made this decision because it’s what our developers wanted.” But while the dev team is “AAA” large, Larian doesn’t really fit the mould of Ubisoft, EA, Rockstar, Blizzard or whatever.
I think we need a new designation for what are basically megacorp operations.

So the next consoles would be cloud/streaming consoles only.
They very well could be.
The hardware is near-identical though, or at least it was for PS Now. So the barrier to re-use game streaming hardware for a physical console is fairly low.
I think you’re being quite a bit disingenuous here. AMD hasn’t made a “highest end GPU variant” in a literal decade. They’ve never had a competitor to the Titan cards nor the *90 variants, and with the *80 variant slowly taking over the top-end consumer spec (because the *90 took over the Titan classification), all of this isn’t because of AI. It’s just AMD lagging behind the entire time. And I love AMD, but they’ve never been known for highest end. And Intel has NEVER made a highest end GPU variant. So not sure where that claim is coming from.
It’s about silicon size to me. Even if a bit behind Nvidia’s mega dies, AMD made “big die” cards consistently, like the 6970, 7970, 290, Fiji, Vega 64, the 6900, 7900 XTX. But the 9000 series is different. The top-end 9070 XT is “only” 356.5 mm2 and 256-bit; a mid-range size. The only recent precedent for that is the RX 480, but those were cheaper and sold alongside higher end GPUs.
And with Arc Battlemage, Intel allegedly had a bigger die in the works, but canceled it. Presumably because they didn’t think it was financially viable.
You make fair points. I’m probably panicking and being a little dramatic here… Custom SoCs would probably be questionable if regular graphics are.
But I still don’t like the trajectory. It feels like AMD/Intel are struggling to even stay alive in the space, while Nvidia seems to think it’s not so important, and I don’t like where that goes.

And the same can’t be said for consoles?
Console chips are high volume, single SoC, ordered by one reliable customer (Sony), and can make the transition to cloud gaming if they have to. Sony’s already experimented with this, actually.
Discrete PCIe GPUs and “desktop” CPUs, on the other hand, are:
Mostly consumed by gaming laptops, which OEMs could very well abandon.
Partially consumed by workstations and gaming desktops.
But repurposed server chips can serve workstations, while tablets and thin clients can eat the desktop/gaming-laptop market from the bottom up, too. The niche that assembles higher-end gaming PCs isn’t enough to amortize the massive cost of such gaming GPUs by itself (hence AMD and Intel have already abandoned their highest-end GPU variants).

Yeah.
TBH there are too many PC games. It’s overcrowded. Sony has some great studios, but it’s not like the platform will wither because they leave.
But like someone said, I’m more worried Sony thinks PC hardware won’t be viable anymore, and is exiting a dying platform. I know that seems inconceivable now, but in a few years AMD/Intel/Nvidia could easily decide higher-end gaming hardware is just not worth developing.
It’s already started, seeing as AMD and Intel have already cut some GPUs and Nvidia is allegedly pondering the same.
It’s discounted on GOG!
https://www.gog.com/en/game/clair_obscur_expedition_33
Which is where it should be bought anyway, as it will be DRM free and the lightest to run.