
So the next consoles would be cloud/streaming consoles only.

They very well could be.

The hardware is near-identical though, or at least it was for PS Now. So the barrier to re-use game streaming hardware for a physical console is fairly low.

I think you’re being quite a bit disingenuous here. AMD hasn’t made a “highest end GPU variant” in a literal decade. They’ve never had a competitor to the Titan cards or the *90 variants, and with the *80 variant slowly taking over the top-end consumer spec (because the *90 took over the Titan classification), all of this isn’t because of AI. It’s just AMD lagging behind the entire time. And I love AMD, but they’ve never been known for the highest end. And Intel has NEVER made a highest end GPU variant. So not sure where that claim is coming from.

It’s about silicon size to me. Even if a bit behind Nvidia’s mega dies, AMD made “big die” cards consistently, like the 6970, 7970, 290, Fiji, Vega 64, the 6900, 7900 XTX. But the 9000 series is different. The top-end 9070 XT is “only” 356.5 mm² and 256-bit; a mid-range size. The only recent precedent for that is the RX 480, but those were cheaper and sold alongside higher end GPUs.

And with Arc Battlemage, Intel allegedly had a bigger die in the works, but canceled it. Presumably because they didn’t think it was financially viable.


You make fair points. I’m probably panicking and being a little dramatic here… Custom SoCs would probably be questionable if regular graphics are.

But I still don’t like the trajectory. It feels like AMD/Intel are struggling to even stay alive in the space, while Nvidia seems to think it’s not so important, and I don’t like where that goes.


And the same can’t be said for consoles?

Console chips are high volume, single SoC, ordered by one reliable customer (Sony), and can make the transition to cloud gaming if they have to. Sony’s already experimented with this, actually.

Discrete PCIe GPUs and “desktop” CPUs, on the other hand, are:

  • Mostly consumed by gaming laptops, which OEMs could very well abandon.

  • Partially destined for workstations/gaming desktops.

But repurposed server chips can serve workstations, while tablets and thin clients can eat the desktop/gaming laptop market from the bottom up, too. The niche that assembles higher end gaming PCs isn’t enough to amortize the massive cost of such gaming GPUs by itself (hence AMD and Intel have already abandoned their highest end GPU variants).


Yeah.

TBH there are too many PC games. It’s overcrowded. Sony has some great studios, but it’s not like the platform will wither because they leave.

But like someone said, I’m more worried Sony thinks PC hardware won’t be viable anymore, and is exiting a dying platform. I know that seems inconceivable now, but in a few years AMD/Intel/Nvidia could easily decide higher end gaming hardware is just not worth developing.

It’s already started, seeing as AMD and Intel have already cut some GPUs and Nvidia is allegedly pondering the same.


Yeah.

That’s my sad read. They think gaming PCs are going to die as a market; might as well get out now and push PlayStation instead.


Sounds like FUD, from all angles. No one but Tech Bros and their bots cares about crypto, and porn has always been porn. There are literally thousands of great, dirt cheap games to play.

But anecdotally, I observed YouTube suck away a lot of attention from games and TV in my family. It’s lower brainpower, so if one is (say) dog tired from work, the algorithm has a lot of appeal vs a hardcore KCD2 session or an intense TV drama.

Discoverability is certainly an issue too. If game advertising boils down to “The TikTok algorithm and YouTube influencers,” then of course the predatory casino games are going to win that war.


Not sure why you’re so sure that cloud would be the next winner either.

Because, in aggregate, gamers are stupid consumers.

I hate to be so blunt, but they have, repeatedly and demonstrably, made uninformed purchases. They buy bad games on launch day, complain, then turn around and do it again. They buy hardware known to be a lemon. Heck, they’ll hardly even look at AMD or Intel GPUs now simply because there isn’t a minimum amount of effort made to shop around.

They are going to just buy the cloud gaming subscriptions if that’s all that’s financially viable, and it’s what’s popular in their YouTube feeds or Discord channels or whatever.

Keep in mind that I’m talking about the bulk market. Sure, plenty of us will turn our nose up. But the R&D required to develop consumer hardware requires volume, so updates will get slimmer with less money in the pool. It’s already happened with the AMD 9000 GPUs (as shrinking sales could not justify a big-die 7900 successor).


Which would you rather have as the dominant platform: consoles, or cloud gaming?

Because if “market conditions” kill consoles, they will shrink PC gaming hardware sales too, and I don’t want a world where devs target cloud gaming first.

I’m not trying to defend consoles and their predatory practices, but you can’t separate them out. If subsidized console hardware is too pricey to sell, then PC gaming components will absolutely atrophy too.


1st party engine devs have been stuck in dev hell, mostly. There are some exceptions, like you said; I’d cite Decima as another success.

But think of EA’s Frostbite, Cyberpunk 2077, Halo Infinite, Clausewitz, BGS, and many more. Especially indies that try.

It’s not just that old games crunched; making a new engine that supports modern platforms and modern hardware is an immensely complex task. There’s just too much to worry about.

The best success seems to either come from:

  • Hyperfocusing the engine’s scope on one game niche. Larian’s Divinity engine, for example, makes BG3-likes; that’s it, that’s all it does. It cannot make an FPS or even a different kind of RPG.

  • Engine shopping very, very carefully. For instance, KCD2 leaned hard into CryEngine’s strengths, especially that dense, well-lit European foliage.

And either case needs a lucky roll of the dice anyway. See: Cyberpunk 2077 in utter dev hell (even if they eventually pulled out of it) from wrangling their engine. Or the latest Borderlands being a technical wreck even though they basically invented Unreal Engine alongside Epic.


AAA expectations are astronomical, AAs take some extra time to keep up, and indies that actually make it take the time to do their own thing; otherwise they’re almost certainly part of the vast, unseen sea of failed indies.

Also, oldschool game dev was toxic. It had some serious crunch culture, just to start. But I think it also attracted talented devs into “sweet spot” dev team sizes; not too big or too small.

And now, if you do software and want to make any money or provide for a family… well, you don’t do game dev. And that phenomenon has gotten worse and worse.



Your processor is fully configurable. Set it to whatever TDP you want; the lower you go, the more efficient it gets.

This is why AMD’s X3D chips are perceived as efficient. They aren’t actually that much more power efficient, but they’re configured to stay out of the absolutely crazy voltage/clock zones most processors boost to. Cap a regular AM5 chip to the same power level, and its pure task efficiency wouldn’t be too far off.
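
If you want to play with this yourself on Linux, here’s a rough sketch of the idea (mine, not from any particular guide): it pokes the kernel’s powercap/RAPL sysfs node to cap the long-term package power limit. The path below is the Intel RAPL node; on AM5 the equivalent knob (PPT) usually lives in BIOS or vendor tools like ryzenadj, so treat the specifics as an assumption rather than gospel.

```python
# Sketch: cap a CPU's long-term package power limit via the Linux powercap/RAPL
# sysfs interface. Assumes the intel-rapl powercap node exists; writing requires root.
import pathlib

RAPL_PKG = pathlib.Path("/sys/class/powercap/intel-rapl:0")  # package 0

def read_long_term_limit_watts() -> float:
    """Read the long-term (constraint 0) power limit, converted from microwatts."""
    return int((RAPL_PKG / "constraint_0_power_limit_uw").read_text()) / 1_000_000

def set_long_term_limit_watts(watts: float) -> None:
    """Write the long-term power limit in microwatts. Needs root."""
    (RAPL_PKG / "constraint_0_power_limit_uw").write_text(str(int(watts * 1_000_000)))

if __name__ == "__main__":
    print(f"Current long-term package limit: {read_long_term_limit_watts():.0f} W")
    # Uncomment to cap the package at 65 W and watch efficiency, not performance, change:
    # set_long_term_limit_watts(65)
```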


DO NOT TRUST APPS.

iOS is more strict about “reining in” apps that burn battery tracking/spamming you in the background, but (last time I used it) Android is more unruly. Restrict permissions and backgrounding for everything unless it’s absolutely essential.
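
For what it’s worth, here’s roughly how I’d do that restriction in bulk over adb instead of tapping through settings. This is just an illustration: the package names are placeholders, and whether app-ops and standby buckets behave this way depends on the Android version and OEM.

```python
# Sketch: deny background execution and demote suspect apps via adb.
import subprocess

# Hypothetical package names; substitute whatever battery hogs you actually have.
SUSPECT_PACKAGES = ["com.example.socialapp", "com.example.shoppingapp"]

def adb_shell(*args: str) -> None:
    """Run a shell command on the connected device through adb."""
    subprocess.run(["adb", "shell", *args], check=True)

for pkg in SUSPECT_PACKAGES:
    # Deny background execution via app-ops (reversible by setting it back to "allow").
    adb_shell("cmd", "appops", "set", pkg, "RUN_ANY_IN_BACKGROUND", "ignore")
    # Push the app into the "restricted" standby bucket so its jobs/alarms get deferred.
    adb_shell("am", "set-standby-bucket", pkg, "restricted")
```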


I’d disagree on Firefox. It’s just so much slower than Chromium on Android, and burns more power.

Of course don’t use Google Chrome, but look into forks like Cromite.


Shame they didn’t go Intel. Arc is good, and they could have gotten around TSMC supply constraints.


…I actually wouldn’t be against this.

But it isn’t even genuine. They’ve forked llama.cpp into a broken clone like about 500 other corporations, instead of just contributing to shit that actually works and is used, and… that’s about it.

That’s about par for the AI industry.


Who fucking cares?

Credit card companies.

And their ad buyers, maybe.



Ah. Then I don’t know.

Maybe a bigger dev just has more leverage?


Neither was particularly sexually explicit IIRC.

I dunno, though. Maybe censorship was an “easier” ask since the setting was already there.


To be fair, there would be a loooot of nipples on Twitch otherwise. It’d be a nipple sea.

Still stupid, though.


In other news you all will be thrilled to hear, I finally switched my KDE desktop to Nvidia.

For years, I ran it off my AMD CPU’s graphics, and completely disabled Nvidia display out. It was just less trouble. But I undid all that yesterday, and… stuff just works, as far as I can tell. It even fixed an HDR issue I was having, and KDE’s VRAM usage isn’t so egregious anymore.



It’s one of those “reporting on social media without actually adding anything” articles, but in this case it’s pretty cool.


It’s all just show anyway. All the Nvidia chip restrictions did was teach Chinese devs to do more with less, and now they’re running circles around other labs that have 100X the hardware. They don’t need the H200s.

You ask me? If the US wants to seed AI development: restrict Nvidia GPU sales in the US. It’d force labs to get smarter with less, and branch out to more diverse hardware, instead of monopolizing and scaling up.


Imagine where Epic would be if they had just censored Tim Sweeney’s Twitter account.

It’s like he’s hell bent on driving people away from Epic. I’m not sure I could be more abrasive if I tried, without losing the plausible deniability of not trying to troll.



I mean… It functioned as a CPU.

But a Phenom II X6 outperformed it sometimes, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor count advantage. Power consumption was awful in any form factor.

Look. I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:

It was bad.

Objectively bad, a few software niches aside. Between cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.


Even the 6-core Phenom IIs from 2010 were great value.

But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.


AMD almost always had the better price/performance

Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.


IDK, lots of the TW3 sidequests seem to be very good… from my time watching them on YouTube, heh.


It might be a “watch the cutscenes and scenery highlights in a YT playthrough” game.


I like this take.

It seems like there are ‘victims’ caught up in the hype and sinking way too much money into SC. But if the gameplay is enjoyable, and fits your budget? Enjoy it. Hell yes.


Heh, that’s correct.

This meme video about sums it up:

https://youtu.be/n42JQr_p8Ao

The answer is “you play at release and buy them over time, like a crab in slowly boiling water,” though the absolutely incredible rate they introduce bugs into the games kinda knocks you out of the habit.


Whoa, this sounds like drama. Though over what, I don’t know. I didn’t know CDPR’s co-founder left a while ago.

What’s going on there at CDPR, and why would GoG want to get out from under them?


Some “DLC happy” games seem to work in niches while mostly avoiding the micro-transaction trap. I’m thinking of Frontier’s “Planet” games, or some of Paradox’s stuff.

I’m confused at some games not taking the DLC happy route, TBH. 2077, for instance, feels like it’s finally fixed up, and they could make a killing selling side quests smaller in scope than the one they have.


CUDA is actually pretty cool, especially in the early days when there was nothing like it. And Intel/AMD attempts at alternatives have been as mixed as their corporate dysfunction.

And Nvidia has long had a focus on other spaces, like VR, AR, dataset generation, robotics, “virtual worlds” and such. If every single LLM thing disappeared overnight in a puff of smoke, they’d be fine; a lot of their efforts would transition to other spaces.

Not that I’m an apologist for them being total jerks, but I don’t want to act like CUDA isn’t useful, either.


Yeah, I mean, you are preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.


That being said, much of what you describe is just software obstinacy. AMD (for example) has had hardware encoding since early 2012, with the 7970. Intel Quick Sync has long been standard on laptops. It’s just a few stupid proprietary bits of software that never bothered to support it.

CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender’s engines. But there’s no reason (say) Plex shouldn’t support AMD, or older editing programs that use OpenGL anyway.
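
To make the point concrete, here’s a minimal sketch (my own, assuming an ffmpeg build with VAAPI support and a GPU render node at /dev/dri/renderD128) of the kind of hardware encode that has been possible on AMD and Intel for ages:

```python
# Sketch: one-shot H.264 hardware encode through VAAPI, driven from Python.
import subprocess

def vaapi_encode(src: str, dst: str, device: str = "/dev/dri/renderD128") -> None:
    """Transcode src to H.264 using the GPU's VAAPI encoder (AMD VCE/VCN or Intel Quick Sync)."""
    subprocess.run([
        "ffmpeg",
        "-vaapi_device", device,        # open the VAAPI render node
        "-i", src,
        "-vf", "format=nv12,hwupload",  # convert and upload frames to the GPU
        "-c:v", "h264_vaapi",           # hardware H.264 encoder
        dst,
    ], check=True)

vaapi_encode("input.mkv", "output.mp4")
```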


Well, same for me TBH. I’d buy a huge Battlemage card before you could blink, but the current cards aren’t really faster than an ancient 3090.

But most people I see online want something more affordable, right in the 9000 series or Arc range, not a 384-bit GPU.


Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update

It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

Alas, for whatever reason they aren’t even picking existing alternatives to Nvidia, so Nvidia suddenly becoming unavailable would be a huge event.


AFAIK, the H100 and up (Nvidia’s bestselling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing all those AI datacenters for cloud gaming farms.