• 0 Posts
  • 88 Comments
Joined 1Y ago
Cake day: Mar 22, 2024


Vincke says the team finds DLC boring to make, so they don’t really want to make it anymore.

I find this driveby comment rather significant.

It means they are trying to conform to the developers’ strengths, desires, and interests. They’re shaping huge business decisions around them. That’s just good for everyone, as opposed to devs inefficiently and dispassionately grinding away at something they don’t like.

That’s huge. I’d also posit “happy devs means happy business.” And Larian has repeatedly expressed similar things.



Yeah, I read some, and I am worried about it being a little barebones. Runescape nostalgia is a huge draw though.



They seem to love writing cities and fantasy-tech too, going by some of the stuff in BG3.

Looks like Shadowrun’s licensing is a complicated mess though, with Microsoft at least involved, so I guess it’s unlikely :(


Oh man, imagine if they did a Shadowrun game. Take their fantasy credentials/writing and mix it with cyberpunk…


Awesome!

I wonder if things will organize around an “unofficial” modding API like Harmony for Rimworld, Forge for Minecraft, SMAPI for Stardew Valley, and so on? I guess it depends on whether some hero dev team builds one and there’s enough “demand” to build a bunch of stuff on it. But a “final” patch (so future random patches don’t break the community API) and community enthusiasm from Larian are really good omens.

Skyrim and some other games stayed more fragmented, while others like CP2077 just never hit critical mass, I guess. And the format of the modded content just isn’t the same for mega RPGs like this.


How is the modding scene these days? Seems like there’s a lot in the patch addressing that, but are things still more aesthetic?


Yeah you are correct, I was venting lol.

Another factor is that the fab choices were made way before the GPUs launched, when everything you said (TSMC’s lead/reliability, in particular) rang more true. Maybe Samsung or Intel could offer steep discounts to offset the lower performance (hence Nvidia/AMD could translate that into bigger dies), but that’s quite a fantasy I’m sure…

It all just sucks now.


Heh, especially for this generation I suppose. Even the Arc B580 is on TSMC and overpriced/OOS everywhere.

It’s kinda their own stupid fault too. They could’ve used Samsung or Intel, and a bigger, slower die for each SKU, but didn’t.


The chips going to datacenters could have been consumer stuff instead.

This is true, but again, they do use different processes. The B100 (and I think the 5090) is TSMC 4NP, while the other chips use a lesser process. Hopper (the H100) was TSMC 4N, Ada Lovelace (RTX 4000) was TSMC N4. The 3000 series/A100 was straight up split between Samsung and TSMC. The AMD 7000 was a mix of older N5/N6 due to the MCM design.

Local AI benefits from platforms with unified memory that can be expanded.

This is tricky because expandable memory is at odds with bandwidth and power efficiency. Framework (ostensibly) had to use soldered memory for their Strix Halo box because it’s literally the only way to make the traces good enough: SO-DIMMs are absolutely not fast enough, and even LPCAMM apparently isn’t there yet.

AMD’s Ryzen AI MAX 300 chip

Funny thing is the community is quite lukewarm on the AMD APUs due to poor software support. It works okay… if you’re a Python dev who can spend hours screwing with ROCm to get things fast :/ But it’s quite slow/underutilized if you just run popular frameworks like ollama or the older diffusion ones.
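
For what it’s worth, a quick way to see whether that APU path is actually working (just a minimal sketch, assuming a ROCm build of PyTorch, where the GPU still shows up through the “cuda” API):

```python
import torch  # assumes a ROCm-enabled PyTorch build

if torch.cuda.is_available():
    # On ROCm builds the APU/GPU is exposed through the CUDA-style API
    print("GPU visible:", torch.cuda.get_device_name(0))
    x = torch.randn(4096, 4096, device="cuda")
    y = x @ x  # sanity check that kernels actually run on the GPU, not a CPU fallback
    print("matmul ok:", tuple(y.shape))
else:
    print("Falling back to CPU -- ROCm isn't picking up this device")
```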

It’s the main reason why I believe Apple’s memory upgrades cost a ton so that it isn’t a viable option financially for local AI applications.

Nah, Apple’s been gouging memory way before AI was a thing. It’s their thing, and honestly it kinda backfired because it made them so unaffordable for AI.

Also, Apple’s stuff is actually… Not great for AI anyway. The M-chips have relatively poor software support (no pytorch, MLX is barebones, leaving you stranded with GGML mostly). They don’t have much compute compared to a GPU or even an AMD APU, the NPU part is useless. Unified memory doesn’t help at all, it’s just that their stuff happens to have a ton of memory hanging off the GPU, which is useful.
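
To be concrete, “stranded with GGML” in practice looks something like this (just a sketch, assuming llama-cpp-python installed with Metal support; model.gguf is a hypothetical local quantized model file):

```python
from llama_cpp import Llama  # assumes llama-cpp-python built with Metal support

# "model.gguf" is a hypothetical path to a local quantized model
llm = Llama(model_path="model.gguf", n_gpu_layers=-1)  # -1 offloads all layers to the GPU
out = llm("Explain unified memory in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```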


Unfortunately, no one is buying a 7900 XTX for AI, and mostly not a 5090 either. The 5090 didn’t even work until recently and still doesn’t with many projects, doubly so for the 7900 XTX.

The fab capacity thing is an issue, but not as much as you’d think since the process nodes are different.

Again, I am trying to emphasize, a lot of this is just Nvidia being greedy as shit. They are skimping on VRAM/busses and gouging gamers because they can.


The Nvidia GPUs in data centers are separate from gaming GPUs (they’re even on different nodes, with different memory chips). The sole exception is the 4090/5090, which do see some use in data center form, but at low volumes. And this problem is pretty much nonexistent for AMD.

…No, it’s just straight up price gouging and anti-competitiveness. It’s just Nvidia being Nvidia, AMD being anti-competitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction, even though Battlemage is excellent.

For local AI, the only thing that gets sucked up are 3060s, 3090s, and for the rich/desperate, 4090s/5090s, with anything else being a waste of money with too little VRAM. And this is a pretty small niche.


Misleading headline.

Emphasis mine:

Musk died to one of the game’s tutorial bosses due to a bad connection

Not due to being bad at the game, as the title implies. It doesn’t refute the core “message” of the facts, but I despise clickbait headlines.


Some people spend a lot of time, money in mobile games.

Occam’s Razor: I think it’s just the “default device” that’s already in front of their eyes, so it’s what most people choose?


Yeah, you and @[email protected] have a point.

I am vastly oversimplifying a lot, but… Perhaps mobile gaming, on aggregate, is too shitty for its own good? It really looks that way whenever I sample the popular ones.


live service games make up a significant amount of what the average consumer wants, and those customers largely play on PC for all sorts of reasons

You are leaving out the elephant in the room: smartphones.

So, so, so many people game on smartphones. It’s technically the majority of the “gaming” market, especially for live service games. A large segment of the population doesn’t even use PCs and does the majority of their computer stuff on smartphones or tablets, and that fraction seems to be getting bigger. Point being, the future of the Windows PC market isn’t guaranteed.


Hear me out:

  • 6 core CCD. Clocked real slow, but with 3D cache like the 5600x3d.

  • The slightly cut 32 CU GPU. Clocked real slow.

  • 32GB of that LPDDR5X. 24GB, if the config is possible?

  • …OLED? I feel like there’s a much better selection of tablet screens to borrow now. If not, use whatever SKU the switch does.

I can dream, can’t I? But modern laptop GPUs/CPUs are absurdly efficient if you underclock them a little.


Heh, Ampere is 2020. The Steam Deck’s Van Gogh RDNA2 chip is largely newer.

If Valve ever stuffs Strix Halo into a more premium Steam Deck, it would be like an entire console generation jump.


Yeah, 45% tariffs on Japan and 32% on Taiwan are going to make pricing brutal in the US. I wouldn’t be surprised if the $450 (and $80 for Mario Kart) effectively goes up.

Was there anything about the SoC it uses? What architecture is it?


It’s supposed to be immersive, I think, so as not to force a voice that doesn’t match the roleplaying in your head.

I’m with you, though, I’d much prefer VA.


FemV in CP2077 totally killed it. Her voice acting was one of my favorite parts of the game.

AC Odyssey didn’t have as many emotional beats, but Kassandra was still way better than her brother.

And of course Jennifer Hale as FemShep… I’m starting to see a pattern here, lol.


The problem is the way they’re pushing the tools as magic lamps, and shoving them down everyone’s throats.

AI is a really neat tool that got dragged into an incredibly toxic system that enshittified it. Not a useful tool to help development, no, skip straight to replacing employees even if it doesn’t freaking work.


I mean, Starfield should have been my dream game, but after that…

I dunno. All my enthusiasm for BGS has been sapped, and I adored Oblivion.



Funny thing is SWTOR has some great art, heartfelt voice acting and quests, a great soundtrack and such, but at the end of the day it’s all buried in grind.

On the other hand, I tried Fallout 76 (after it was patched up) drunk with friends, and it was boring as heck. The quests were so dull, gameplay so arbitrarily janky and grindy. Drunk! With friends! Do you know how low a bar that is :/


I’m a sucker for wandering lush bucolic landscapes though.

You should play KC Deliverance 2 if you haven’t. Its forests and rural villages are freaking gorgeous, especially for how “easy” it is to run.


Funny thing about AMD is the MI300X is supposedly not selling well, largely because they price gouge everything as badly as Nvidia, even where they aren’t competitive. Other than the Framework desktop, they are desperate to stay as uncompetitive in the GPU space as they possibly can, and not because the hardware is bad.

Wasn’t the Intel B580 a good launch, though? It seems to have gotten rave reviews, and it’s in stock, yet has exited the hype cycle.


It’s polished and undoubtedly one of the best games of all time.

My only gripe is that I find the pause-based combat lengthy, though not bad.



When do you think that stopped though?

There’s a lot of love for Skyrim, but I feel like there was already deterioration in the side quest writing, even strictly looking at Oblivion/FO3, not Morrowind.

As for BioWare, even ME3 was starting to show some cracks, even if you set the ending aside. And I loved Mass Effect to death. Heck, I’m even a bigger Andromeda fan than most.

…Point being I think we clung to BioWare/Bethesda a little too hard even when the signs of decline were there.


People like to write off CP2077, which is such a shame.

…And maybe this makes me a black sheep, but I bounced off Witcher 2/3? I dunno, I just didn’t like the combat and lore, and ended up watching some of the interesting quests on YouTube.


People understandably love to hate Oblivion and Fallout 3, but I feel the side quest writing had heart, like groups of devs got to go wild within their own little dungeons. Their exploitable mechanics were kinda endearing.

…And I didn’t get that from Starfield? I really tried to overlook the nostalgia factor, but all the writing felt… corporate. Gameplay, animation, Bethesda jank without any of the fun. I abandoned it early and tried to see what I was missing on YouTube, but still don’t “get” what people see in that game.

If you want a big walking sandbox in that vein, I feel like No Man’s Sky would scratch the itch far better, no?

Meanwhile, BG3 and KC2 completely floored me. So did Cyberpunk 2077, though I only experienced it patched up and modded. Heck, even ME Andromeda felt more compelling to me.



I found a combat mod completely changed the game for me. By making combat brutally damaging instead of bullet-spongy, and by deleveling it, it simplifies all that crap away. Perks and guns are for play styles, and it lets you enjoy the game instead of worrying about them.


Ace Combat Zero! Or 4, 5, and Zero (in that order) if you want a lot of it.


It doesn’t seem that big, right?

If WW was stuck in development hell, cutting Monolith makes sense, I guess. PFG only did MultiVersus, and WB SF seemed to only work on/support mobile games, with no recent credits.




It’s probably not big if it’s included in the driver download and run in real-time so quickly. Not big enough to worry about anyway.