• 0 Posts
  • 359 Comments
Joined 2Y ago
Cake day: Mar 22, 2024


I mean… It functioned as a CPU.

But a Phenom II X6 outperformed it sometimes, single-threaded and multithreaded. That’s crazy given Piledriver’s two-generation jump and huge process/transistor count advantage. Power consumption was awful in any form factor.

Look. I am an AMD simp. I will praise my 7800X3D all day. But there were a whole bunch of internet apologists for Bulldozer back then, so I don’t want to mince words:

It was bad.

Objectively bad, a few software niches aside. Between cheaper Phenoms and the reasonably priced 2500K/4670K, it made zero financial sense 99% of the time.


Even the 6-core Phenom IIs from 2010 were great value.

But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.


> AMD almost always had the better price/performance

Except anything Bulldozer-derived, heh. Those were more expensive and less performant than the Phenom II CPUs and Llano APUs.


IDK, lots of the TW3 sidequests seem to be very good… from my time watching them on YouTube, heh.


It might be a “watch the cutscenes and scenery highlights in a YT playthrough” game.


I like this take.

It seems like there are ‘victims’ caught up in the hype and sinking way too much money into SC. But if the gameplay is enjoyable, and fits your budget? Enjoy it. Hell yes.


Heh, that’s correct.

This meme video about sums it up:

https://youtu.be/n42JQr_p8Ao

The answer is “you play at release and buy them over time, like a crab in slowly boiling water,” though the absolutely incredible rate at which they introduce bugs into the games kinda knocks you out of the habit.


Whoa, this sounds like drama. Though over what, I don’t know. I didn’t know CDPR’s co-founder left a while ago.

What’s going on there at CDPR, and why would GOG want to get out from under them?


Some “DLC happy” games seem to work in niches while mostly avoiding the micro-transaction trap. I’m thinking of Frontier’s “Planet” games, or some of Paradox’s stuff.

I’m confused by some games not taking the DLC-happy route, TBH. 2077, for instance, feels like it’s finally fixed up, and they could make a killing selling side quests smaller in scope than the one they have.


CUDA is actually pretty cool, and it was especially so in the early days when there was nothing like it. And Intel’s and AMD’s attempts at alternatives have been as mixed as their corporate dysfunction.

And Nvidia has long had a focus on other spaces, like VR, AR, dataset generation, robotics, “virtual worlds” and such. If every single LLM thing disappeared overnight in a puff of smoke, they’d be fine; a lot of their efforts would transition to other spaces.

Not that I’m an apologist for them being total jerks, but I don’t want to act like CUDA isn’t useful, either.


Yeah, I mean, you’re preaching to the choir there. I picked up a used 3090 because ROCm on the 7900 was in such a poor state.


That being said, much of what you describe is just software obstinacy. AMD (for example) has had hardware encoding since early 2012, with the 7970. Intel Quick Sync has long been standard on laptops. It’s just a few stupid proprietary bits of software that never bothered to support them.

CUDA is indeed extremely entrenched in some areas, like anything involving PyTorch or Blender’s engines. But there’s no reason (say) Plex shouldn’t support AMD, or older editing programs that use OpenGL anyway.


Well, same for me TBH. I’d buy a huge Battlemage card before you could blink, but the current cards aren’t really faster than an ancient 3090.

But most people I see online want something more affordable, right in the 9000 series or Arc range, not a 384-bit GPU.


Fact is, Nvidia has the vast majority of market share: https://www.techpowerup.com/337775/nvidia-grabs-market-share-amd-loses-ground-and-intel-disappears-in-latest-dgpu-update

It’s inexplicable to me. Why are folks ignoring Battlemage and AMD 9000 series when they’re so good?

Alas, for whatever reason they aren’t even picking existing alternatives to Nvidia, so Nvidia suddenly becoming unavailable would be a huge event.


AFAIK, the H100 and up (Nvidia’s bestselling data center GPUs) can technically game, but they’re missing so many ROPs that they’re really bad at it. There will be no repurposing all those AI datacenters for cloud gaming farms.


Nah, I’m a hardcore Rimworld fan who’s been with it since the forum Alphas, and that’s completely true, heh. IIRC it was explicitly inspired by DF; that was literally part of the description in the early days.

That being said, I’ve never played DF. I’m more into sci-fi and tech than Tolkien-esque fantasy, hence I’ve passed on stuff like Necesse and… what’s that massive city builder? That, and many I’ve forgotten the names of.

Maybe I will give DF a look though. Maybe it has automation/tech mods that would make it more appealing, like Minecraft and Rimworld?


Again, they’re tools. Some of the most useful applications for LLMs I’ve worked on are never even seen by human eyes, like ranking and ingesting documents and filling out JSON in pipelines. Or as automated testers.
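
For what it’s worth, the “filling out JSON in pipelines” part looks roughly like the sketch below. It’s a minimal illustration, assuming a local OpenAI-compatible server (llama.cpp, vLLM, etc.) hosting an open-weights model; the model name and JSON fields are placeholders, not anything from a real pipeline:

```python
# Hypothetical sketch: an LLM as an invisible pipeline worker that ranks a
# document and returns structured JSON. No chat box, no human in the loop.
import json
from openai import OpenAI

# Assumes a local OpenAI-compatible endpoint (e.g. a llama.cpp or vLLM server).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

def rank_and_extract(document: str) -> dict:
    resp = client.chat.completions.create(
        model="local-model",  # placeholder; use whatever the server exposes
        messages=[
            {"role": "system",
             "content": "Return only JSON with keys: title, relevance (0-10), summary."},
            {"role": "user", "content": document},
        ],
        response_format={"type": "json_object"},  # constrain output to valid JSON
        temperature=0,
    )
    return json.loads(resp.choices[0].message.content)
```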

Another is augmented diffusion. You can do crazy things with depth maps, areas, segmentation, mixed with hand sketching to “prompt” diffusion models without a single typed word. Or you can use them for touching up something hand painted, spot by spot.

You just need to put everything you’ve ever seen with ChatGPT and Copilot and the NotebookLM YouTube spam out of your head. Banging text into a box and “prompt engineering” is not AI. Chat-tuned decoder-only LLMs are just one tiny slice that a few Tech Bros turned into a pyramid scheme.


An OpenAI subscription does not count.

Otherwise, yeah… but it helps them less, proportionally. AAAs still have the fundamental issue of targeting huge audiences with bland games. Making them even more gigantic isn’t going to help much.

AAs and below can get closer to that “AAA” feel with their more focused projects.


Then most just won’t go on the Game Awards, and devs will go on using Cursor or whatever they feel comfortable with in their IDE setup.

I’m all against AI slop, but you’re setting an unreasonably absolute standard. It’s like saying “I will never play any game that was developed in proximity to any closed-source software.” That is technically possible, but most people aren’t gonna do that, and it’s basically impossible on a larger team. Give them some slack with the requirement; it’s okay to develop on Windows or on Steam, just open the game’s source.

Similarly, let devs use basic tools. Ban slop from the end product.


> Now my blood boils like everyone else’s when it comes to being forced to use AI at work, or when I hear the AI Voice on Youtube, or the forced AI updates to Windows and VS Code

You don’t hate AI. You hate Big Tech Evangelism. You hate corporate enshittification, AI oligarchs, and the death of the internet being shoved down your throat.

…I think people get way too focused on the tool, and not the awful entities wielding them while conning everyone. They’re the responsible party.

You’re basically using “AI” as a synonym for OpenAI, but that’s not what Joel Haver’s rotoscope filter is at all. That’s niche machine learning.


As for the exponential cost, that’s another con. Sam Altman just wants people to give him money.

Look up what it takes to train (say) Z Image or GLM 4.6. It’s peanuts, and it gets cheaper every month. And eventually everyone will realize this is all a race to the bottom, not the top… but it’s taking a little while :/


Yeah.

Maybe a technicality too. The rule said “no AI,” and E33 used AI.

I get their intent: keep AI slop games out. But in hindsight, making the restriction so absolute was probably unwise.


If we’re banning games over how they make concept art… I’m not sure how you expect to enforce that. How could you possibly audit that?

Are you putting coding tools in this bucket?


Then you’re going to get almost no games.

Or just get devs lying about using Cursor or whatever when they code.

If that’s the culture of the Game Awards, if they have to lie just to get on, that… doesn’t seem healthy.


That’s just not going to happen.

Nearly any game with more than a few people involved is going to have someone using Cursor code completion, or using an AI assistant for reference or something. They could pull in libraries with a little AI code in them, or use an Adobe filter they didn’t realize is technically GenAI, or commission an artist that uses a tiny bit in their workflow.

If the next Game Awards could somehow audit game sources and enforce that, it’d probably be a few solo dev games, and nothing else.

Not that AI Slop should be tolerated. But I’m not sure how it’s supposed to be enforced so strictly.


Oh, yes. Big publishers will try it on a huge scale. They can’t help themselves.

And they’re going to get sloppy results back. If they wanna footgun themselves, it’s their foot to shoot.


Some mid-sized devs may catch this “Tech Bro Syndrome” too, unfortunately.


I think AI is too dumb, and will always be too dumb, to replace good artists.

I think most game studios can’t afford full-time art houses across like 30 countries, nor should they want the kind of development abomination Ubisoft has set up. That’s what I’m referring to when I say “outsourced”: development that has just gotten too big, with too many people and too generic a target market. And yes, too many artists working on one game.

I think game artists should have a more intimate relationship with their studio, like they did with E33.

And it’d be nice for them to have tools to make more art than they do now, so they can make bigger, richer games, quicker, with less stress and less financial risk. And without the enshittification that happens when their studio gets too big.


That’s fair.

But the Game Awards should reconsider that label next year. The connotation is clearly “AI Slop,” and that just doesn’t fit stuff like Cursor code completion, or the few textures E33 used.

Otherwise studios are just going to lie. If they don’t, GA will be completely devoid of bigger projects.

…I don’t know what the threshold for an “AI Slop” game should be, though. It’s clearly not E33. But you don’t want a sloppy, heavily marketed game worming its way in, either.


I understand the principle. Even if E33 is not slop, people should fear a road that leads to dependence on “surveillance state AI” like OpenAI. That’s unacceptable.

That being said, I think a lot of people don’t realize how commoditized it’s getting. “AI” is not a monoculture, it’s not transcending to replace people, and it’s not limited to corporate APIs. This stuff is racing to the bottom to become a set of dumb tools, and dirt cheap. TBH that’s something that makes a lot of sense for a game studio lead to want.

And E33 is clearly not part of the “Tech Bro Evangelism” camp. They made a few textures, with a tool.


More that an existing smaller studio doesn’t have to sell their soul to a publisher (or get lucky) to survive. They can more safely make a “big” game without going AAA.

My observation is that there’s a “sweet spot” for developers somewhere around the Satisfactory (Coffee Stain) size, with E33 at the upper end of that, but that size limits their audience and scope. If they can cut expensive mocap rigs, a bunch of outsourced bulk art, and stuff like that with targeted automation, then so long as they don’t tether themselves to Big Tech AI, that takes away the advantage AAAs have over them.

A few computer generated textures is the first tiny step in that direction.

So no. AI is shit at replacing artists, especially in E33-tier games. But it’s not a bad tool to add to their bucket, so they can do more.


So is the source.

If they’re paying a bunch of money to OpenAI for mega text prompt models, they are indeed part of the slop problem. It will also lead to an art “monoculture,” Big Tech dependence, code problems, all sorts of issues.

Now, if they’re using open-weights models, or open-weights APIs, with a lot of augmentations and niche pipelines like, say, hand sketches to 3D models, that’s different. That’s using tools. That’s giving “AI” the middle finger in a similar way to using the Fediverse, or other open software, instead of Big Tech.



Yeah.

A lot of devs may do it personally, even if it’s not a company imperative (which it shouldn’t be).


And little tools like that give studios like this an edge over AAAs. It’s the start of negating their massive manpower advantage.

In other words, the anti-corpo angle seems well worth the “cost” of a few generations. That’s the whole point of the AI protest, right? It’s really against the corps enshittifying stuff.

And little niche extensions in workflows are how machine learning is supposed to be used, like it was well before it got all the hype.


Seems excessive.

There are AI slop games, the new breed of lazy asset flips. There’s replacing employees with slop machines.

And then there’s “a few of our textures were computer generated.” In a game that is clearly passionately crafted art.

I get it’s about principle, but still.


Y’all got any recs for “AI sandboxes” in the vein of Rimworld and Stellaris? Stellaris is just borked now, and I’ve modded/played the hell out of Rimworld.


Horizon 5 has one heck of a Hot Wheels expansion, if y’all don’t already have it.


They can’t react in 2026 unless they already have something in the pipe, and they don’t (yet) make their own RAM.

…That being said, it could be a good year for big APUs, which use system RAM you already have to have.


Because, to be blunt, a lot of apps seem to get away with it without getting sued. Especially mobile clones.


To be blunt: were you holding it wrong? Was the game on an HDD? Did you tweak the INIs in a dangerous way?

It’s been ages since I played FO4. It was janky for sure. But it seemed to run okay, and load fast, on a toaster compared to what I have now. And I never had a loading screen last close to a minute.

Starfield was a whole nother level, though. It felt like a game trying to look like 2077, but with the engine “feel” of something from 2006.


Do folks enjoy Starfield these days?

Or 76?

I’ve been playing BGS since Oblivion, and my experience was:

  • 76 was boring, even with co-op. That’s saying something. The world was interesting, but the main and side (fetch) quests were the dullest, buggiest things that kept trying to sell us anti-grind stuff; and this was well after launch.

  • Starfield was… well, even more boring. It felt like Fallout 3 with all the jank, 10X the production budget, 100X the graphics requirements (as smooth as a cactus on my 3090), yet somehow, none of the charm. I only played for a bit, but I don’t remember a single character name. Whereas I can still recall little side quests from Oblivion and FO3. Quirks persisted all the way from Oblivion, yet all the fun bugs were patched out. Basically, ME: Andromeda was better in every way.

But, you know, whatever floats people’s boats. I’m curious if these games have grown a following over whatever I was missing.


Setting criticism of BGS aside, Fallout isn’t as “hard” with its lore as some franchises. There are little inconsistencies between the games and other media that are basically written off as “gameplay mechanics things” or simple oversights.

…Hence there will probably be conflicts with the TV show. But that’s fine. It’s nothing earth-shattering for the IP.