
Why not both? One section for owners, one for pre-purchasers (maybe it has to be in their wishlist?)

It would give owners a clean space, allow pre-purchasers to ask questions, but rob trolls of attention (which is the most important thing).


That’s the issue, isn’t it?

I see this on the internet a lot. People posit things like “wouldn’t it be awesome if these fired devs got together” or “why don’t they make good stuff anymore? Wouldn’t it be great if someone made a thing like this old beloved thing…”

…Except it’s already happening. Or happened.

And there’s just so much noise on the internet, it’s largely unknown to the folks who’d be interested.

To be clear, I’m not blaming OP, and I’ve done the exact same thing myself. But I still find it kind of… sad.


Anyway, thanks, I am bookmarking Exodus and Archetype Entertainment now.


This makes me think of the Ellisons buying Star Trek and Avatar. Why wouldn’t they shutter or castrate two notoriously ‘woke’, expensive, questionably profitable franchises?

Same here :(. Though to be fair, the Saudis’ political leanings aren’t a perfect parallel.


Yeah. There’s domestic pressure for this anyway, unfortunately.


Gundam

The physics of the mechs (from my very sparse knowledge of Gundam) are pretty questionable, lol, which is fine because they’re there to be spectacular.


Hard, relativistic STL sci fi can still get super weird, see: https://www.orionsarm.com/eg-article/48545a0f6352a
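The time dilation behind that weirdness is easy to sanity-check yourself. A minimal sketch, with purely illustrative trip parameters:

```python
import math

def trip_times(distance_ly: float, speed_c: float) -> tuple[float, float]:
    """Return (earth_years, ship_years) for a constant-speed trip.

    speed_c is a fraction of lightspeed; acceleration is ignored for simplicity.
    """
    earth_years = distance_ly / speed_c          # time in Earth's frame
    gamma = 1.0 / math.sqrt(1.0 - speed_c ** 2)  # Lorentz factor
    return earth_years, earth_years / gamma      # proper (shipboard) time

# At 0.99c, a 100-light-year trip takes ~101 years on Earth but only ~14 aboard.
print(trip_times(100, 0.99))
```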


To add to this, Jason Schreier is a well-known, and well-sourced, gaming journalist.

https://en.wikipedia.org/wiki/Jason_Schreier

But you aren’t wrong. There’s no way to know that via Bluesky unless you happen to have read his stuff from Bloomberg and before. It’s almost like Twitter is a terrible format for news or something…


It would be interesting if EA pulled away from lootboxes for their owners’ ideological reasons.


RockPaperShotgun is my go-to, but I also tend to use ‘sorting’ features in stores and stuff.

For instance, on Steam, you can filter by tags you like, like ‘co-op’ or ‘base building’ or whatever, then sort by review score to float the best to the top. And sometimes there are external sites like GamePasta (for Game Pass) with similar features for other platforms:

https://store.steampowered.com/search/
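If you’d rather script that search, the filters are all plain URL query params. A rough sketch (the tag ID below is a placeholder; grab real IDs from the URL Steam builds when you click a tag):

```python
import requests

# Steam's storefront search takes its filters as query params. The tag ID
# here is a placeholder; copy real ones from the URL after clicking a tag.
params = {
    "tags": 1685,                # placeholder tag ID (e.g. something like co-op)
    "sort_by": "Reviews_DESC",   # float the best-reviewed games to the top
}
resp = requests.get("https://store.steampowered.com/search/", params=params)
print(resp.status_code)          # same HTML results page the browser gets
```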

I may have better advice if there’s a certain ‘type’ of game you like. For instance, do you prefer co-op with mates or an SO or something? Do you like RTSes or sandbox games or what?


You literally named a bunch of old games that absolutely have modern alternatives: from indie ‘retro’ RTS games, to under-the-radar RPGs/shooters with a Mass Effect (or, more dramatic, MGS) feel, to great and original puzzle games in the vein of Portal. Have you ever played The Talos Principle or Antichamber, for instance?

Discoverability is a huge issue, because there are so many games. AAAs do skew towards generic MTX junk, but the other side of that is their marketing sucks up finite attention.


I’m sorry, but gamers are so entitled.

We’re flooded with an incredible back catalog and a sea of gems, yet the sentiment that “small devs are fine” is totally ignorant of how, per the article, the vast majority of these small devs barely make ends meet on genuinely good passion projects.

Or they generalize that all games are junk because they haven’t made even a bare-minimum attempt to shop around the sea of excellently organized stores and review sites/databases the industry has. It’s like they expect absolute perfection in a personal TikTok/YouTube feed directed at them, then turn around and complain about paying a few bucks for an indie after dropping $600 on a GPU.


…There really are too many games, because there are so many passion projects now, and that’s… fine. It’s a lot better than the current cinema situation, for example, where indie makers are getting squeezed so hard.

But I still don’t like the entitled culture that hurts the discoverability of these smaller games and feeds the AAA slop conveyor belts.


Not really.

It may be “feel good nice” if you make a few bucks and a few hundred good reviews on a passion project, but it’s not enough to help you eat and pay rent.

And making a game is a pretty massive time sink. Not to belittle other artists, but the bare minimum time/financial investment for one game is higher than for, say, a digital art portfolio or an album.


I hate to be rude, but there are literally thousands of great games cheaply accessible to you.

They aren’t gonna be spoon-fed to your eyeballs; you have to shop and dig.


I see a lot of folks trying to blame this on Unreal, but that makes no sense when other Unreal games run smoothly for their visual fidelity, and Gearbox has worked with Unreal for literally forever.

This is all on Gearbox, and their CEO/devs throwing gas on the fire via Twitter.

It’s honestly insane. There is clearly internal dysfunction at Gearbox, yet their CEO and leads are allowed to damage their brand to their hearts’ content with… no repercussions? WTF is Embracer (their parent) even doing to miss that?



but it has to involve some scaling down of both the size and cost of these projects and also player expectations, who are always demanding more, more, more, and anything less, from visual fidelity to playtime to map size, is viewed as an inexcusable downgrade, especially for something like a sequel, which is most of what the industry produces now. Something has to give, and after a lot of bending, we are on the verge of this whole thing breaking.

I kind of hate that line.

It might be true with how ridiculous some gamers’ expectations seem to be, but I have to wonder if it’s like a ‘Twitter mirage’.

Are people looking at, like, KCD II and BG3 or even older stuff like GTA V and thinking ‘man, if the graphics aren’t better and the world isn’t even bigger, the next game is going to suck!’


I’m not… I mean, even if I adored a particular title, I’d be happy to buy expansions instead of a $300M+ gigagame, and perfectly happy with cheaper graphics as long as it looks cool.

And that’s different than jank. As a good example, Starfield technically looked high fidelity and expensive, yet felt clunky and ugly.


Heh, Hades II isn’t small at all. The bar for a “big” project has just risen to a kind of ridiculous, mostly unsustainable level.

IMO it’s a sweet-spot size: small enough to survive in a niche genre, small enough for development to not run off the rails, yet still big enough to have a big budget and feel like a huge game.


Yeah.

The ‘dreamer’ part of me pictures this as enabling solo devs with masterpieces in their heads to finally make the game they want. Or small studios to undermine AAAs even more.


…But grifters gonna grift.

And apparently people buying slop can’t help themselves, whether the slop is AI or not.


Gearbox has developed on Unreal Engine since 2005. They have ~1,300 employees.

I’m sorry, I know game dev is hard. But if small, new studios can get it to work, Gearbox should get it to fly. They have no excuse.


Honestly, Cyberpunk’s raytracing runs like poo for how good it looks compared to Lumen (or KCD2’s CryEngine). I don’t like any of the RT effects but RT reflections; both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting, yet doesn’t replace it if you mod it out.

Most of Cyberpunk’s prettiness comes from good old rasterization, more than most people realize.

PTGI looks incredible, but it’s basically only usable with mods and a 4090+.


Trying to run Borderlands at 4K sounds about as stupid to me as…

On the contrary, it should be perfectly runnable at 4K because it’s a 2025 FPS game and the cel-shaded graphics should be easy to render.

‘Unreal Engine’ is no excuse either. Look at Satisfactory, rendering literally thousands of dynamic machines with Lumen on a shoestring budget, like butter, on 2020 GPUs, and tell me that’s a sluggish engine.

This is on Gearbox, who’ve developed on Unreal for 2 decades. And ‘sorry, we’ll work on it’ would have been a fine response…


On the contrary, custom engines have been bombing.

Look at Starfield or Cyberpunk 2077 or basically any custom engine AAA. Look at what happened to things like ME Andromeda.

…Then look at KCD2. It looks freaking fantastic, looks like raytracing with no raytracing, runs like butter, and it’s CryEngine.

Look at something like Satisfactory, rendering tons of stuff on a shoestring budget and still looking fantastic thanks to Unreal’s Lumen.


There’s a reason the next Cyberpunk is going to be Unreal, and it’s because building a custom engine just for your game is too big an undertaking. Best to put that same budget into optimizing a ‘communal’ engine, plus polish, bugfixing and such.

Borderlands 4 is slow because of botched optimization, not because it’s Unreal.



And any divergence from that is “ruining games” or “being woke” to the point that we don’t even GET those games outside of the rare case of a game nobody cared about becoming popular

I would argue the origin is sales. E.g., the publisher wants the sex appeal to sell, so that’s what they put in the game. Early ‘bro’ devs may be a part of this, but the directive from up top is the crux of it.

And that got so normalized, it became what gamers expect. And now they whine like toddlers when anyone tries to change it, but that just happens to be an existing problem conservative movements jumped on after the fact.


TL;DR the root cause is billionaires.

Like always.


What’s interesting is that they filed this in the US.

Is there a reason they don’t file patents in Japan instead?



It’s not unexpected. Paradox’s business model is basically being DLC-happy.


Well, it’s no mystery:

https://www.jonpeddie.com/news/q225-pc-graphics-add-in-board-shipments-increased-27-0-from-last-quarter/

It’s specifically desktop add-in boards:

AMD’s RX 9070 XT and RX 9070 represent AMD’s new RDNA 4 architecture, competing with Nvidia’s midrange offerings. Nvidia introduced two new Blackwell-series AIBs: the GeForce RTX 5080 Super and the RTX 5070. The company also announced the RTX 500 workstation AIB. Rumors have persisted about two new AIBs from Intel, including a dual-GPU model.

It does include workstation cards like the Blackwell Pro, but it’s clearly not including server silicon like the B200, H200, MI325X and so on, otherwise they would have mentioned those updates. They are not AIBs.

I hate to obsess over such a distinction, but it’s important: server sales are not skewing this data, and workstation sales volumes are pretty low. It’s probably an accurate chart for gaming GPUs.


I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper specialized to what they do. Nvidia does sell some repurposed gaming silicon as a ‘low end’ PCIe server card, but these don’t get a ton of use compared to the big silicon sales.


Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high-VRAM cards (especially 48GB+) are in high demand.

But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this, which is almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

The bill of materials is really only like $100-$200 more, at most. Nvidia can get away with this because everyone is clamoring for their top-end cards.
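Some back-of-envelope numbers, where every figure is a rough assumption on my part rather than a real quote:

```python
# Rough markup math; all figures are ballpark assumptions, not real prices.
vram_cost_per_gb = 4.0      # assumed $/GB for GDDR-class memory
extra_vram_gb = 64          # e.g. a 32GB card bumped to a 96GB 'Pro' variant
street_price_gap = 6000.0   # assumed gap between the consumer and Pro card

bom_delta = vram_cost_per_gb * extra_vram_gb
print(f"Extra VRAM BOM: ~${bom_delta:,.0f}")                        # ~$256
print(f"Markup beyond BOM: ~${street_price_gap - bom_delta:,.0f}")  # ~$5,744
```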


AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one’s buying them for CAD. No one’s buying them for compute, for sure… And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say ‘you’re allowed to sell as much VRAM on a 7900 or 9000 series as you can make fit.’ They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

…Yet they don’t.

It makes no sense. They’re maintaining an anticompetitive VRAM ‘cartel’ with Nvidia instead of trying to compete.

Intel has more of an excuse here, as they literally don’t manufacture a GPU that can take more than 24GB of VRAM, but AMD has no excuse I can think of.


I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.


Who the fuck buys a consumer GPU for AI?

Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

I can (just barely) run GLM-4.5 on a single 3090 desktop.
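For the curious, here’s roughly what that looks like with the llama-cpp-python bindings; the model file and layer split are placeholders you’d tune to your own VRAM:

```python
from llama_cpp import Llama

# Partial GPU offload: put what fits in 24GB on the GPU, leave the rest
# (e.g. the MoE expert weights) in system RAM. Placeholders throughout.
llm = Llama(
    model_path="glm-4.5-q4_k_m.gguf",  # hypothetical quantized GGUF file
    n_gpu_layers=30,                   # tune to your VRAM; -1 = offload all
    n_ctx=8192,
)
out = llm("Explain MoE offloading in one sentence:", max_tokens=64)
print(out["choices"][0]["text"])
```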


We will see. 'Cause for all Civ 6’s (and every Civ’s) controversy, this feels different… It’s like people have forgotten about 7.


No shame in that; AMD and Nvidia have traded the ‘optimal buy’ spot forever. There were times when buying AMD was not the best idea, like when the Nvidia 900/1000 series was amazing while AMD’s Vega was very expensive.

Other times, it wasn’t obvious at the time. The old AMD 7000 series was pricey at launch, for instance, but aged ridiculously well. A 7950 would still function alright these days.

This market’s such a caricature now, though. AMD/Intel are offering obviously great values, yet being overlooked through pure ignorance; I can’t remember things ever being like this, not all the way back to Nvidia Fermi at least.


7900xtx is 24 gb, the 9700 pro has 32 gb as far as high end consumer/prosumer goes.

The AI Pro isn’t even available! And 32GB is not enough anyway.

I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they’re working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48GB for a small markup (instead of $4000), AMD would have grassroots support everywhere, because that’s what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.

I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for ‘only’ 24GB, plus all the fuss.

There are lingering bits of AMD support everywhere: Vulkan backends in popular projects, unfixed ROCm bugs, stuff that works but isn’t optimized yet. The problem is AMD isn’t making it worth anyone’s while to maintain them when devs can (and do) just use 3090s or whatever.


They kind of took a baby step in this direction with the AI 395 (effectively a 110GB-VRAM APU, albeit very compute-light compared to a 7900/9700), but it’s still $2K, effectively mini-PC only, and kinda too little, too late.