• 0 Posts
  • 110 Comments
Joined 3Y ago
Cake day: Jul 14, 2023


Honestly, the difficulty curve on HZD feels like a sine wave sometimes.

Cuz:

  • You do simply get stronger over time
  • You also improve your use of tactics over time
  • You also gain access to new tactics
  • But they introduce much stronger machines too
  • But a lot of a machine’s strength can be mitigated by approaching it with the right tactics
  • But they also deliberately put you in situations where you can’t use the easy tactics
  • But they also put you in situations where you can use the easy tactics, against a ton of very strong machines

So depending on how quickly you hit the skill ceiling on using your available tactics, how much you like to grind, how reliable your multi-tasking is, and your basic “twitch skills”, you might get a skewed perspective at any point along the way.

I don’t think there’s any harm in changing the difficulty back and forth (not sure). So maybe just go with whatever feels comfy at the moment?

Edit: I’ll say, overall, they do an excellent job of making you feel like the stuff that used to be scary is no longer scary, but for the right reasons. That is: not because of dmg++ and armor++ buffs (although there is some of that), but because you’ve become so much more proficient at handling dangerous situations. For the most part, even mid-to-late game, the simple stuff is still deadly if you let your guard down.


Dragon Quest Builders 2, inspired by FOMO over Pokopia.

Also continuing Horizon Zero Dawn for the third time. Just hard mode, but I kinda feel like I could do very hard.



They don’t want to save capitalism. They believe capitalism is about to be over, and they want to be in control of whatever it is that comes next.


They’re worried they’re not spending enough on AI.

Classic MLM tactics. “If you’re not seeing a return on Herbalife, it’s cuz you’re not spending enough on it!”



Ehh, x86 SoC consoles will always have an advantage vs x86 SoC PCs, because PCs need to treat iGPUs as PCIe peripherals rather than co-processors, which has significant performance penalties and a low ceiling due to bottlenecked heat dissipation.

The Steam Frame should get you worried about x86 consoles, because if devs start publishing native ARM builds for desktop then this whole accidental iGPU performance moat goes away.

Buuut it should also get you excited about ARM consoles.


I dig it. Wishlisted.

I have a rare disease where I can only buy Steam games that are on sale, but the next time that happens I’m all over it.


Beautifully put.

I especially like that they called out the “it’s just a tool” BS:

Yet technological artefacts cannot be separated from the conditions under which they are created, or from the realities of who controls and profits from them. Today, developing these technologies expands racial capitalism, intensifies imperialist extraction, and reinforces the divide between the global North and South. The technology is inseparable from the labour that produces it — the expropriation of work by writers, artists, programmers, and peer-production communities, as well as the highly exploitative crowdwork of data annotation.


There is a big difference, and I’d argue the Claude refactoring is worse. Content was already pursuing the common denominator. But open source was a place where you could actually bring some nuance, examine things in detail, and build a shared understanding of deeper truths. But why bother with the icky social factors of working together to build something with people all around the world that can evolve and last for 10+ years, when you can boil a swimming pool to produce a half-baked one-off solution instead?


You absolutely may. Intel is valid too.


Buy a used Optiplex.

Gets you an acceptable case, mobo, CPU, RAM, and probably SSD for about $200. Add a used 3050/3060/4060 GPU and an upgraded PSU.

It’s not gonna knock your socks off, but it gets you going without over-spending and you can carry some of that forward when you upgrade later.


It says it’s multiple studios, which I assume were acqui-hired. So it’s not just “VR developers”, but also UI designers, concept artists, QA, PMs, HR, IT, tech writers, community managers, sales people — maybe even localization, reception, janitors… who knows. The structure of these things can vary wildly.


Trying desperately to keep the ponzi scheme going, but his biggest customers already have warehouses full of GPUs that will never get connected.

The bubble is full, dude. Just try to minimize the damage from the pop so we don’t have to figure out what size pitchfork your dumb leather jacket calls for.


It also discouraged you from finding/starting an open source solution for those problems, thus undermining the high-quality open knowledge ecosystem that it relied on in the first place.


Yeah… Linear increases in performance appear to require exponentially more data, hardware, and energy.

Meanwhile, the big companies are passing around the same $100bn IOU, amortizing GPUs on 6-year schedules but burning them out in months, using those same GPUs as collateral on massive loans, and spending based on an ever-accelerating number of data centers which are not guaranteed to get built or receive sufficient power.


I like the way Ted Chiang puts it:

Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.

There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.

I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.



Not to be confused with SOLID, SolidJS, or Solidity.

It’s a neat idea. Because of the need to operate on data close to web servers and backend services for potentially long timeframes, I think we’ll need a widely-adopted CRDT solution in order for something like Solid to really take off from a technical standpoint.
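To make the CRDT point concrete, here’s a minimal sketch of a last-writer-wins (LWW) register, one of the simplest CRDTs. The class and names are purely illustrative assumptions, not from Solid or any real library; production systems would use proper logical clocks and richer data types.

```python
from dataclasses import dataclass

@dataclass
class LWWRegister:
    """Last-writer-wins register: replicas converge no matter the write order."""
    value: object = None
    timestamp: int = 0
    node_id: str = ""

    def set(self, value, timestamp, node_id):
        # Accept the write only if it is newer; ties are broken by node id,
        # so every replica makes the same decision deterministically.
        if (timestamp, node_id) > (self.timestamp, self.node_id):
            self.value, self.timestamp, self.node_id = value, timestamp, node_id

    def merge(self, other):
        # Merging is just replaying the other replica's winning write.
        self.set(other.value, other.timestamp, other.node_id)

# Two replicas see the same writes in different orders...
a, b = LWWRegister(), LWWRegister()
a.set("draft", 1, "pod-A"); a.set("final", 2, "pod-B")
b.set("final", 2, "pod-B"); b.set("draft", 1, "pod-A")
a.merge(b); b.merge(a)
# ...and still converge to the same value.
assert a.value == b.value == "final"
```

The key property is that `merge` is commutative, associative, and idempotent, which is what would let app servers and a user’s external data store reconcile without coordination.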

And from a business standpoint, there’s really no upside. Sure, you offload some storage costs, but compute tends to be the more expensive part, and if you’re spending extra compute time interacting with these external data stores, it may end up costing more overall.



Got Megabonk working on my Retroid, and can’t stop playing it. I thought I would try getting some other games going, but I just play Megabonk instead.


Opus Magnum

That game scratches my brain in such a satisfying way



Sure, but do you need a discrete video card if you’re gaming on an ARM SoC? And we’ve seen from the struggles of x86 iGPUs that graphics APIs pretty much have to choose whether they’re going to optimize for dedicated VRAM or shared memory, cuz it has inescapable implications for how you structure a game engine. ARM APIs will probably continue optimizing for shared memory, so PCI-E GPUs will always be second-class citizens.


Nvidia does not care about the ISA of the CPU at all.

That’s kinda my point. They’re stuck communicating over PCI-E instead of being a first-class co-processor over AMBA.



  1. Nvidia abandons x86 desktop gamers
  2. The only hardware that gamers own are ARM handhelds
  3. Some gamers stream x86 games, but devs start selling ARM builds since the x86 market is shrinking
  4. AI bubble pops
  5. Nvidia tries to regain x86 desktop gamers
  6. Gamers are almost entirely on ARM
  7. Nvidia pulls an IBM and vanishes quietly into enterprise services and not much else

I’m down, but RISC-V has a looooot of ground to make up first. Last I checked, total number of RISC-V devices in existence was an order of magnitude less than what Qualcomm produces in a year.



You can run Linux on ARM. I do. And let’s not act like x86 wasn’t full of Microsoft-led efforts to undermine Linux. Anyone who’s had to disembowel their BIOS settings to the tune of “Your PC will be unsafe! Are you sure you want to run a LEGACY OS???” is familiar.

I’m not a huge fan of the idea of buying CPU+GPU+RAM+mobo all as one unit. But like… that’s what tends to happen. Audio cards, SATA drives, network cards, these things all used to be separated until motherboards offered features to streamline things.

The real problem is not form factor, but lack of competition. If there were 10-15 Qualcomms out there, offering different combos and a la carte options, there’d be no problem. It’s only because there are a tiny number of dominant players in the space that technical consolidation automatically translates to abusing consumers.


Well… modularity is kinda coming to an end anyway, regardless of supply chain moves. Apple’s M series has shown that op decoders and unified memory are the low-hanging fruit for overall system performance improvements, and that means less modularity.

I think Valve sees the writing on the wall and is trying to get ahead of the game via FEX and the Steam Frame. Intel and AMD are pretty much stuck playing Nvidia’s game at this point, and Qualcomm has an incredible opportunity here. I’m still rooting for RISC-V, and I think it may end up being the long-term winner in like 10-15 years time.

But either way, x86-style modularity is not long for this world. From a purely technical standpoint, I think that’s good. Adding the political and economic situation into the mix… well… fuck, we’re mega-fucked. About the only thing we have going for us as consumers is the fact that this is already headed towards a reset. So if we do gain some leverage, we can make a big change all at once. If we don’t though… things will get much worse.




The seal looks like this:

Code completion is probably a gray area.

Those models generally have much smaller context windows, so the energy concern isn’t quite as extreme.

You could also reasonably make a claim that the model is legally in the clear as far as licensing, if the training data was entirely open source (non-attribution, non-share-alike, and commercial-allowed) licensed code.

That said, I think the general sentiment is less “what the technology does” and more “who it does it to”. Code completion, for the most part, isn’t deskilling labor, or turning experts into accountability sinks.

Like, I don’t think the Luddites would’ve had a problem with an artisan using a knitting frame in their own home. They were too busy fighting against factories that locked children inside for 18-hour shifts, where they got maimed by the machines or died trapped in fires.


At the time, retailers were the customers. Before online storefronts took off, there was no way to sell one copy to Alice, one copy to Bob… you had to sell 50,000 copies to Best Buy with a promise to buy them back in a year if they don’t sell out. If they tell you up front, “we won’t buy that”, what are you gonna do?



Peter Molyneux Studios presents, a Peter Molyneux production: Peter Molyneux’s Masters of Albion, by Peter Molyneux, featuring Peter Molyneux, and special guest Peter Molyneux




Makes sense they bought Star Wars, so they can legally say “I am altering the deal. Pray I do not alter it further.”

[User was moused for this comment]