nickwitha_k (he/him)
  • 0 Posts
  • 79 Comments
Joined 2Y ago
Cake day: Jul 16, 2023


Someone downvoted because they don’t want to remember Bethesda’s shenanigans?


It’s really not just that it is/was cheaper. There are cases where, all costs considered, it was actually measurably more expensive. The main reason for off-shoring is purely ideological. American capital has nothing but disdain for workers and hatred for organized labor. Off-shoring was intended to crush unions, while giving goods a temporarily lower price to prevent the populace from understanding how much they were getting screwed.

Chip production is a highly specialized field, where workers could readily demand concessions from capital, were they on anything resembling stable ground. That was not to be allowed.


It’s the principle of committing wage theft: engaging in a contract (possibly verbal) and then withholding the promised compensation.


Lifetime plans are no longer available so, we’ve done you a favor and enrolled you in the pay-per-joke plan without your consent.


There should be criminal penalties for that kind of shit. Letting management off the hook without any personal consequences just encourages it.


I’m in a similar boat. I actively dislike incest/fauxcest porn and porn games. I don’t find it “naughty taboo” but gross and often fetishizing straight-up abuse. I’m not sad to see such games and videos go, especially since I can never quite filter them out because they are never consistently tagged. I am worried that they will try pulling the same shit as Tumblr and OF.



You caught me. I meant this, but was thinking backwards from the bottom up. Like building the logic and registers required to satisfy the CISC instruction.

Yeah. I’m from more of a SysAdmin/DevOps/(kinda)SWE background so, I tend to think of it in a similar manner to APIs. The x86_64 CISC registers are like a public API and the ??? RISC-y registers are like an internal API and may or may not even be accessible outside of intra-die communication.

This mental space is my thar be dragons and wizards space on the edge of my comprehension and curiosity. The pipelines involved to execute a complex instruction like AVX loading a 512 bit word, while two logical cores are multi threading with cache prediction, along with the DRAM bus width limitations, to run tensor maths – are baffling to me.

Very similar to where I’m at. I’ve finally gotten my AuADHD brain to get Vivado set up for my Zynq dev board and I think I finally have everything that I need to try to unbrick my Fomu (it doesn’t have a hard USB controller so, I have to use a pogo pin jig to try to load a basic USB softcore that will allow it to be programmed normally).

I barely understood the Chips and Cheese article explaining how the primary bottleneck for running LLMs on a CPU is the L2 to L1 cache bus throughput. Conceptually that makes sense, but thinking in terms of the actual hardware, I can’t answer, “why aren’t AI models packaged and processed in blocks specifically sized for this cache bus limitation”. If my cache bus is the limiting factor, dual threading for logical cores seems like asinine stupidity that poisons the cache. And why an OS CPU scheduler is not equipped to automatically detect or flag tensor math and isolate those threads from kernel interrupts is beyond me.

Mind sharing that article?
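
On the “why aren’t models packaged in cache-sized blocks” question above: at the matrix-multiply level they largely are, via cache blocking/tiling. Here’s a minimal sketch of the idea, assuming a toy matmul and a made-up tile size; real inference runtimes derive block sizes from the actual cache hierarchy and do this far more aggressively.

```python
# Toy illustration of cache blocking (tiling): instead of streaming whole
# rows/columns through the cache, work on small square tiles so each value
# pulled across the L2->L1 bus gets reused many times before eviction.
# BLOCK is a made-up number; real code sizes it from the actual cache.
import numpy as np

BLOCK = 64  # hypothetical tile edge


def blocked_matmul(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    c = np.zeros((n, m), dtype=a.dtype)
    for i0 in range(0, n, BLOCK):
        for j0 in range(0, m, BLOCK):
            for k0 in range(0, k, BLOCK):
                # Each tile-by-tile product only touches ~3 * BLOCK^2 values,
                # which is what keeps the cache-bus traffic down.
                c[i0:i0 + BLOCK, j0:j0 + BLOCK] += (
                    a[i0:i0 + BLOCK, k0:k0 + BLOCK]
                    @ b[k0:k0 + BLOCK, j0:j0 + BLOCK]
                )
    return c
```

So part of the answer is that the packaging does happen, just inside the GEMM kernels rather than in the model file format where you’d see it.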

Adding a layer to that and saying all of this is RISC cosplaying as CISC is my mental party clown cum serial killer… “but… but… it is 1 instruction…”

I think that it’s like the above way of thinking of it like APIs but, I could be entirely incorrect. I don’t think I am though. Because the registers that programs interact with are standardized, those probably are “actual” x86, in that they are to be expected to handle x86 instructions in the spec defined manner. Past those externally-addressable registers is just a black box that does the work to allow the registers to act in an expected manner. Some of that black box also must include programmable logic to allow microcode to be a thing.

It’s a crazy and magical side of technology.


Like much of the newer x86 stuff is RISC-like wrappers on CISC instructions under the hood

I think it’s actually the opposite. The actual execution units tend to be more RISC-like but the “public” interfaces are CISC to allow backwards compatibility. Otherwise, they would have to publish new developer docs for every microcode update or generational change.

Not necessarily a bad strategy but, it definitely results in greater complexity over time to translate between the “external” and “internal” architectures, and also results in challenges in really tuning the interfacing between hardware and software because of the abstraction layer.
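
To make that “external CISC interface, internal RISC-like execution” idea concrete, here’s a toy sketch. The instruction and micro-op names are invented for illustration and bear no resemblance to a real x86 decoder.

```python
# Toy model of the "CISC front door, RISC-like back end" split.
# One architectural (public-API) instruction fans out into simpler internal
# micro-ops; "tmp0" stands in for a hidden physical register that programs
# never see directly.
from dataclasses import dataclass


@dataclass
class MicroOp:
    op: str
    args: tuple


def decode(instruction: str) -> list[MicroOp]:
    """Translate one architectural instruction into internal micro-ops."""
    mnemonic, *operands = instruction.replace(",", " ").split()
    if mnemonic == "ADD" and operands[1].startswith("["):
        # Memory-operand ADD: one CISC instruction, three internal steps.
        addr = operands[1].strip("[]")
        return [
            MicroOp("load", ("tmp0", addr)),        # read memory into hidden register
            MicroOp("add", (operands[0], "tmp0")),  # the actual ALU work
            MicroOp("retire", (operands[0],)),      # commit architectural state
        ]
    if mnemonic == "ADD":
        return [
            MicroOp("add", (operands[0], operands[1])),
            MicroOp("retire", (operands[0],)),
        ]
    raise NotImplementedError(mnemonic)


print(decode("ADD rax, [rbx]"))
```

The point is just the shape: the externally-visible registers and instructions stay stable, while everything behind the decoder is free to change between generations or microcode updates.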


Could someone explain the “mouse mode”? Is it at all different from the Steam Deck’s ability to map trackpads and thumbsticks as mouse inputs?


I haven’t played Call of Lootbox in a long while. Bet they still haven’t done anything to fix spawns.


Epic signed a contract with the union stating that they would bargain (and then willfully violated it).




I think that tunnel magneto-resistance (TMR) sensors are more favored for 3rd-party sticks. They’ve significant advantages over Hall Effect sensors in latency, power consumption, and, apparently, resolution. Plus, they operate on electrical principles more similar to traditional pot-based sticks, so, they require less effort to design around.


SAG et al have already more or less abandoned voice actors.

As one who has a spouse who is a member, yup. I think that it’s elitism sabotaging the solidarity. Silver screen actors want to see themselves as better than voice/video game actors so, instead of pushing for massive membership drives and organizing an industry that desperately needs it, they sell them down the river with contracts with AI companies. It’s a bit infuriating. SAG has the resources to destroy non-union acting in video games but seldom shows any signs of being inclined to stand by their colleagues.


As a preface, I used to do this a lot on Reddit. My hobby (sounds odd) was to make a little old-school-blog-style post, detailing what I found interesting in gaming in the last week or so. I got a name for it, for a time, but having long-since abandoned reddit I thought I might try the same thing here, if you’ll indulge me!

Happy to have you posting!


I get that when you spending 100m+ on game development, but a game needs to have actual value to the consumer, it has to be entertainment, and entertainment is art.

A side point on this: maybe some accounting transparency would help too. We know that $100M+ isn’t going to the developers, as they are some of the most underpaid tech workers. How much of a given game’s budget is actually going to compensate those directly contributing to it vs administration/execs?


Most investors are going to care about what kind of return they’re making. It’s the capital they provide that pays the paychecks.

Maybe that’s the problem. Valve did pretty well for themselves, even before Steam, without putting investors in charge of their direction.

If you want to do volunteer work on video games – I have – then that’s not an issue.

I have indeed worked on my own and others’ projects without financial gain but that’s orthogonal to my point.

But typically games are made by paid workers, and those workers won’t work without their paychecks.

The games industry is full of chronically under-compensated workers. Again, nowhere did I advocate for people to work for free for commercial enterprises or anything of the like.

So they’re going to need to attract investors.

That’s a pretty good example of the False Dichotomy fallacy. There are numerous alternatives that don’t involve prioritizing profit over the product or service that a business produces.


… If he wants to be a hedge fund exec, he should just go do that. The point of a business, contrary to the Chicago School MBA nonsense, is not to generate profit. It is to make a good or service that would otherwise be impractical for an individual, in a financially sustainable manner.


When I worked at a web host, we had people like that. Being support sucked. Like, yes, it sucks that your e-commerce site that uses horrifically outdated software is offline but, we don’t offer quad nines, especially not on a $35/year shared hosting plan. And, honestly Drew, your site gets single-digit visits per month and sells erotica based upon the premise of Edgar Allan Poe being transported to 1990s Brooklyn and working as an apartment building super. At best, you’re breaking even on that hosting bill.



The velocity that RISC-V development is seeing is remarkable. The first commercial ARM processor (ARM1) started design in 1983 and was released in 1985. The first Linux-capable ARM processor was the ARM2, released in 1986. The first 64-bit variant was Armv8-A, released in 2011, with Armv9.6-A in 2024.

RISC-V was first released in 2014 and the stable privileged and unprivileged ISAs were released in 2021 and 2019 (including the first stable rv64I), respectively. The first Linux-capable RISC-V processor released was the SiFive Freedom U540, which came out in 2018. The current rv64I variant of RISC-V is at 2.1, released in 2022.

I’m optimistic that RISC-V can and will compete, given its compressed development timeframe and mass adoption in MCUs and coprocessors. The big hurdles really are getting rid of the hardware implementation bugs (e.g. the failure to correctly implement IEEE 754 floats in THead C906 and C910 CPUs), and getting software support and optimizations.

There are several HPC companies iterating towards commercial datacenter deployment, of special note being Tenstorrent, which has both an interesting, novel architecture and Jim Keller (known for AMD K8, AMD64, and Apple M-series) as CTO. They may be able to displace NVIDIA a bit in the DC AI/ML space, which could help to force GPU prices to get more reasonable, which would be nice.

Overall, yeah, rv64 has a good deal of catching up to do but, with the ISA not requiring special, exorbitant licensing, hardware development is moving much faster than expected and may be competitive with ARM in more spaces soon, if they don’t succeed in using governments as an anti-competitive bludgeon.








Sith are a fictional sect of religious space wizards from a space opera. While they may have inspiration from religious sects of reality, they are very much not real. So, whether or not they deal in absolutes has absolutely no consequences to reality outside of the Star Wars fandom.


And there’s also resilience against natural disasters. Having processor manufacturing limited to one place is just a bad idea.


As for useful implementation, my cousin is an orthopedic surgeon and they use VR headset and 3D x-ray scanner, 3d printers and a whole bunch of sci-fi stuff to prep for operation, but they are not using a meta quest2, we’re talking 50k headset and million dollar equipment. None of that does anything to the gaming market.

That’s really awesome and I love seeing that the tech is actually seeing good uses.

Yeah. A lot of what you’re saying parallels my thoughts. The PC and console gaming market didn’t exist until there were more practical, non-specialty uses for computing and, importantly, affordability. To me, it seems that the manufacturers are trying to skip that and jump straight to the lucrative software part, while also skipping the part where you pay people fair wages to develop (the games industry is super exploitative of devs), or, like The Company Formerly Known as Facebook, using VR devices as another tool to harvest personal information for profit (head tracking data can be used to identify people, similar to gait analysis), rather than having any interest in actually developing VR long-term.

Much as I’m not a fan of Apple or the departed sociopath that headed it, a similar company to its early years is probably what’s needed; people willing to actually take on some risk for the long-haul to develop the hardware and base software to make a practical “personal computer” of VR.

When I can code in one 10 hours a day without fucking up my eyes, vomiting myself, sweating like a pig and getting neck strain it will have the possibility to take over the computer market, until then, it’s a gimmick.

Absolutely agreed. Though, I’d note that there is tech available for this use case. I’ve been using Xreal Airs for several years now as a full monitor replacement (Viture is more FOSS friendly at this time). Bird bath optics are superior for productivity uses, compared to waveguides and lensed optics used in VR. In order to have readable text that doesn’t strain the eyes, higher pixels-per-degree are needed, not higher FOV.
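
Rough back-of-the-envelope on the pixels-per-degree point; the resolutions and FOVs below are ballpark figures for a birdbath-style display versus a typical VR panel, not specs for any particular product.

```python
# Pixels-per-degree (PPD) is just horizontal resolution / horizontal FOV.
def ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    return horizontal_pixels / horizontal_fov_deg


# Birdbath glasses: ~1920 px over a narrow ~46 degree FOV.
print(f"birdbath glasses: {ppd(1920, 46):.0f} PPD")   # ~42 PPD

# Typical VR headset: ~2000 px per eye over a wide ~100 degree FOV.
print(f"VR headset:       {ppd(2000, 100):.0f} PPD")  # ~20 PPD

# Comfortable text reading is generally quoted at roughly 50-60+ PPD, which
# is why wide-FOV VR panels still look fuzzy for work despite high resolution.
```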

The isolation of VR is also a negative in many cases as interacting and being aware of the real world is frequently necessary in productivity uses (both for interacting with people and mitigating eye strain). Apple was ALMOST there with their Vision Pro but tried to be clever, rather than practical. They should not have bothered with the camera and just let the real world in, unfiltered.


I think that the biggest problem is the lack of investment and willingness to take on risk. Every company just seems to want a quick cash grab “killer app” but doesn’t want to sink in the years of development of practical things that aren’t as flashy but solve real-world problems. Because that’s hard and isn’t likely to make the line go up every quarter.


The company that is being struck because it tried to sneak around the union contracts by making a shell company to do casting calls below the contract rates? That Riot Games? I’m not shocked.


Yeah. It’s crazy to think that USB can now handle 240W. And yes, the naming conventions are terribad but, at least the standards are actually open, unlike VESA’s.
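
For reference, the 240W figure comes from the USB PD 3.1 Extended Power Range profile (48V at 5A); the older Standard Power Range topped out at 100W.

```python
# USB PD power is just voltage * current at the negotiated profile.
print(f"EPR max: {48 * 5} W")  # 240 W (PD 3.1 Extended Power Range)
print(f"SPR max: {20 * 5} W")  # 100 W (classic 20 V / 5 A profile)
```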


Generally, I find that the discs are often little more than licenses and require the actual game to be downloaded.


I mean, if it has USB PD, that could be a not-so-unreasonable thing. That is, supposing USB PD supplies become common as outlets.


Yeah… The singleplayer games being unplayable is extremely stupid. I was going to play Dying Light, which came out a long time ago, but zero games, disc or not, will start.


Best psychological horror game in years.