nickwitha_k (he/him)
  • 0 Posts
  • 72 Comments
Joined 2Y ago
Cake day: Jul 16, 2023


You caught me. I meant this, but was thinking backwards from the bottom up. Like building the logic and registers required to satisfy the CISC instruction.

Yeah. I’m from more of a SysAdmin/DevOps/(kinda)SWE background so, I tend to think of it in a similar manner to APIs. The x86_64 CISC registers are like a public API and the ??? RISC-y registers are like an internal API and may or may not even be accessible outside of intra-die communication.

This mental space is my thar be dragons and wizards space on the edge of my comprehension and curiosity. The pipelines involved in executing a complex instruction like AVX loading a 512-bit word, while two logical cores are multithreading with cache prediction, along with the DRAM bus width limitations, to run tensor maths – are baffling to me.

Very similar to where I’m at. I’ve finally gotten my AuADHD brain to get Vivado set up for my Zynq dev board and I think I finally have everything that I need to try to unbrick my Fomu (it doesn’t have a hard USB controller so, I have to use a pogo pin jig to try to load a basic USB softcore that will allow it to be programmed normally).
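As an aside on the 512-bit loads from the quote above: from the software side, it really is just one instruction per 512-bit chunk, which happens to be exactly one 64-byte cache line. A minimal sketch in C, assuming an AVX-512-capable CPU and compiling with something like gcc -mavx512f:

```c
#include <immintrin.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* 16 x 32-bit ints = 512 bits = one 64-byte cache line. */
    int32_t data[16] __attribute__((aligned(64))) =
        {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15};

    __m512i v = _mm512_load_si512(data); /* one 512-bit load            */
    v = _mm512_add_epi32(v, v);          /* one add across all 16 lanes */

    int32_t out[16] __attribute__((aligned(64)));
    _mm512_store_si512(out, v);
    printf("%d %d\n", out[0], out[15]);  /* prints: 0 30 */
    return 0;
}
```

The one-liner source belies how much hardware machinery (ports, caches, scheduling) is doing the work underneath, which is exactly what makes it baffling.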

I barely understood the Chips and Cheese article explaining how the primary bottleneck for running LLMs on a CPU is the L2 to L1 cache bus throughput. Conceptually that makes sense, but thinking in terms of the actual hardware, I can’t answer, “why aren’t AI models packaged and processed in blocks specifically sized for this cache bus limitation”. If my cache bus is the limiting factor, dual threading for logical cores seems like asinine stupidity that poisons the cache. Or why an OS CPU scheduler is not equipped to automatically detect or flag tensor math and isolate threads from kernel interrupts is beyond me.

Mind sharing that article?
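On the block-sizing question in the quote above, for what it’s worth: optimized math kernels do essentially that, in software. It’s called cache blocking or tiling, and BLAS-style GEMM libraries tune tile sizes per microarchitecture rather than baking them into the model format. A toy sketch in C, with BLOCK as a made-up stand-in for a properly tuned, cache-sized value:

```c
#include <stddef.h>

/* Toy cache-blocked matrix multiply: C += A * B, all n x n row-major.
 * BLOCK stands in for a value chosen so the working tiles stay
 * resident in L1/L2 instead of thrashing the cache bus. */
#define BLOCK 64

void matmul_blocked(size_t n, const float *A, const float *B, float *C) {
    for (size_t ii = 0; ii < n; ii += BLOCK)
        for (size_t kk = 0; kk < n; kk += BLOCK)
            for (size_t jj = 0; jj < n; jj += BLOCK)
                /* Operate on one BLOCK x BLOCK tile at a time. */
                for (size_t i = ii; i < ii + BLOCK && i < n; i++)
                    for (size_t k = kk; k < kk + BLOCK && k < n; k++) {
                        float a = A[i * n + k];
                        for (size_t j = jj; j < jj + BLOCK && j < n; j++)
                            C[i * n + j] += a * B[k * n + j];
                    }
}
```

Real libraries layer vectorization and prefetching on top, which is part of why the model packaging itself doesn’t need to care about any one CPU’s cache dimensions.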

Adding a layer to that and saying all of this is RISC cosplaying as CISC is my mental party clown cum serial killer… “but… but… it is 1 instruction…”

I think that it’s like the API way of thinking of it above but, I could be entirely incorrect. I don’t think I am though. Because the registers that programs interact with are standardized, those probably are “actual” x86, in that they are expected to handle x86 instructions in the spec-defined manner. Past those externally-addressable registers is just a black box that does the work to allow the registers to act in an expected manner. Some of that black box also must include programmable logic to allow microcode to be a thing.

It’s a crazy and magical side of technology.
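To stretch the API analogy into code: a toy sketch in C, where every name is invented for illustration and none of this is how any real core is implemented.

```c
#include <stdint.h>

/* Hypothetical "internal API": simple RISC-like micro-ops that the
 * black box actually executes. All names invented for illustration. */
static uint64_t uop_load(const uint64_t *addr)        { return *addr; }
static uint64_t uop_add(uint64_t a, uint64_t b)       { return a + b; }
static void     uop_store(uint64_t *addr, uint64_t v) { *addr = v; }

/* Hypothetical "public API": a CISC-style memory-to-memory add. The
 * signature is the stable, documented contract; the micro-op sequence
 * behind it (think microcode) can change between generations without
 * breaking callers. */
void cisc_add_mem(uint64_t *dst, const uint64_t *src) {
    uint64_t a = uop_load(dst);
    uint64_t b = uop_load(src);
    uop_store(dst, uop_add(a, b));
}
```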


Like much of the newer x86 stuff is RISC-like wrappers on CISC instructions under the hood

I think it’s actually the opposite. The actual execution units tend to be more RISC-like but the “public” interfaces are CISC to allow backwards compatibility. Otherwise, they would have to publish new developer docs for every microcode update or generational change.

Not necessarily a bad strategy but, definitely results in greater complexity over time to translate between the “external” and “internal” architecture and also results in challenges in really tuning the interfacing between hardware and software because of the abstraction layer.


Could someone explain the “mouse mode”? Is it at all different from the Steam Deck’s ability to map trackpads and thumbsticks as mouse inputs?


I haven’t played Call of Lootbox in a long while. Bet they still haven’t done anything to fix spawns.


Epic signed a contract with the union stating that they would bargain (and then willfully violated it).




I think that tunnel magnetoresistance (TMR) sensors are more favored for 3rd-party sticks. They’ve significant advantages over Hall Effect sensors in latency, power consumption, and, apparently, resolution. Plus, they operate on electrical principles more similar to the traditional pot-based sticks, so, they require less effort to design around.
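A sketch of what I mean about the electrical similarity: to the controller firmware, a pot stick and a TMR stick can both just look like an analog voltage on an ADC pin. Here adc_read() is a hypothetical stand-in for a real platform’s HAL call, stubbed so the sketch compiles.

```c
#include <stdint.h>

/* Hypothetical stand-in for a platform ADC read (12-bit, 0..4095);
 * real firmware would call its HAL here. Stubbed at mid-scale. */
static uint16_t adc_read(int channel) { (void)channel; return 2048; }

typedef struct { int16_t x, y; } stick_t;

/* Whether the voltage comes from a potentiometer wiper or a TMR
 * sensor's conditioning circuit, the read path is identical. */
stick_t read_stick(void) {
    stick_t s;
    s.x = (int16_t)(adc_read(0) - 2048); /* recenter around 0 */
    s.y = (int16_t)(adc_read(1) - 2048);
    return s;
}
```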


SAG et al have already more or less abandoned voice actors.

As one who has a spouse who is a member, yup. I think that it’s elitism sabotaging the solidarity. Silver screen actors want to see themselves as better than voice/video game actors so, instead of pushing for massive membership drives and organizing an industry that desperately needs it, they sell them down the river with contracts with AI companies. It’s a bit infuriating. SAG has the resources to destroy non-union acting in video games but seldom shows any signs of being inclined to stand by their colleagues.


As a preface, I used to do this a lot on Reddit. My hobby (sounds odd) was to make a little old-school-blog-style post, detailing what I found interesting in gaming in the last week or so. I got a name for it, for a time, but having long-since abandoned reddit I thought I might try the same thing here, if you’ll indulge me!

Happy to have you posting!


I get that when you’re spending $100M+ on game development, but a game needs to have actual value to the consumer, it has to be entertainment, and entertainment is art.

A side point on this: maybe some accounting transparency would help too. We know that $100M+ isn’t going to the developers, as they are some of the most underpaid tech workers. How much of a given game’s budget is actually going to compensate those directly contributing to it vs administration/execs?


Most investors are going to care about what kind of return they’re making. It’s the capital they provide that pays the paychecks.

Maybe that’s the problem. Valve did pretty well for themselves, even before Steam, without putting investors in charge of their direction.

If you want to do volunteer work on video games – I have – then that’s not an issue.

I have indeed worked on my own and others’ projects without financial gain but that’s orthogonal to my point.

But typically games are made by paid workers, and those workers won’t work without their paychecks.

The games industry is full of chronically under-compensated workers. Again, nowhere did I advocate for people to work for free for commercial enterprises or anything of the like.

So they’re going to need to attract investors.

That’s a pretty good example of the False Dichotomy fallacy. There are numerous alternatives that don’t involve prioritizing profit over the product or service that a business produces.


… If he wants to be a hedge fund exec, he should just go do that. The point of a business, contrary to the Chicago School MBA nonsense, is not to generate profit. It is to provide a good or service that would otherwise be impractical for an individual to produce, in a financially sustainable manner.


When I worked at a web host, we had people like that. Being support sucked. Like, yes, it sucks that your e-commerce site that uses horrifically outdated software is offline but, we don’t offer quad nines, especially not on a $35/year shared hosting plan. And, honestly, Drew, your site gets single-digit visits per month and sells erotica based upon the premise of Edgar Allan Poe being transported to 1990s Brooklyn and working as an apartment building super. At best, you’re breaking even on that hosting bill.



The velocity that RISC-V development is seeing is remarkable. The first commercial ARM processor (ARM1) started design in 1983 and was released in 1985. The first Linux-capable ARM processor was the ARM2, released in 1986. The first 64-bit variant was Armv8-A, released in 2011, with Armv9.6-A in 2024.

RISC-V was first released in 2014 and the stable privileged and unprivileged ISAs were released in 2021 and 2019 (including the first stable rv64I), respectively. The first Linux-capable RISC-V processor released was the SiFive Freedom U540, which came out in 2018. The current rv64I variant of RISC-V is at 2.1, released in 2022.

I’m optimistic that RISC-V can and will compete, given its compressed development timeframe and mass adoption in MCUs and coprocessors. The big hurdles really are getting rid of the hardware implementation bugs (ex. failure to correctly implement IEEE754 floats in THead C906 and C910 CPUs), and getting software support and optimizations.
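For illustration, here’s a quick smoke test for one class of float non-conformance (whether subnormal handling is the exact failure in those T-Head parts is my assumption, not something I’ve verified):

```c
#include <stdio.h>
#include <math.h>
#include <float.h>

/* Dividing the smallest normal double by 2 must yield a subnormal
 * under IEEE 754. A flush-to-zero FPU prints "no ... 0" instead. */
int main(void) {
    volatile double x = DBL_MIN;  /* smallest normal double */
    volatile double sub = x / 2.0;
    printf("subnormal preserved? %s (value: %g)\n",
           (sub > 0.0 && !isnormal(sub)) ? "yes" : "no", sub);
    return 0;
}
```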

There are several HPC companies iterating towards commercial datacenter deployment, of special note being Tenstorrent, which has both an interesting, novel architecture and Jim Keller (known for AMD K8, AMD64, and Apple’s A-series) as CTO. They may be able to displace NVIDIA a bit in the DC AI/ML space, which could help to force GPU prices to get more reasonable, which would be nice.

Overall, yeah, rv64 has a good deal of catching up to do but, with the ISA not requiring special, exorbitant licensing, hardware development is moving much faster than expected and may be competitive with ARM in more spaces soon, if they don’t succeed in using governments as an anti-competitive bludgeon.








Sith are a fictional sect of religious space wizards from a space opera. While they may have inspiration from religious sects of reality, they are very much not real. So, whether or not they deal in absolutes has absolutely no consequences to reality outside of the Star Wars fandom.


And there’s also resilience against natural disasters. Having processor manufacturing limited to one place is just a bad idea.


As for useful implementation, my cousin is an orthopedic surgeon and they use VR headsets, 3D x-ray scanners, 3D printers, and a whole bunch of sci-fi stuff to prep for operations, but they are not using a Meta Quest 2; we’re talking $50k headsets and million-dollar equipment. None of that does anything to the gaming market.

That’s really awesome and I love seeing that the tech is actually seeing good uses.

Yeah. A lot of what you’re saying parallels my thoughts. The PC and console gaming market didn’t exist until there were more practical, non-specialty uses for computing and, importantly, affordability. To me, it seems that the manufacturers are trying to skip that and jump straight to the lucrative software part, while also skipping the part where you pay people fair wages to develop (the games industry is super exploitative of devs). Or, like The Company Formerly Known as Facebook, they use VR devices as another tool to harvest personal information for profit (head-tracking data can be used to identify people, similar to gait analysis), rather than having any interest in actually developing VR long-term.

Much as I’m not a fan of Apple or the departed sociopath that headed it, a company similar to Apple in its early years is probably what’s needed: people willing to actually take on some risk for the long haul to develop the hardware and base software to make a practical “personal computer” of VR.

When I can code in one 10 hours a day without fucking up my eyes, vomiting, sweating like a pig, and getting neck strain, it will have the possibility to take over the computer market. Until then, it’s a gimmick.

Absolutely agreed. Though, I’d note that there is tech available for this use case. I’ve been using Xreal Airs for several years now as a full monitor replacement (Viture is more FOSS friendly at this time). Bird bath optics are superior for productivity uses, compared to waveguides and lensed optics used in VR. In order to have readable text that doesn’t strain the eyes, higher pixels-per-degree are needed, not higher FOV.
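Back-of-the-envelope on that, using approximate spec-sheet numbers rather than measurements (pixels-per-degree is roughly horizontal resolution divided by horizontal FOV):

```c
#include <stdio.h>

/* Rough pixels-per-degree comparison; resolutions and FOVs are
 * approximate published figures, not measurements. */
int main(void) {
    double birdbath = 1920.0 / 46.0; /* e.g. Xreal Air: 1080p over ~46 deg   */
    double vr       = 1832.0 / 97.0; /* e.g. Quest 2: ~97 deg horizontal FOV */
    printf("bird bath: ~%.0f PPD, VR headset: ~%.0f PPD\n", birdbath, vr);
    return 0;
}
```

Roughly double the PPD from a similar panel resolution is why text is legible on the glasses but a strain in most VR headsets.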

The isolation of VR is also a negative in many cases as interacting and being aware of the real world is frequently necessary in productivity uses (both for interacting with people and mitigating eye strain). Apple was ALMOST there with their Vision Pro but tried to be clever, rather than practical. They should not have bothered with the camera and just let the real world in, unfiltered.


I think that the biggest problem is the lack of investment and willingness to take on risk. Every company just seems to want a quick cash grab “killer app” but doesn’t want to sink in the years of development of practical things that aren’t as flashy but solve real-world problems. Because that’s hard and isn’t likely to make the line go up every quarter.


The company that is being struck because it tried to sneak around the union contracts by making a shell company to do casting calls below the contract rates? That Riot Games? I’m not shocked.


Yeah. It’s crazy to think that USB can now handle 240W (48V at 5A, under the PD 3.1 EPR spec). And yes, the naming conventions are terribad but, at least the standards are actually open, unlike VESA’s.


Generally, I find that the discs are often little more than licenses and require the actual game to be downloaded.


I mean, if it has USB PD, that could be a not so unreasonable thing. That is, supposing USB PD supplies built into outlets become common.


Yeah… The singleplayer games being unplayable is extremely stupid. Was going to play Dying Light, which came out a long time ago, but zero games, disc or not, will start.


Best psychological horror game in years.


I agree with you, to an extent. I would say it’s a lot more complicated than that with World of Warcraft, which is an MMO and does not revolve around gambling except in the aspect of random number generated loot.

The way that drops work is literally the same approach as a slot machine, but with more steps to take up your time with boring shit and require more of your life to be dedicated to it, so that there is less risk of you getting distracted by things like hobbies or games with finite stories and quality writing. A one-armed bandit might snag a handful of whales that spend all of their time feeding the machine. The Wrath of the Lich Bandit gets a much larger percentage of its users in front of it for a larger amount of their time, increasing the ratio of addicts/whales caught. Add in expansions, real money auctions, etc and you’ve got something much more fucked up than anything on a Vegas casino floor.


Agreed. Once they realized how much money they could make by designing MMOs to be addictive, and handed the reins to MBAs, they went down the crapper.



I think a big problem is (as of the last time I checked) the complete lack of anyone making practical things for VR. Not saying that everything needs to be practical to justify its existence but, I think that VR companies have been continually trying to skip ahead to the equivalent of where computing is now, ignoring the history of computers being primarily targeted to research and practical applications before they were adopted en masse and provided a lucrative market. So, instead, they just keep making glorified tech demos, hoping that someone else will do the hard work and they can rake in easy money by forcing them through app stores.

TL;DR: I think that short-sighted, profit-driven decision making is the reason that VR isn’t yet anything more than a niche.


Disclosure: I don’t play CoD anymore (I also think the series is overrated) and would like to see Activision/Blizzard burn.

You are, unfortunately, partially misperceiving and/or mischaracterizing the game and genre. Most are not murder simulators. Some certainly are (ex. Hitman and the skippable single-player bits of one of the CoD games) but those are the minority - the plots generally revolve around military conflicts (whether military conflicts are by definition murder or not is another thing altogether, though I would personally say that they are in the same ethical place) and the multiplayer is basically technological sports. Since the early-2000s at least, they have been propaganda supporting imperialism and normalizing military conflict, though Gen Z seems to have wised up on that.

For the “real world guns” thing, they aren’t anymore, with limited exceptions where a firearms company explicitly partners with them.

Additionally, the correlation between individuals playing violent video games and taking part in violence just does not exist in any research that has been conducted. Violent video games, in fact, allow people to work out aggression and frustration in healthy, non-destructive ways. Your anger is pointed in the wrong direction. If you want to target something that will have an actual impact, dedicate some energy to pushing fixes for wealth inequality and poverty. Yes, that’s harder to pin down but most things worth doing aren’t easy.


Shared libraries/dynamically-linked libraries, along with faster storage, solve a lot of the historical optimization issues. Modern compilers and OSes generally take care of that, if the right flags are used. With very few AAA games using in-house engines, it’s even less work for the studio, supposing the game engine developers are doing their jobs.

That said, you do still have a bit of a point. Proper QA requires running the software on all supported platforms, so, there is a need for additional hardware, if not offloading QA to customers via “Early Access”. Adding to that, there are new CPU architectures in the wild (or soon to be) that weren’t there 5 years ago and may not yet be well-supported with the toolchains.

Gaben is absolutely correct in practice though, it’s a distribution problem. EA, Epic, and the rest trying to force their storefront launchers and invasive DRM that makes the experience worse for end users drives people to pirate more.


Fallout 2 really is the best game not just in the West-coast saga but the entire series.