nickwitha_k (he/him)
  • 0 Posts
  • 63 Comments
Joined 2Y ago
Cake day: Jul 16, 2023


SAG et al have already more or less abandoned voice actors.

As someone whose spouse is a member: yup. I think that it’s elitism sabotaging the solidarity. Silver-screen actors want to see themselves as better than voice/video game actors so, instead of pushing for massive membership drives and organizing an industry that desperately needs it, they sell them down the river with contracts with AI companies. It’s a bit infuriating. SAG has the resources to destroy non-union acting in video games but seldom shows any signs of being inclined to stand by their colleagues.


As a preface, I used to do this a lot on Reddit. My hobby (sounds odd) was to make a little old-school-blog-style post, detailing what I found interesting in gaming in the last week or so. I got a name for it, for a time, but having long-since abandoned reddit I thought I might try the same thing here, if you’ll indulge me!

Happy to have you posting!


I get that when you’re spending $100M+ on game development, but a game needs to have actual value to the consumer, it has to be entertainment, and entertainment is art.

A side point on this: maybe some accounting transparency would help too. We know that that $100M+ isn’t going to the developers as they are some of the most underpaid tech workers. How much of a given game’s budget is actually going to compensate those directly contributing to it vs administration/execs?


Most investors are going to care about what kind of return they’re making. It’s the capital they provide that pays the paychecks.

Maybe that’s the problem. Valve did pretty well for themselves, even before Steam, without putting investors in charge of their direction.

If you want to do volunteer work on video games – I have – then that’s not an issue.

I have indeed worked on my own and others’ projects without financial gain, but that’s orthogonal to my point.

But typically games are made by paid workers, and those workers won’t work without their paychecks.

The games industry is full of chronically under-compensated workers. Again, nowhere did I advocate for people to work for free for commercial enterprises or anything of the like.

So they’re going to need to attract investors.

That’s a pretty good example of the False Dichotomy fallacy. There are numerous alternatives that don’t involve prioritizing profit over the product or service that a business produces.


… If he wants to be a hedge fund exec, he should just go do that. The point of a business, contrary to the Chicago School MBA nonsense, is not to generate profit. It is to make a good or service that would otherwise be impractical for an individual, in a financially sustainable manner.


When I worked at a web host, we had people like that. Being support sucked. Like, yes, it sucks that your e-commerce site that uses horrifically outdated software is offline but we don’t offer quad nines, especially not on a $35/year shared hosting plan. And, honestly Drew, your site gets single-digit visits per month and sells erotica based upon the premise of Edgar Allan Poe being transported to 1990s Brooklyn and working as an apartment building super. At best, you’re breaking even on that hosting bill.



The velocity that RISC-V development is seeing is remarkable. The first commercial ARM processor (ARM1) started design in 1983 and was released in 1985. The first Linux-capable ARM processor was the ARM2, released in 1986. The first 64-bit variant was Armv8-A, released in 2011, with Armv9.6-A in 2024.

RISC-V was first released in 2014 and the stable privileged and unprivileged ISAs were released in 2021 and 2019 (including the first stable rv64I), respectively. The first Linux-capable RISC-V processor released was the SiFive Freedom U540, which came out in 2018. The current rv64I variant of RISC-V is at 2.1, released in 2022.

I’m optimistic that RISC-V can and will compete, given its compressed development timeframe and mass adoption in MCUs and coprocessors. The big hurdles really are getting rid of the hardware implementation bugs (ex. failure to correctly implement IEEE754 floats in THead C906 and C910 CPUs), and getting software support and optimizations.
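Hardware FP bugs like those are usually caught by conformance checks against identities that IEEE 754 guarantees. A minimal sketch of the kind of check involved (Python here purely for illustration; real conformance suites are vastly more thorough):

```python
import math
import struct

def bits(x: float) -> str:
    """Raw IEEE 754 binary64 encoding of a Python float."""
    return struct.pack(">d", x).hex()

# Round-to-nearest-even: 2**-53 is exactly half a ULP of 1.0,
# so 1.0 + 2**-53 must round back down to 1.0 on conforming hardware.
assert 1.0 + 2**-53 == 1.0

# Signed zero and special values must behave per the standard.
assert math.copysign(1.0, -0.0) == -1.0
assert math.isnan(float("inf") - float("inf"))

print(bits(1.0))  # 3ff0000000000000 on any conforming implementation
```

A CPU that fudges rounding or special-value handling fails checks like these, which is roughly how the THead issues surfaced.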

There are several HPC companies iterating towards commercial datacenter deployment, of special note being Tenstorrent, which both has an interesting, novel architecture and Jim Keller (known for AMD K8, AMD64, and Apple M-series) as CTO. They may be able to displace NVIDIA a bit in the DC AI/ML space, which could help force GPU prices to get more reasonable, which would be nice.

Overall, yeah, rv64 has a good deal of catching up to do but, with the ISA not requiring special, exorbitant licensing, hardware development is moving much faster than expected and may be competitive with ARM in more spaces soon, if the incumbents don’t succeed in using governments as an anti-competitive bludgeon.








Sith are a fictional sect of religious space wizards from a space opera. While they may have inspiration from religious sects of reality, they are very much not real. So, whether or not they deal in absolutes has absolutely no consequences to reality outside of the Star Wars fandom.


And there’s also resilience against natural disasters. Having processor manufacturing limited to one place is just a bad idea.


As for useful implementation, my cousin is an orthopedic surgeon and they use VR headset and 3D x-ray scanner, 3d printers and a whole bunch of sci-fi stuff to prep for operation, but they are not using a meta quest2, we’re talking 50k headset and million dollar equipment. None of that does anything to the gaming market.

That’s really awesome and I love seeing that the tech is actually seeing good uses.

Yeah. A lot of what you’re saying parallels my thoughts. The PC and console gaming market didn’t exist until there were more practical, non-specialty uses for computing and, importantly, affordability. To me, it seems that the manufacturers are trying to skip that and just get to the lucrative software part, while also skipping the part where you pay people fair wages to develop (the games industry is super exploitative of devs). Or, like The Company Formerly Known as Facebook, they use VR devices as another tool to harvest personal information for profit (head-tracking data can be used to identify people, similar to gait analysis), rather than having any interest in actually developing VR long-term.

Much as I’m not a fan of Apple or the departed sociopath that headed it, a company like early-years Apple is probably what’s needed: people willing to actually take on some risk for the long haul to develop the hardware and base software to make a practical “personal computer” of VR.

When I can code in one 10 hours a day without fucking up my eyes, vomiting myself, sweating like a pig and getting neck strain it will have the possibility to take over the computer market, until then, it’s a gimmick.

Absolutely agreed. Though, I’d note that there is tech available for this use case. I’ve been using Xreal Airs for several years now as a full monitor replacement (Viture is more FOSS-friendly at this time). Birdbath optics are superior for productivity uses, compared to the waveguides and lensed optics used in VR. In order to have readable text that doesn’t strain the eyes, higher pixels-per-degree are needed, not higher FOV.
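To put rough numbers on the PPD-vs-FOV trade-off, a quick sketch (the resolution and FOV figures are illustrative, not exact specs for any particular device):

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Average horizontal pixels per degree of field of view."""
    return h_pixels / h_fov_deg

# Hypothetical numbers: a 1920-px-wide panel spread over a
# ~46-degree FOV (birdbath glasses) vs. the same panel stretched
# over 100 degrees (typical VR headset FOV).
glasses = pixels_per_degree(1920, 46)    # ~41.7 PPD
headset = pixels_per_degree(1920, 100)   # 19.2 PPD
print(f"{glasses:.1f} vs {headset:.1f} PPD")
```

Same panel, wildly different text legibility: widening FOV divides the pixel budget across more degrees, which is why big-FOV headsets render blurry terminals.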

The isolation of VR is also a negative in many cases as interacting and being aware of the real world is frequently necessary in productivity uses (both for interacting with people and mitigating eye strain). Apple was ALMOST there with their Vision Pro but tried to be clever, rather than practical. They should not have bothered with the camera and just let the real world in, unfiltered.


I think that the biggest problem is the lack of investment and willingness to take on risk. Every company just seems to want a quick cash grab “killer app” but doesn’t want to sink in the years of development of practical things that aren’t as flashy but solve real-world problems. Because that’s hard and isn’t likely to make the line go up every quarter.


The company that is being struck because it tried to sneak around the union contracts by making a shell company to do casting calls below the contract rates? That Riot Games? I’m not shocked.


Yeah. It’s crazy to think that USB can now handle 240W. And yes, the naming conventions are terribad but, at least the standards are actually open, unlike VESA’s.
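The 240W figure falls straight out of the arithmetic of PD 3.1’s Extended Power Range, which raised the maximum fixed voltage from 20V to 48V at the same 5A cable limit:

```python
# USB PD power tiers: power = voltage * current.
# Standard Power Range tops out at 20 V x 5 A = 100 W;
# Extended Power Range (PD 3.1) adds 28/36/48 V fixed voltages,
# with 48 V x 5 A giving the headline 240 W.
def pd_power(volts: float, amps: float) -> float:
    return volts * amps

assert pd_power(20, 5) == 100   # SPR maximum
assert pd_power(48, 5) == 240   # EPR maximum
```

Pushing voltage up rather than current is what lets the same-gauge cables carry 2.4x the power without melting.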


Generally, I find that the discs are often little more than licenses and the actual game has to be downloaded.


I mean, if it has USB PD, that could be not so unreasonable. That is, supposing USB PD wall outlets become common.


Yeah… The singleplayer games being unplayable is extremely stupid. I was going to play Dying Light, which came out a long time ago, but none of my games, disc or not, will start.



I agree with you, to an extent. I would say it’s a lot more complicated than that with World of Warcraft, which is an MMO and does not revolve around gambling except in the aspect of random-number-generated loot.

The way that drops work is literally the same approach as a slot machine, but with more steps to take up your time with boring shit and require more of your life to be dedicated to it, so that there is less risk of you getting distracted by things like hobbies or games with finite stories and quality writing. A one-armed bandit might snag a handful of whales that spend all of their time feeding the machine. The Wrath of the Lich Bandit gets a much larger percentage of its users in front of it for a larger amount of their time, increasing the ratio of addicts/whales caught. Add in expansions, real-money auctions, etc. and you’ve got something much more fucked up than anything on a Vegas casino floor.
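The slot-machine parallel shows up cleanly in a quick simulation of RNG loot (the 2% drop rate is made up for illustration, not an actual WoW number): expected grind scales as 1/p, exactly like expected pulls on a one-armed bandit.

```python
import random

def runs_until_drop(p: float, rng: random.Random) -> int:
    """Simulate boss kills until the item drops, with per-kill chance p."""
    runs = 1
    while rng.random() >= p:
        runs += 1
    return runs

rng = random.Random(42)
p = 0.02  # hypothetical 2% drop rate
trials = [runs_until_drop(p, rng) for _ in range(100_000)]
mean = sum(trials) / len(trials)
print(f"average kills needed: {mean:.1f} (theory: {1/p:.0f})")
```

Fifty runs on average per item, with a long tail of players who need hundreds — the tail is where the time-sink (and the “just one more raid” hook) lives.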


Agreed. Once they realized how much money they could make by designing MMOs to be addictive, and handed the reins to MBAs, they went down the crapper.



I think a big problem is (as of the last time I checked) the complete lack of anyone making practical things for VR. Not saying that everything needs to be practical to justify its existence but, I think that VR companies have been continually trying to skip ahead to the equivalent of where computing is now, ignoring the history of computers being primarily targeted to research and practical applications before they were adopted en masse and provided a lucrative market. So, instead, they just keep making glorified tech demos, hoping that someone else will do the hard work and they can rake in easy money by forcing them through app stores.

TL;DR: I think that short-sighted, profit-driven decision making is the reason that VR isn’t yet anything more than a niche.


Disclosure: I don’t play CoD anymore (I also think the series is overrated) and would like to see Activision/Blizzard burn.

You are, unfortunately, partially misperceiving and/or mischaracterizing the game and genre. Most are not murder simulators. Some certainly are (ex. Hitman and the skippable single-player bits of one of the CoD games) but those are the minority - the plots generally revolve around military conflicts (whether military conflicts are by definition murder is another thing altogether, though I would personally say that they are in the same ethical place) and the multiplayer is basically technological sports. Since the early 2000s at least, they have been propaganda supporting imperialism and normalizing military conflict, though Gen Z seems to have wised up on that.

For the “real world guns” thing, they aren’t anymore with limited exceptions where a firearms company explicitly partners with them.

Additionally, the correlation between individuals playing violent video games and taking part in violence just does not exist in any research that has been conducted. Violent video games, in fact, allow people to work out aggression and frustration in healthy, non-destructive ways. Your anger is pointed in the wrong direction. If you want to target something that will have an actual impact, dedicate some energy to pushing fixes for wealth inequality and poverty. Yes, that’s harder to pin down but most things worth doing aren’t easy.


Shared libraries/dynamically-linked libraries, along with faster storage, solve a lot of the historical optimization issues. Modern compilers and OSes generally take care of that, if the right flags are used. With very few AAA games using in-house engines, it’s even less work for the studio, supposing the game-engine developers are doing their jobs.
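As a toy illustration of what dynamic linking buys you (assuming a glibc-style Linux system): the program below ships no copy of `cos`; it binds to the system’s shared math library at runtime, so every process on the machine shares one optimized copy.

```python
import ctypes
import ctypes.util

# find_library is a best-effort lookup; "libm.so.6" is the usual
# glibc fallback name if the lookup tools aren't installed.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # 1.0
```

Compiled games do the same thing via the dynamic loader at startup instead of an explicit `CDLL` call, but the payoff is identical: one shared, pre-optimized copy of common code instead of a statically baked-in one per binary.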

That said, you do still have a bit of a point. Proper QA requires running the software on all supported platforms, so, there is a need for additional hardware, if not offloading QA to customers via “Early Access”. Adding to that, there are new CPU architectures in the wild (or soon to be) that weren’t there 5 years ago and may not yet be well-supported with the toolchains.

Gaben is absolutely correct in practice though: it’s a distribution problem. EA, Epic, and the rest trying to force their storefront launchers and invasive DRM that makes the experience worse for end users drives people to pirate more.


Fallout 2 really is the best game not just in the West-coast saga but the entire series.


Because they algorithmically manipulate people to cause outrage and suppress left-of-center views, while also being complicit in mass murder and right-wing voter manipulation campaigns.



Git isn’t very good with large binary files, git blame doubly so.

Good point. I’d hope that there is some equivalent version control though.
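Git LFS is the usual equivalent: large binaries live out-of-band on an LFS server while the repo tracks lightweight pointer files, so clones and history stay fast even with multi-gigabyte assets. A sketch of a `.gitattributes` for 3D assets (the extensions are just examples):

```
*.fbx   filter=lfs diff=lfs merge=lfs -text
*.blend filter=lfs diff=lfs merge=lfs -text
*.psd   filter=lfs diff=lfs merge=lfs -text
```

Running `git lfs track "*.fbx"` writes these lines for you; `git blame` then works on the pointer files (who changed the asset, when), just not on the binary contents themselves.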


Do game devs not have git blame? Honest question. Seems like not having version control on development of 3D assets is a bit of an oversight.



That’s quite a surprise. If it provides Linux-compatible usage, I might get one after all.


That’s one of the few models that I see as promising. Likely not easy though. Have you worked with software engineers? It’s like herding cats much of the time, even when I’m in a peer position.


An opportunity RISC-V will offer to anyone with a billion dollars lying around.

Exactly this. Nvidia and Seagate, among others, have already hopped on this. I hold out hope for more accessible custom processors that would enable hobbyists and smaller companies to join in as well, and make established companies more inclined to try novel designs.

x86 market share is 99.999% driven by published software. Microsoft already tried expanding Windows, and being Microsoft, made half a dozen of the worst decisions simultaneously.

Indeed. I’ve read opinions that that was historically also a significant factor in PowerPC’s failure - no one is going to want to use your architecture if there is no software for it. I’m still rather left scratching my head at a lot of MS’s decisions on their OS and device support. IIRC, they may finally be taking an approach to drivers that’s more similar to Linux’s but, without being a bit more open with their APIs, I’m not sure how that will work.

Linux dorks (hi)

Hello! o/

What’s really going to threaten x86 are user-mode emulators like box86, fex-emu, and qemu-user. That witchcraft turns Windows/x86 binaries into something like Java: it will run poorly, but it will run.

Hrm…I wonder if there’s some middle ground or synergy to be had with the kind of witchcraft that Apple is doing with their Rosetta translation layer (though, I think that also has hardware components).

Right now those projects mostly target ARM, obviously. But there’s no reason they have to. Just melting things down to LLVM or Mono would let any native back-end run up-to-date software on esoteric hardware.

That would be brilliant.


I would’ve had my doubts, until Apple somehow made ARM competitive with x86. A trick they couldn’t pull off with PowerPC.

Yeah. From what I’ve pieced together, Apple’s dropping PowerPC ultimately came down to perf/watt and delays in delivery from IBM of a suitable chip that could be used in a laptop and support 64-bit instructions. x86 beat them to the punch and was MUCH more suitable for laptops.

Interestingly, the mix of a desire for greater vertical integration and chasing perf/watt is likely why they went ARM. With their license, they have a huge amount of flexibility and are able to significantly customize the designs from ARM, letting them optimize in ways that Intel and AMD just wouldn’t allow.

I guess linear speed barely ought to matter, these days, since parallelism is an order-of-magnitude improvement, and scales.

It is definitely a complicated picture when figuring out performance. Lots of potential factors come together to make the whole picture. You’ve got ops per clock cycle per core, physical size of a core (RISC generally has fewer transistors per core, making them smaller and more efficient), integrated memory, on-die co-processors, etc. The more that the angry little pixies can do in a smaller area, the less heat is generated and the faster they can reach their destinations.

ARM, being a mature and customizable RISC arch, really should be able to chomp into x86 market share. RISC-V, while younger, has been able to grow and advance at a pace not seen before, to my knowledge, thanks to its open nature. More companies are able to experiment and try novel architectures than under x86 or ARM. The ISA is what’s gotten me excited again about hardware and learning how it’s made.