It’s about time they ported their Deck performance viewer back to other platforms. It’s still a bit touch and go whether it picks up some things. No GPU readout under Linux, for example, as far as I can tell, at least with an Nvidia GPU.
The DLSS stuff is interesting, but it wasn’t much of a secret before. They took the way they present it from the generally amazing Lossless Scaling and, if anything, I like that you can now compare their solution to DLSS apples-to-apples. I’m a bit confused about their graph display, though. I’m guessing the red line is supposed to be native frames and green is all frames? That’s a bit weird, since the color coding on the text is backwards from that.
As a side note, it’s weird and has always been weird that Steam’s performance monitor has a way better time picking up apps than Nvidia’s on Windows. You’d think owning the drivers would give you the edge, but nope.
Well, no, it’s a concise way to say some objections are logical and sound and some are stemming from a moral panic.
Whether I agree with the objections in each camp is, again, irrelevant.
I disagree with some of the non-moral panic objections, too, and I’m happy to have that conversation.
Four possible types of objections in this scenario, if you want to be “logical” about it:
I think there aren’t any in that last group, but there are certainly at least some objections in the other three.
Neither of those things happened here.
The examples people found include a monitor showing random technical text that someone asked an LLM to write (presumably the writer who goofed is getting paid) and some localized subtitles that were left with a machine localization (the rest of the localization was contracted out).
Even assuming a bunch of other stuff in the game was AI generated and just went undetected, which is likely, if it’s all iterations on what people noticed it definitely doesn’t fit your description.
I hadn’t clicked through to the Reddit thing (for obvious reasons). The example in the article proper is in a Portuguese subtitle, but now that you pointed me at it and I did check the Reddit thread… well, that text is not legible in game unless you really try, so yeah, I hadn’t read it. I’m guessing that’s the only English instance?
As a non-native English speaker, let me tell you, terrible localization was very much a thing that happened well before machine translation, so that by itself (and more subtle typos or one-off errors) was definitely not enough to infer that someone had forgotten to fix a machine-translated line once.
You can definitely tell when something has been machine-translated and not fixed, but the real challenge is lack of context. This leads to nonsensical localization even today, whether it’s human or automated, especially in crowdsourced localizations, which are frequent in open source software. I contribute to some on occasion and maaaan, do I wish well-intentioned people in that space would stop contributing to projects they don’t use/lines they haven’t seen in situ.
That is correct.
It is also correct that someone disagreeing with me can be doing so because of a moral panic. Our agreement is entirely disconnected from whether there is a moral panic at play or not.
For the record, I think “AI” is profoundly problematic in multiple ways.
This is also unrelated to whether there is a moral panic about it. Which there absolutely is.
For the record, the word as a general noun is widely recognized to mean what everybody thinks it means:
Luddite (noun; Ludd·ite, ˈlə-ˌdīt): one of a group of early 19th century English workmen destroying laborsaving machinery as a protest; broadly: one who is opposed to especially technological change
One of the weirder annoyances of the AI moral panic is how often you see this spiral of pedantry about the historical luddites whenever someone brings up the word as a pejorative.
I mean, fair rhetorical play, I suppose, in that it creates a very good incentive to not bring it up at all. If the goal was to avoid being called a luddite as an insult or as shorthand for dismissing AI criticism as outright technophobia I suppose that is mission accomplished, disingenuous as it is.
Just so we’re clear, the first pass of localization of every game you’ve played in the past decade has been machine-generated.
Which is not to say the final product was, people would then go over the whole text database and change it as needed, but it’s been frequent practice for a while for things like subtitles and translations to start from a machine generated first draft, not just in videogames but in media in general. People are turning around 24h localization for TV in some places, it’s pretty nuts.
Machine generated voices are also very standard as placeholders. I’m… kinda surprised nobody has slipped up on that post-AI panic, although I guess historically nobody noticed when you didn’t clean up a machine-translated subtitle, but people got good at ensuring all your VO lines got VOd because you definitely notice those.
As with a lot of the rest of the AI panic, I’m confused about the boundaries here. I mean, Google Translate has used machine learning for a long time, as have most machine translation engines. The robot voices that were used as placeholders up until a few years ago would probably be fine if one slipped up, but newer games often use very natural-sounding placeholders, so if one of those slips I imagine it’d be a bit of drama.
I guess I don’t know what “AI generated” means anymore.
I haven’t bumped into the offending text in the game (yet), but I’m playing it in English, so I guess I wouldn’t have anyway? Neither the article nor the disclosure are very clear.
That said, the game is pretty good, if anybody cares.
The nunchuck was super comfortable. I think the boxy stick idea for the remote itself made sense conceptually but you probably would want to smooth it out a bit otherwise.
But yeah, agreed, it was surprisingly functional, all things considered. And cheaply made in the right ways. Everything I said about joycon would still apply if the thing was wired. Razer, I think, did a wired split controller for PC. I never got to buy one before they were discontinued, and they’re a bit of a collectible these days.
I love them and if PC connectivity with them wasn’t so terrible and laggy they’d be my go-to.
Since they don’t play nice with PC I’d take a literal carbon copy. Love the button d-pad, love the size, love the layout.
I get most people don’t, and that makes sense, but for my particular set of appendages they are pretty much ideal. I’d pay good money for that exact thing just better built and for other platforms.
Instead whenever somebody tries they either make them even smaller for phones or they try to make them as big as a console controller, which doesn’t work at all. For a split controller you need to be able to wrap your hand all the way around the thing and there is no need for a handle-like bump (those are so you can leverage both hands on the controller at once using your palm).
The only other form factor I’d consider for a split controller is… the Wii-mote.
The Wii-mote was cool and did the thing I need it to do. Unfortunately it doesn’t work with a symmetrical dual stick config, so it’s not very practical. The Joycon are more plausible as a modern controller replacement.
In the meantime I just use the least offensive PS3-layout controllers I can find for 3D games (with lots of braces and frequent pauses) and a leverless controller for 2D games.
Ergonomics. It gives my broken old wrists the option of not having to hold themselves at whatever arbitrary angle the probably younger and certainly healthier designer decided wasn’t causing immediate pain to them.
So yeah, having specific accessibility needs can suck and for my specific brand of those the joycon are an amazing option.
Also, playing a game with your hands by your sides or under a blanket or whatever adds a TON to the sense of immersion.
If the split controller works, screw waiting for the Steam Controller; I’ve been waiting for a good PC joycon set.
This has some of the same issues as the Legion Go, in that it’s too chunky to play split. Also the superfluous touchpads that make it too large (I’ll be taking no questions on this issue at this point).
But I do really want something like this for PC commercially.
I think maybe I’m spoiled by the movies, but… I kind of hate it? I hate all the ways they had to cherry pick Dune stuff to turn it into a survival crafting MMO like Conan, especially in the parts where the lore fits worse than Conan. And the story is extremely videogamey. I think the new films are already a bit overly literal when it comes to choosing between the politics and the psychedelia, but man, does Dune Awakening do videogame-ass videogame dream sequences.
The disconnected, patchy reality of the original Cryo Dune got to the right feel accidentally, but there’s something to seeing the setting reduced to a skin over Conan Exiles that seriously rubs me the wrong way.
I guess “every new game” is more accurate. I don’t know if they are in much of a hurry to go back to the old catalogue. Also, pretty sure by now that there’s a bunch of contract blockers in the FromSoft deal preventing the ports. That’s not to say they won’t eventually sort them out, but that’s clearly not a Sony-only thing. For the same reason I wonder if they can get Astro Bot out of the PS5, given all the third party IP thrown all around that game. We’ll see, I guess.
I think it’s telling that you’re still thinking back to ME2 when this comes up. It’s such a stale debate, but people who got into PC gaming in the aughts seem to be a bit stuck in a talking point that never made sense in the first place. It’s even weirder these days, given how much everybody is struggling with accessing high end GPUs and feeding absolutely insane high refresh/high res monitors with the stuff that’s available and with maximum settings going all the way to real time path-tracing. Not only are consoles not holding back the high end of PC, the high end of PC is apparently not holding back the high end of PC, and it kinda sucks.
Every game is Crysis now and nobody will praise me when I go “I told you so”. It kinda sucks.
They are putting everything on PC and they claim they will keep doing that, so… ideal outcome it is, I suppose.
I do think that’s better news. PC master race bros typically say consoles are holding PC gaming back, but this is the opposite of reality. PC gaming has benefitted a lot from having a set target hardware spec inherited from consoles. From controller standardization to performance optimizations, PC gaming would be much worse off without a console fixed target.
In unexpected ways, too. If you remember the bad old days of PC exclusive games they either targeted unattainable hardware as a tech demo or they aimed at the garbage tier lowest common denominator, which is how you ended up with games looking like World of Warcraft and The Sims for decades.
I love PC handhelds, but I certainly would hate for every PC release to be built primarily for those and laptops with mediocre iGPUs.
No, it is not!
Helldivers is fun enough, and I agree with you that the base game content is solid enough to sustain the experience.
That doesn’t make it any more valuable or engaging to spend money on more cape textures through a battlepass grind.
I would much rather pay for actual content than hope that whales and subscriptions subsidize it. Granted, I also see next to no appeal in grinding Helldivers’ missions and volatile metagame progression, so the entire design is not for me.
But for as long as you can make increasingly cheaper content to keep extracting ten bucks a month from people you will get companies trying to extract a hundred. You’re… you know, ruining it for the rest of us, please stop.
I would much rather pay 90 bucks for Donkey Kong than 45 for Helldivers 2 on account of a subset of whales subsidizing the rest of the package.
My one exception is fighting games, where I find paying for more characters down the line is flexible enough and has enough connection between meaningful content and investment that it supports a very long additional content tail. But pure cosmetics in a battlepass? Yeah, no, I’d rather not.
Man, I’m always surprised by the crap ragebait peddlers latch on to with these boring-ass investor presentations.
And I always feel the need to correct the record, which only pisses me off further.
So, for anybody interested, this is an investor scripted thing, they mostly are deflecting questions from investors that they don’t have answers to. At one point they say the Switch 2 won’t eat into their business because they have a different controller. It’s all filler nonsense.
The quote is somewhat out of context, in that they say there was an overly competitive market, but also that Concord didn’t stand out enough to compete. As much of a non-statement as that is, it’s not wrong.
Surprisingly, the ragemongers gloss over much more worrying stuff in there, like the confirmation that despite increasing subscription prices they are seeing more people buy into the expensive tier, not less (and you’re all ruining it for the rest of us, please stop). And they imply they will keep increasing prices, too.
They also point out that more than 50 percent of Helldivers’ revenue now comes from microtransactions. Again, you’re all ruining it for the rest of us, please stop. They also confirm they will continue to milk that and “maximize revenue”.
On better news, they pretty much confirm they are making a PS6 when somebody suggests they should go PC and cloud only, so there’s that. They also confirm they want to keep making at least one big single player game per year and that they are actively looking into new IP.
If you read between the lines of the investor presentation, they also kind of acknowledge that Marathon got bad feedback from playtesting and they’re trying to salvage it. Although, of course, they never say that outright.
This article sucks, and it made me listen to half an hour of investor executive nonsense, so whoever linked it is not my friend, either. On this, too, you’re ruining it for the rest of us. Please stop.
Yep. Definitely falls into this category. The roguelite stuff is a fun quirk, and I do enjoy unraveling the steps metagame more than I enjoy the “find a clue in a piece of paper and remember it for the next run” or the “doesn’t look like a puzzle but it is” bits.
Guys, I’ve been around a while. You’re probably not gonna recommend the game I accidentally missed that changes my mind.
Oooh, Outer Wilds. Did a couple of puzzles, I think I got around the loop once or twice, bounced right off.
I swear, I don’t know what it is. The sense of wonder just isn’t there. Maybe I’m too aware that all the pieces are put in by the designers and that withholding some pieces doesn’t inherently make the puzzle more interesting or even harder. I guess I find myself tapping my foot playing first person Lunar Lander, doing rolling ball puzzles or whatnot, while I wait for the thing to get around to the real game.
Hah. Wasn’t into the “multimedia” era as much, either.
But still, I’d say context is important in that distinction. Old point and click was a AAA genre, through and through. Big, cinematic visuals and storytelling were at the core of that.
I’m not saying that’s better or that I like it more. In fact, I’d say I’m less into that kind of thing these days. But it was a different moment in time to get hold of one of those compared to an indie release overcomplicating the self-revealing world concept from Myst.
Why I haven’t been into that idea since all the way back in Myst is harder to parse for me. Maybe I’m just less metatextually enamoured with the idea of self-revealing games as a flourish than I am about having the reveal be a fully functional narrative? As I said above I adore Obra Dinn. There’s a lot of the same connective tissue there, but maybe I’m just more in touch with it when it’s a medium for a good, old-timey gothic horror story than when it’s this abstract world-in-code thing.
No, I don’t think so. I love puzzles. Hard puzzles, even. I really, really like Return of the Obra Dinn, I spent the 90s fawning over point and click adventures. I have zero problems blasting through the Portal games and a bunch of their derivatives.
For some reason it’s specifically this setup of “figure out the rules of the world and peel off the layers of the game” thing that misses me. I don’t know what to tell you there.
I wanted to like it, couldn’t really get into it.
I see what it’s going for, it’s just… not my thing. It never clicked with me moment to moment and the self-congratulatory aren’t-we-smart information discovery stuff just doesn’t work for me in most cases (this applies to Fez and The Witness, too).
I’m not mad that people do like it, though. There’s nothing in there I find… objectionable, or poorly designed. I just didn’t get into it and that’s alright.
Well, the huge brand helps.
Which is probably why what was even at launch ultimately a somewhat outdated Farmville-like got so much attention.
Downloads have always been a weird metric for mobile games. I’ve downloaded this game on maybe five or six devices during this decade, but I’m pretty sure I haven’t played it at all in the past nine years and six months.
That’s the problem with surveys, isn’t it? What’s “latency being eliminated”? In principle it’d mean your streamed game responds as quickly as a local game, which is entirely achievable if your target is running a 30fps client on a handheld device versus streaming 60fps gameplay from a much more powerful server. We can do that now.
But is that “latency free” if you’re comparing it to running something at 240Hz on your gaming PC? With or without frame generation and upscaling? 120Hz raw? 60Hz on console?
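To put rough numbers on the 30fps-local versus 60fps-streamed comparison: a back-of-envelope sketch, where `pipeline_latency_ms`, the three-frames-in-flight figure, and the 30 ms network round trip are all assumed values for illustration, not measurements.

```python
# Back-of-envelope end-to-end latency estimate. Assumptions (not
# measurements): a game pipelines roughly 3 frames of delay between
# input and photons (input sampling, simulation, render, scanout),
# and streaming adds one network round trip on top.

def pipeline_latency_ms(fps, frames_in_flight=3, network_rtt_ms=0.0):
    """Estimate latency as frames of pipeline delay at the given
    framerate, plus network round trip (zero for local play)."""
    frame_ms = 1000.0 / fps
    return frames_in_flight * frame_ms + network_rtt_ms

local_handheld = pipeline_latency_ms(30)                       # ~100 ms
streamed = pipeline_latency_ms(60, network_rtt_ms=30)          # ~80 ms
local_high_end = pipeline_latency_ms(240)                      # ~12.5 ms

print(f"local 30fps handheld: {local_handheld:.1f} ms")
print(f"streamed 60fps:       {streamed:.1f} ms")
print(f"local 240Hz PC:       {local_high_end:.1f} ms")
```

Under those assumptions the streamed 60fps session really does respond faster than the local 30fps one, while still being nowhere near a local high-refresh setup, which is exactly the ambiguity the survey question glosses over.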
The question isn’t whether you can get latency free, the question is at what point in that chain does the average survey-answering gamer start believing the hype about “latency free streaming”?
Which is irrelevant to me, because the real problem with cloud gaming has zero to do with latency.
I mean, as written the headline statement is always true.
I am horrified by some of the other takeaways, though:
Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally.
57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills.
Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today.
Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
I’m more than ok with that. At some point the entire point of GoG is necessarily the curation, and this is a good complement to their maintenance work, which before ended up existing alongside community fixes instead of incorporating them.
They’ve also added some features to Galaxy and improved the checkout experience. It’s encouraging to see them roll out new platform features. For a while there it seemed like it was somewhat abandoned and just coasting, which was concerning in a world of effective Steam monopoly on PC gaming.
Lots of stretches there. Voice talent is more replaceable than on-camera talent, but not “completely expendable”.
I’m calling that a good thing. Having a working relationship with an actor is a thing, and I fully support good working conditions for VAs, but not being held to actors being the go/no-go talent for a whole project at all times is a noticeable improvement over the film industry, as far as I’m concerned.
Plus it’s such an anglocentric world, where English-speaking actors are the only ones that get work and a leg to stand on when it comes to negotiation. That entire business is VERY weird in gaming and it has a lot more nuance than a lot of people, particularly in the English-speaking world, will give it credit for.
This is true, but she doesn’t sound particularly angry and I wonder what type of relationship you have with them and how you communicate. Nintendo is weird, because a bunch of these legacy actors have been doing just tiny sound bites for these games, it’s not a fully voiced thing, so it’s hard to guess what sort of ongoing relationship they have with Nintendo in the first place.
They do mix and match sometimes, but I can’t imagine she would have been waiting for a call up until the day the game launches. It may be some awkward wording or some NDA/negotiation stuff that just resolved. I don’t know, and I’m not gonna jump to conclusions. Somebody may want to ask her instead of just reporting on a tweet, though.
Huh.
I mean, she certainly knew before then, given the game has been in development for ages. Pretty sure three days before launch she may have figured. That just makes it weirder that she pointedly mentioned Nintendo told her “yesterday”. I guess maybe she meant they told her it’s not a one-off thing and they’ll go with someone else moving forward?
It sure seems the movie(s) are the new North Star for this. I noticed MK World’s Donkey Kong also sounds noticeably more… Rogen-esque.
Everything has to cater to the shittiest hardware.
Up until very recently it wasn’t uncommon for developers to keep around the most craptastic old CRT TV they could get hold of just to check whether their UI would be unreadable for everybody. Ditto for TV stations.
And even after HDMI became the standard and consoles stopped supporting SD it’s not uncommon to have a TV set to the most garbage-ass factory store settings just to check that no matter how stupidly the user has set their TV they can still read text and see all the colors more or less distinctly.
You don’t need to care about it, but devs typically do.
The team at Sandfall certainly does, given they did add accessibility options to automate all the offensive QTEs and a bunch of in-game items to make builds based on taking damage instead of parrying.
Because, you know, not everything has to cater to you, but it sure is cool when devs think about these things for a second.
Still, despite all that, it’s a terrible time to make timing-based minigames like that, since you have no control over latency at any point in the user’s hardware chain. You simply can’t know what type of latency you’re catering to. You don’t even know the target framerate, which can range from 30-ish to 400Hz. It’s absolutely atrocious to do timing-based gameplay in modern gaming.
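A minimal sketch of why that’s so painful: a fixed “perfect parry” window shifts relative to the player’s perceived beat by however much display and input latency their chain adds, which the game can’t see. The usual mitigation is a user-calibrated offset, like rhythm games ship. The `hit_judgement` helper, the 100 ms window, and the 60 ms chain latency below are all made-up illustration values, not anything from a real game.

```python
# Hypothetical timing-window judge. The game compares the press time
# against the target time, after removing a latency offset the player
# calibrated manually (since the game cannot measure the TV/controller
# chain itself). Accepts presses within +/- half the window.

def hit_judgement(press_time_ms, target_time_ms,
                  window_ms=100.0, calibration_offset_ms=0.0):
    """True if the press lands inside the timing window once the
    user-calibrated latency offset is subtracted."""
    delta = (press_time_ms - calibration_offset_ms) - target_time_ms
    return abs(delta) <= window_ms / 2

# A press arriving 60 ms "late" misses a 100 ms window...
assert not hit_judgement(1060, 1000)
# ...but counts once the player calibrates away their ~60 ms of
# display + controller latency.
assert hit_judgement(1060, 1000, calibration_offset_ms=60)
```

The accessibility options mentioned above (auto-completing the QTEs, builds that don’t rely on parrying) sidestep the problem entirely, which is arguably the more robust answer when the hardware chain is unknowable.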
They already had a FPS counter on Windows, but they’ve expanded that with CPU/GPU/RAM usage, a frametime graph and that separate FPS/DLSS frame counter. No battery stats, surprisingly, even on handhelds.
I don’t know what they’re wrapping on Windows, but they definitely have decent access, and yeah, the Nvidia overlay sometimes loses the FPS counter where Steam keeps it on Windows. Don’t ask me how that works.