



Cronus can be detected on consoles, just not super easily. And it kinda depends on each game’s developer and their ability to implement such detections. I know that Embark Studios have said that they’ve found ways to detect such devices in The Finals.
I believe that, while they can’t detect the actual hardware plugged into the console, they’re able to detect input patterns that would only be possible from M/K (such as 0ms AD-spamming). Of course, I can’t imagine that’s 100% foolproof on its own, either.
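To be clear about the kind of heuristic I mean, here’s a toy sketch (every threshold and name here is made up by me, not anything Embark has published): flag input streams where opposite-direction presses land impossibly close together, too many times.

```python
# Assumed (invented) thresholds -- real detection would be far more nuanced.
HUMAN_MIN_GAP_MS = 30   # plausible floor for a real left/right direction swap
MAX_VIOLATIONS = 5      # tolerance before the stream looks scripted

def looks_scripted(events):
    """events: list of (timestamp_ms, key) tuples, e.g. (1003, 'A')."""
    violations = 0
    prev_time, prev_key = None, None
    for time_ms, key in events:
        # Only check A<->D swaps, the classic strafe-macro signature.
        if prev_key is not None and {prev_key, key} == {"A", "D"}:
            if time_ms - prev_time < HUMAN_MIN_GAP_MS:
                violations += 1
        prev_time, prev_key = time_ms, key
    return violations > MAX_VIOLATIONS
```

A macro firing A/D swaps every 5 ms trips this instantly, while a human alternating every ~100 ms never would — which is also why, as I said, it can’t be 100% foolproof on its own.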


If this is something you want to try for yourself, either buy a second PS5 and use a burner account on it, or be prepared for the possibility of losing your entire PSN account. This goes for pretty much any internet-enabled console modding.
Nintendo deactivated a 10+ year old account of mine when I tried modding a Wii a while back. It wasn’t a huge deal at the time, because I still had physical copies of most of my games at that point. But these days, my library is almost entirely digital, so I keep separate fuck-around accounts so that I don’t find out with an account I’ve spent money on.


How does one even accidentally steal a texture someone else made?
One could easily apply Hanlon’s Razor to this. For example:
I was a pretty hardcore Destiny 2 player for several years. In that time, I saw Bungie fuck up a lot of things. But those fuck-ups were almost entirely caused by somebody in the studio not paying close enough attention to something, and details getting mixed up in the pipeline. I don’t think anybody at Bungie knowingly put Antireal’s art into the game. I think the more likely explanation is that there was a lack of oversight, and files that shouldn’t have been mixed together got mixed together.
It wouldn’t even be the first time something like this happened to Bungie; there was an instance where a third-party studio that Bungie contracted to build a Destiny 2 cut-scene accidentally included artwork that was never meant to appear in it.
Not to suggest that any of this excuses Bungie for multiple cases of plagiarism. Obviously, they need to have stricter standards in place when transferring files between parties. It’s a colossal fuck-up, but I don’t think that it was a fuck-up anybody set out to commit.


If you knew about any of the four reporters that founded 404 Media, or the incredibly high quality journalism they produce, this thought wouldn’t have even crossed your mind.
If one could ever actually read anything on 404 Media in the first place, maybe one would know that. But as somebody who doesn’t have the preexisting knowledge of who founded the org or what sort of reporting they do, it’s hard to glean anything useful when the articles are cut off by a paywall.


This has been my concern, as well. Games rarely ship as a finished product anymore. All the disc is really good for is acting as a token of ownership. But even that is limited, as the disc often doesn’t have enough space to hold a fully playable version of the game, and you still need to download the rest from official servers.


That’s honestly surprising to see, considering that the console launched with like… five games, none of which were even a mainline Nintendo title.
It’s also surprising because I just haven’t seen any hype for it. With the Switch 1, everybody I knew either got one right away, or as soon as they were restocked. I still don’t know anybody who bought a Switch 2, or is even excited for it.


I believe OP is referring to input latency, which isn’t so much the system slowing down under increased load as it is the system running in a consistently slowed-down state, causing a delay before your inputs are reflected on-screen. There are several reasons this is happening more often lately.
Part of it has to do with the displays we use nowadays. In the past, most players played on a CRT TV or monitor, which have famously fast response times (the time between receiving the video signal and drawing it on the screen is nearly zero). But modern displays, while producing a much crisper picture, often tend to be slower at actually firing pixels on the screen, causing that delay between pressing Jump and seeing your character begin jumping.
Some games also strain their systems so hard that, after various layers of post-processing effects get applied to every rendered frame, the displayed frames are already “old” before they’re even sent down the HDMI cable, resulting in a laggier feel for the player. You’ll see this difference in action with games that have a toggle for a “performance/quality” mode in the graphics settings. Usually this setting will enable/disable certain visual effects, reducing the load on the system and allowing your inputs to be registered faster.
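A rough back-of-the-napkin model of how those stages stack up (every number below is illustrative, not a measurement of any real game or display): latency accumulates at each step between your button press and the photons leaving the screen.

```python
def input_latency_ms(fps, queued_frames, post_fx_ms, display_ms):
    """Sum the (simplified) latency stages from input to photons."""
    frame_time = 1000.0 / fps            # time to render a single frame
    render = frame_time * queued_frames  # frames buffered before being shown
    return render + post_fx_ms + display_ms

# "Quality" mode: 30 fps, deep frame queue, heavy post-processing
quality = input_latency_ms(fps=30, queued_frames=3, post_fx_ms=10, display_ms=20)
# "Performance" mode: 60 fps, shallower queue, effects off, same panel
performance = input_latency_ms(fps=60, queued_frames=2, post_fx_ms=0, display_ms=20)
```

With these made-up numbers, quality mode lands around 130 ms of lag versus roughly 53 ms for performance mode — which is why flipping that one toggle can make a game feel dramatically more responsive even when the picture barely changes.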


It’s definitely not going to live up to the hype. We already know what Hollow Knight is like, and we’ve seen a demo of what Silksong will be like from last year’s E3, and… it’s really not that much different. Not that that’s an inherently bad thing, since Hollow Knight was already really good, so any improvement on that is only going to be better.
I worry that it’ll suffer a similar fate to Duke Nukem Forever. In a vacuum, DNF isn’t necessarily a bad game, but it suffered from being overhyped for years. So when it came out and just turned out to be “okay”, that was the final nail in the coffin for the Duke Nukem franchise. I hope I’m wrong, though.


The problem with this detection method is that you occasionally run into honest players catching bans for being legitimately too good at the game. While rare, there are some players who are accurate enough with their tracking that even professional players would assume they’re cheating, and end up getting banned because the developers decided nobody should ever be that good at the game.
This ends up putting an artificial skill ceiling on a game, which is unhealthy for a competitive game.
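A toy version of the kind of statistical flag I’m describing (the cutoff and numbers are invented for illustration): ban anyone whose accuracy sits too many standard deviations above the population mean — which is exactly how a genuinely elite player gets caught in the same net as a cheater.

```python
from statistics import mean, stdev

def flagged(accuracies, player_acc, z_cutoff=4.0):
    """Flag a player whose accuracy is an extreme statistical outlier."""
    mu, sigma = mean(accuracies), stdev(accuracies)
    return (player_acc - mu) / sigma > z_cutoff

# Hypothetical population of ordinary players' hit accuracies.
population = [0.20, 0.25, 0.30, 0.22, 0.28, 0.24, 0.26, 0.23]
```

A legitimate top player hitting 55% accuracy trips this wire just as surely as an aimbot would, while someone at 30% sails through — the model has no way to tell talent from software.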


Considering one of the common refrains about the most famous game in the series, Silent Hill 2, is that the combat being crap is an important part of making you feel like a regular guy way out of your depth, I’d say they have a right to be concerned. There’s a serious incongruity between “horror game” and “detailed combat system”.
I agree, and this was precisely the issue I had with some of the more recent western-developed SH games, like Homecoming. They give the player too much agency, for a franchise that was built on making the most of limitations (both technical and strategic). I have similar complaints with recent Resident Evil games for the same reasons.
That said, SH:F already seems to be a pretty major departure from the franchise, so maybe they’re trying to gauge reactions to possible avenues for spinning-off the series to other genres with its own “rules” and design.


It depends on the game, and my familiarity with it. If it’s a linear, story-based game where the player doesn’t really influence the end result at all, then watching it is just as good as playing it myself, in my opinion. Or if it’s a new addition to a franchise that I’m already experienced in, like a new Super Mario game, then watching it is generally just as fine of an experience as playing it.
But if it’s a game that’s based entirely around the experience of playing it, like most multiplayer shooters for example, then watching somebody else play may be entertaining, but doesn’t substitute actually playing it myself.


It’s not all plagiarism, though. For instance, Embark Studios uses AI to create in-game voice lines for characters in their games. They made their own models with actors hired specifically to train them.