why would you take anything you see on the internet seriously?
I think it’s prudent to be on an older node, using stock that’s more abundant, especially if it still performs its duties well enough. You’re 100% right on the cost side of things, especially considering that Nintendo has never had a console that was crazy expensive. Everything was always supposed to be family friendly and therefore family attainable.
I still think battery life is a higher concern for them than sheer power when in handheld mode though, and that’s a key differentiating factor between a Deck and a Switch, besides the Nintendo first-party library and chip architecture. It’s really cool that the Deck is flexible enough to do both high performance and low performance tasks with toggles for power draw.
Performance was never the main consideration for Nintendo. They want a handheld that can last a long time, so they will always clock their chips down. You can’t compare a machine that pulls 30 watts all the time to one that only hits that plugged in, let alone to 5 watts in handheld mode.
Steam Decks are great, but let’s be real: when you play a big AAA title, even on moderate settings, you might get two hours out of the machine pushing it to its limit at full TDP.
This is kind of a nothingburger story. We always knew Nintendo were not going to scale their machines up to the level of PC gaming handhelds.
This article takes a bunch of time to really just say nothing. Like of course if you ask non-techies what brands they can think of they’re going to rattle off Disney and Starbucks before Nvidia.
The average person doesn’t know Nvidia, the same way they don’t know other hardware manufacturers down to the component level, especially for datacenter scenarios. They’re not going to be able to list the things that are driving forces in the tech industry, because it’s nothing they interact with day-to-day. The average person doesn’t run the LLM themselves. PC gamer knowledge is also pretty peripheral.
As the most valuable company during a gold rush, does it really matter how much the average person knows your name if all the people with the money already do? It’s not like my mother and my sister are going to be buying graphics cards to run LLMs any time soon. There’s clearly a clientele for this, and it’s not the average person. The fact that they produce consumer equipment is literally not at all what’s giving them their new valuation.
If you have a Gen 1 Switch lying around, I would advise you to sell it to somebody who wants to make use of the ability to put CFW on it. They are actually still pretty valuable, given that all of the models beyond the first generation are guarded against the exploit that even makes this possible.
If you can only install softmods, you will not be able to crack the Switch to install Android on it. If you read the article, it goes into detail about how only Gen 1 Switches can actually achieve this, because they are not guarded against the Fusée Gelée exploit for Nvidia Tegra processors. It goes on to point out that Lites and OLEDs need custom soldering done for this to even work.
I have a Lite so I couldn’t even if I wanted to.
If you have an original Gen 1 Switch capable of even doing this (i.e., not guarded against the Fusée Gelée exploit at the hardware level like all subsequent models), you will probably find a better return selling it to somebody who wants to put CFW on it rather than turning it into a hacky Android tablet.
I just don’t get what the purpose is, though. You’ve lost access to the proprietary first-party library, which was the original reason to buy a Switch. If you want an emulation console, there are cheaper alternatives than the Deck as well; I was just using it as the de facto standard handheld.
There’s no benefit to nuking the OS and replacing it on a Switch. At least with something like a ROG Ally, you can make the argument that flipping over to Linux would make the handheld more performant and energy efficient. That cannot be said for flashing Lineage onto a Switch, which functionally makes the system considerably less useful.
The Switch OS is already optimized and designed for the hardware; it’s about as lean as you’re going to get (it’s Nintendo’s own microkernel OS, not Linux, but it’s tuned to the silicon). I would much rather suggest cracking it to put custom firmware based on the Switch OS on the device; you would get more use out of it because it could still play the games and be rigged to emulate the older ones.
It’s cool Lineage did this or whatever but it’s kind of a pointless and weird flex.
I tried Citadelum, which is a Roman-era city builder.
It’s a bit janky given that it’s an early demo, but it’s a neat premise given that the last Roman city builder I was aware of was Caesar 3 from '98.
I give it points for concept and setting, but I think Anno 117 is going to be my preferred Roman-era city builder when that drops, because I already know and love the Anno mechanics.
Yeah, the Windows handhelds are basically glorified laptops. This was kind of the approach with the ROG Ally anyway with the XG Mobile (XGM) port, allowing connection to an eGPU enclosure with up to a 4090 inside. It just runs a full-blown version of Windows, and you can even put a Pro license on it and do dumb shit like have WSL or Hyper-V available on the device.
I have a ROG Ally and I’ve debloated it to hell, but it’ll never match the power savings I would achieve if it were Linux-based.
I’ve been following Chimera and Bazzite as they work on distros for the Windows handhelds, but it’s going to be a while before they’re fully viable on any of them.
Valve will always be ahead because they control both the hardware and the software, and they can fine-tune the software to their very specific hardware, which is simply not happening for the Windows handhelds.
I agree.
Even using my examples of KOTOR and ME, comparing them to their (relatively) modern counterparts, Jedi Survivor and Andromeda, you can see that the storytelling has taken a back seat to the open world. ME 1-3 were all very tight corridor cover shooters, going from one fully constructed combat environment to another, while Andromeda tried to shoehorn in survival crafting and exploration. KOTOR has deeper RPG mechanics and overall a better story than Jedi Survivor, and I would agree it’s because the focus shifted to providing sprawling open worlds over more bespoke environments. I would also say that the combat in Andromeda and Jedi Survivor is superior to their older counterparts, but at the loss of other things.
Most of them, honestly.
When you look back, it was cool what they were doing at the time, but progress is such that newer games have iterated on those groundbreaking formulas and improved upon them, making the older games seem less spectacular than they were at launch. I have fond memories of playing PS2, N64 and Dreamcast, but when I go back to play some of the games I enjoyed as a kid, I find that there’s always something super sub-optimal, like the controls or some arcane mechanic that doesn’t make much sense. I find this to be the consistent issue going back to the PS2 era and earlier.
I think the PS3/360 era is the one I have the most nostalgia for all things considered. There were a lot of stellar RPGs like KOTOR and Mass Effect that generation. Stuff like Red Dead Redemption was coming out. Control schemes finally became generally standardized and understandable. Tutorials, saves and decent graphics were really finally all combined properly for the first time.
I find the same sort of issue with movies. When you go back past the 80s, you start hitting pacing issues. Same with video games: when you go back past the mid-2000s, you’re going to run into early-installment weirdness.
I am pretty sure the entire library is not available to stream, but you can install all of it on Windows. Bazzite is also not fully compatible with the hardware yet.
There’s a list of problems and workarounds, but it’s listed with a gold rating, which is better than a lot of other handhelds. I would not be able to stand the LEDs being at full brightness all the time, it would infuriate me.
There are a few projects like Bazzite and Chimera which have versions for desktops and other handhelds, but they’re not really mature yet on handhelds other than the Steam Deck.
Since they’re all running AMD hardware, what I anticipate is that eventually the Linux variants designed for them will be significantly better optimized for the devices, much more so than the bullshit full Windows install. You do lose out on Game Pass games, but that’s pretty much the only thing you gain by going Windows in the first place.
I compromised with my ROG Ally and ran a bunch of debloating scripts so I could keep my Game Pass games available. I would rather be running Bazzite on it, which runs excellently on my laptop. It requires some funky BIOS fuckery to achieve basic functionality right now though, so it’s definitely a project to undertake later.
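For reference, the kind of step those debloat scripts do looks roughly like this. This is just my own sketch, not any particular script; the package names are only examples of typical preinstalled bloat, and it leans on PowerShell’s Get-AppxPackage/Remove-AppxPackage cmdlets under the hood.

```python
# Rough sketch of one debloat step: removing preinstalled Appx packages via
# PowerShell from Python. The package names below are just examples; check
# what's actually installed on your device before removing anything.
import subprocess

EXAMPLE_PACKAGES = [
    "Microsoft.BingNews",
    "Microsoft.GetHelp",
    "Microsoft.MicrosoftOfficeHub",
]

def remove_appx(name: str) -> None:
    """Remove an Appx package for the current user via PowerShell."""
    cmd = f"Get-AppxPackage -Name {name} | Remove-AppxPackage"
    subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=False)

if __name__ == "__main__":
    for pkg in EXAMPLE_PACKAGES:
        print(f"attempting removal of {pkg}")
        remove_appx(pkg)
```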
I think you are misunderstanding the article.
Windows on ARM is designed specifically for ARM, and it has a translation layer. The translation layer effectively lets it function as if it were running an x86 Windows install by offering the ability to run x86 applications on the ARM hardware. It’s not actually running an x86 OS.
The chipset is very powerful, but it doesn’t require additional hardware to achieve this translation. The additional processing power built into these chips comes from NPUs (Neural Processing Units), which are designed to run ML/AI/LLM workloads more efficiently. The translation layer just runs on the normal raw processing power of the machine, the same as on the M-series Macs.
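If you want to see the translation layer for yourself rather than take the article’s word for it, here’s a little ctypes sketch of mine (nothing official): IsWow64Process2 reports both the architecture the current process presents and the machine’s native architecture, so on a Windows-on-ARM device the native value comes back as ARM64 even when the process itself is an x86 binary going through the emulator.

```python
# Windows-only sketch: ask the OS for the current process's machine type and
# the machine's native architecture via IsWow64Process2 (kernel32, Win10 1709+).
import ctypes
from ctypes import wintypes

# Common IMAGE_FILE_MACHINE values, for reference
MACHINE_UNKNOWN = 0x0000  # process isn't running under WOW64
MACHINE_I386 = 0x014C     # 32-bit x86 (e.g. an x86 app under the translation layer)
MACHINE_AMD64 = 0x8664    # x86-64
MACHINE_ARM64 = 0xAA64    # native Windows on ARM

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.GetCurrentProcess.restype = wintypes.HANDLE
kernel32.IsWow64Process2.argtypes = [
    wintypes.HANDLE,
    ctypes.POINTER(wintypes.USHORT),
    ctypes.POINTER(wintypes.USHORT),
]
kernel32.IsWow64Process2.restype = wintypes.BOOL

def machine_info() -> tuple[int, int]:
    """Return (process_machine, native_machine) for the current process."""
    process_machine = wintypes.USHORT()
    native_machine = wintypes.USHORT()
    if not kernel32.IsWow64Process2(
        kernel32.GetCurrentProcess(),
        ctypes.byref(process_machine),
        ctypes.byref(native_machine),
    ):
        raise ctypes.WinError(ctypes.get_last_error())
    return process_machine.value, native_machine.value

if __name__ == "__main__":
    proc, native = machine_info()
    print(f"process machine: {proc:#06x}")   # 0x014c for an emulated x86 build
    print(f"native machine:  {native:#06x}") # 0xaa64 on a Windows-on-ARM device
```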
On paper the Intel processor is much better than the Z1 Extreme chip, but lack of optimization remains the main bottleneck.
I knew Intel would incrementally improve support for the device, because we saw how they handled the Arc cards. They were not great at first, but then a driver update was pushed that nearly doubled performance, bringing the cards on par with low- and mid-range cards from the other manufacturers.
Hopefully Intel’s support improves further and the field becomes more competitive.
I had a 1050ti in the machine and I bought an A770. It’s overpowered for transcoding but I do remotely stream games at 1080p, which is a good workout for the card.
For simple transcoding I would buy the A310, since it’s the cheapest card with AV1. I’m running an old 6th-gen i7-6600k and I had to mess with the UEFI to allow ReBAR, but I used this tool to do it.
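To give an idea of what that looks like in practice, here’s a rough sketch of an AV1 hardware transcode on an Arc card through ffmpeg’s QSV encoder (av1_qsv). It assumes an ffmpeg build with Quick Sync support and that the Arc is the active render device; the filenames and bitrate are placeholders, not a tuned Plex/Jellyfin profile.

```python
# Rough sketch: transcode a file to AV1 on an Intel Arc card using ffmpeg's
# av1_qsv hardware encoder. Assumes an ffmpeg build with QSV support; the
# paths and bitrate are placeholders.
import subprocess

def transcode_av1(src: str, dst: str, bitrate: str = "4M") -> None:
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",      # hardware decode too, where the input codec allows
        "-i", src,
        "-c:v", "av1_qsv",      # Arc's AV1 hardware encoder
        "-b:v", bitrate,
        "-c:a", "copy",         # leave the audio stream untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_av1("input.mkv", "output_av1.mkv")
```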
The new dedicated cards are actually very good. They sell them at a competitive price because they are not powerhouses, but they get the job done. If you’re targeting 1080p at your top end, it’s almost a no-brainer to go with an Arc card. If you’re pushing a higher resolution, it’s probably better to go with another manufacturer, unless you’re fine with higher resolutions and lower framerates.
I agree with the Arc cards.
They are good, they are cheap, and they’re targeting the midrange to low-end hardware segment which is not covered by any other manufacturer.
I have a 3090 in my desktop but I have an Arc card on my server for Moonlight/Sunshine streaming, as well as Plex transcoding. It’s the cheapest card to have AV1 encoding built in.
I also keep seeing them increase performance significantly with every driver update, which is pretty cool.
This sounds like a problem exclusive to the United States. In Canada all of our carriers still provide RCS. Rogers was one of the first major telecoms to implement RCS country-wide for Android prior to the major rollout elsewhere.
Additionally, RCS is an open standard that can be adopted and implemented by any carrier. Google only runs their RCS back-end when carriers are unwilling or unable to do so, like in other regions worldwide. RCS is interoperable, and even if Google runs much of it, it’s still an open standard. Apple was the one not allowing interoperability here and causing the centralization.
There is a hack to get ReBAR on those older machines. I have a 6700k as well and it’s running a 1050ti, but this utility I heard about allows any modern system running UEFI to have ReBAR.
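And if you want to confirm the hack actually took, the BAR sizes show up in lspci on Linux: a tiny 256 MB region on the GPU usually means ReBAR is off, while a region close to the card’s full VRAM means it’s on. Quick sketch of that check (my own, not part of that utility):

```python
# Quick check of GPU memory region (BAR) sizes from `lspci -vv` output on Linux.
# A small 256M region typically means Resizable BAR is off; a region near the
# card's full VRAM size means it's on. May need root for complete output.
import re
import subprocess

def gpu_bar_sizes() -> list[str]:
    out = subprocess.run(
        ["lspci", "-vv"], capture_output=True, text=True, check=True
    ).stdout
    sizes: list[str] = []
    for block in out.split("\n\n"):
        if "VGA compatible controller" in block or "3D controller" in block:
            # Lines look like: "Region 0: Memory at ... [size=8G]"
            sizes.extend(re.findall(r"\[size=(\d+[KMGT]?)\]", block))
    return sizes

if __name__ == "__main__":
    print("GPU memory region sizes:", gpu_bar_sizes())
```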
If the first gen’s prices go lower than they already are, I’ll buy the hell out of an A770 for video transcoding in AV1.
Intel cards don’t have the best gaming performance, but they’re getting better in a lot of respects, and they are legitimately the only manufacturer targeting the midrange and low-budget segment. I hope the second round of cards is significantly better while staying within the same price range. I also hope the standalone cards help them tune the graphics in the Core series better. The MSI Claw is a fucking travesty.
I didn’t say best, I said most interesting. PlayStation has the best exclusives.
Nintendo games are fun and kooky, but Sony games are big and majestic. Think Animal Crossing versus Spider-Man. PS exclusives have been landing on PC lately, and I’m patient enough to wait for them. That will never be the case with Nintendo games.
The only reason I still want consoles to be developed is that a lot of cool features were designed and pioneered on consoles. Stuff like DirectStorage was implemented on the XSX before it was on PC, and that’s an example of consoles pushing the boundaries and building new systems that benefit the whole computing ecosystem.
I don’t find this to be the case anymore. They keep claiming “technological leaps”, but they only quantify the leaps in terms of running at a higher resolution with higher framerates, and we’ve gotten to the point with processing power that we can brute-force all of that. There used to be a lot of limitations to running on a console, and it caused a lot of creative workarounds and solutions within the industry. It feels like those limitations have been removed everywhere but the Switch, and I would argue that’s the console with the most interesting exclusives.
Consoles used to help push the limit of what could be done on lower-end hardware. Now there are basically no limits, especially with size. American games are like 100+ gigs now, it’s insane. Say what you want about their business practices and how anti-consumer they are, but I at least value Nintendo’s efficiency in game design and development, having been limited by hardware. They make fun games that are functionally massive but don’t require tons of storage in comparison to other AAA titles.
I like “Product Degradation” way better than “enshittification”.