If “even an old Intel HD 770 can allegedly run the shader at well over 800 FPS at 1080p”, wouldn’t a modern built-in iGPU in the CPU be enough to run it at 240 FPS as an alternative to a dedicated GPU?
One of the overlay’s most interesting quirks is the developer’s recommendation to run two GPUs. Dedicating one GPU solely to the CRT emulation shader allegedly eliminates a lot of desync issues in most games compared to running everything on a single graphics card (even if that card is very quick, apparently). Luckily, the shader portion is extremely lightweight, and even an old Intel HD 770 can allegedly run the shader at well over 800 FPS at 1080p. This will be an annoying quirk for gamers with no integrated graphics (particularly AM4-based Ryzen users).
Yes. The recommendation is not necessarily to run two dedicated GPUs; it is to have an additional GPU (including an iGPU) to dedicate to running the shader.
Use of the word “dedicate” made that a bit confusing, since “dedicated” is common terminology for discrete GPUs.
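As a rough sanity check of the numbers in the thread (800+ FPS shader throughput vs. a 240 FPS target), here is a back-of-the-envelope calculation. The figures are the ones quoted above; real-world overhead such as frame copy and present costs is ignored.

```python
# Back-of-the-envelope check: does an iGPU that can run the shader at
# 800+ FPS have headroom at a 240 FPS target?
# Figures come from the thread; copy/present overhead is not modeled.

shader_fps = 800          # claimed shader throughput on an Intel HD 770
target_fps = 240          # desired output rate

shader_cost_ms = 1000 / shader_fps   # per-frame shader cost: 1.25 ms
frame_budget_ms = 1000 / target_fps  # per-frame budget at 240 Hz: ~4.17 ms

utilization = shader_cost_ms / frame_budget_ms  # fraction of budget used

print(f"shader cost:  {shader_cost_ms:.2f} ms/frame")
print(f"frame budget: {frame_budget_ms:.2f} ms/frame")
print(f"utilization:  {utilization:.0%}")  # ~30% of the 240 Hz budget
```

So at the quoted throughput the shader would consume only about 30% of each 240 Hz frame's time budget, which is consistent with the "yes" answer, assuming the 800 FPS claim holds up.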