Why are you here? Well, ok I guess you can stay :3
As someone else mentioned, the existence of microtransactions put it in a bad light from the start
My main issue, however, is just how UTTERLY UNPLAYABLE it was on most people’s systems at launch. The number of crashes and performance issues rivaled even Cyberpunk’s, and I still regularly play Cyberpunk. It was a complete and total disaster for many, many people, and while it’s likely fixed by now, it was such a struggle and headache to get through that I’ll likely never finish it.
Plastering it all over social media, I guess.
But honestly, the word “quietly” showing up anywhere in an article’s title has become the biggest red flag for clickbaiting the piss out of something. I almost always throw the headline straight out the window the moment I see “quietly” in it.
You are the one who brought up the question of whether we even need the CPU at all. Also, it wasn’t meant to be an attack, just an explanation of why you’d still need a CPU.
why would you run x86
All I meant was that a large portion of software and compatibility tooling still uses it, and our modern desktop CPU architectures still descend from it. My point was that things like CUDA are vastly different.
But if what you meant by your original comment was not to do away with the CPU entirely, then yes! By all means, plenty of software is now migrating to take advantage of the GPU as much as possible. I was only addressing your question, “at some point do we even need the CPU?” - the answer is yes :)
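If it helps, here’s a rough sketch of what that migration looks like in practice. I’m using CuPy as a stand-in for “GPU offload” (my pick for illustration, assuming a CUDA-capable GPU; nothing here is from the actual software being discussed):

```python
# A sketch of the CPU/GPU division of labor, assuming the cupy package
# and a CUDA-capable GPU are available (illustrative only).
import numpy as np

try:
    import cupy as cp  # NumPy-like array API backed by CUDA
    xp = cp
except ImportError:
    xp = np  # no GPU stack? fall back to the CPU

# Big element-wise math: a great fit for GPU offload...
a = xp.random.rand(10_000_000)
b = xp.sqrt(a) * 2.0 + 1.0

# ...but the CPU still drives everything: it launches the kernels,
# manages memory transfers, and handles I/O and syscalls.
print(float(b.sum()))
```

Even in the fully offloaded case, the CPU is still the one orchestrating the work.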
GPUs as the ONLY compute source in a computer cannot and will not function, mainly because of how pipelining works on existing architectures (among other instruction-level issues)
You’re right that GPUs are excellent at parallelization. Unfortunately, when you pipeline several instructions so they overlap in flight, you actually increase each individual instruction’s latency slightly (decreasing the OVERALL execution time, though).
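Back-of-the-envelope version, if numbers help. This is a toy model with made-up stage counts and timings, nothing vendor-specific:

```python
# Toy pipeline model: 5 stages, 1 ns per stage (invented numbers).
STAGES = 5       # e.g., fetch, decode, execute, memory, writeback
STAGE_NS = 1.0   # time spent in each stage

def unpipelined_ns(n):
    # each instruction runs start to finish before the next one begins
    return n * STAGES * STAGE_NS

def pipelined_ns(n):
    # the first instruction pays the full latency; after that,
    # one instruction retires per stage time
    return STAGES * STAGE_NS + (n - 1) * STAGE_NS

n = 1_000
print(f"unpipelined: {unpipelined_ns(n):.0f} ns")  # 5000 ns
print(f"pipelined:   {pipelined_ns(n):.0f} ns")    # 1004 ns
# Each instruction still takes (at least) 5 ns of latency, and real
# hardware adds stage-register overhead on top, but throughput is ~5x.
```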
GPUs are stupid good at cranking out triangles efficiently and pinning them to a matrix they can then apply “transformations” or other operations to. A GPU would struggle HARD if it had to handle system calls and the kind of time slicing an OS uses to juggle background tasks.
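To make the “pinned to a matrix” bit concrete, here’s a toy vertex transform in plain NumPy. The triangle and the matrix are ones I made up; a real GPU does this in hardware across millions of vertices at once:

```python
# One triangle in homogeneous coordinates, transformed by a 4x4 matrix.
import numpy as np

triangle = np.array([
    [0.0, 0.0, 0.0, 1.0],   # vertex A (x, y, z, w)
    [1.0, 0.0, 0.0, 1.0],   # vertex B
    [0.0, 1.0, 0.0, 1.0],   # vertex C
])

# translate every vertex by (2, 3, 0)
transform = np.array([
    [1.0, 0.0, 0.0, 2.0],
    [0.0, 1.0, 0.0, 3.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

moved = triangle @ transform.T   # apply the matrix to all vertices at once
print(moved[:, :3])              # [[2. 3. 0.] [3. 3. 0.] [2. 4. 0.]]
```

That kind of uniform, data-parallel math is exactly what GPUs are built for, and it’s nothing like servicing a syscall.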
This isn’t even MENTIONING the instruction set changes that would be needed for, say, x86 code to run on a GPU alone.
TLDR: CPUs are here to stay for a really, really long time.
We have no idea what kind of hardware will be released; tech can go through all kinds of spikes even if it seems to have leveled out. So while I agree the next generation, and maybe even a few generations after that, won’t improve significantly within 4 years, I disagree with the overall sentiment. Also:
Consoles aren’t just PCs; their physical architecture is completely different.
Lol, no.
Personally the necktie