For PC gaming news and discussion.
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let’s Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments, within reason.
- Use the original source, no clickbait titles, no duplicates.
(Submissions should be from the original source if possible, unless from paywalled or non-English sources.
If the title is clickbait or lacks context you may lightly edit the title.)
Yep. Intel sat on their asses for a decade pushing quad cores that you had to pay extra to even overclock.
Then AMD implements chiplets, comes out with affordable 6, 8, 12, and 16 core desktop processors with unlocked multipliers, hyperthreading built into almost every model, and strong performance. All of this while also not sucking down power like Intel’s chips still do.
Intel cashed in their lead by not investing in themselves and instead pushing the same tired crap onto consumers year after year.
There are so many dimensions to this
Don’t forget the awfully fast socket changes
And all of the failures that plagued the 13th and 14th gens. That was the main reason I switched to AMD. My 13th gen CPU was borked and had to be kept underclocked.
In the 486 era (the 90s) there was an unofficial story about how Intel binned its CPUs: instead of starting slow and accelerating until failure, they would start as fast as possible and slow down until it didn't fail.
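For illustration, a minimal sketch of the binning procedure that story describes; the speed grades, the stress test, and all the names here are made up for the example, not anything from Intel:

```c
#include <stdio.h>
#include <stdbool.h>

/* Hypothetical stand-in for a real stability/burn-in test. */
static bool run_stress_test(int mhz) {
    return mhz <= 66;   /* pretend this sample is only stable at 66 MHz */
}

/* Per the story: start at the fastest grade and step down until the
 * part stops failing, instead of ramping up until it fails. */
static int bin_cpu(const int grades_mhz[], int n_grades) {
    for (int i = 0; i < n_grades; i++)
        if (run_stress_test(grades_mhz[i]))
            return grades_mhz[i];   /* first speed that doesn't fail */
    return -1;                      /* fails even the slowest grade: reject */
}

int main(void) {
    int grades[] = {100, 80, 66, 50};   /* 486-era grades in MHz, fastest first */
    printf("binned at %d MHz\n", bin_cpu(grades, 4));
    return 0;
}
```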
what was the issue?
It would cause system instability (programs/games crashing) when running normally. I had to underclock it through Intel’s XTU to make things stable again.
This was after all the BIOS updates from ASUS and with all BIOS settings set to the safe options.
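Side note: on Linux the rough equivalent of that kind of XTU clock cap is the cpufreq sysfs interface. A minimal sketch, capping only cpu0 and using an arbitrary 3.4 GHz limit, not the actual fix described above:

```c
#include <stdio.h>

/* Write a lower max-clock limit into the cpufreq sysfs knob (needs root).
 * A real tool would loop over every cpuN, not just cpu0. */
int main(void) {
    const char *path = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq";
    FILE *f = fopen(path, "w");
    if (!f) { perror("fopen"); return 1; }
    fprintf(f, "3400000\n");   /* value is in kHz: 3400000 kHz = 3.4 GHz */
    return fclose(f) == 0 ? 0 : 1;
}
```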
When I originally got it, I did notice it was getting insanely high scores in benchmarks; then the story broke of how Intel and motherboard manufacturers were letting the CPUs clock as high as possible until they hit the thermal limit. Mine started to fail about a year after I got it, I think.
Or the 1200 different versions of CPUs. We just got some new Dell machines for our DR site last year and the number of CPU options was overwhelming. Is it really necessary for that many different CPUs?
Tbf AMD is also guilty of that, in the laptop/mobile segment specifically. And the whole AI naming thing is just dumb, though there aren't that many of those.
Well this scheme seems much more reasonable and logical to me.
Even within the same socket family (looking at you, LGA1151) you can run into compatibility problems.
I think AMD also did a smart thing by branding their sockets. AM4, AM5, what do you think is going to be next? I bet it’s AM6. What came after the Intel LGA1151? It wasn’t LGA1152.
Yea, for the customer it really doesn't matter how many pins a certain socket has, only whether it's compatible or not.
remember Socket 7?
Holy shit, cross-compatibility between manufacturers? We came this close to the almighty above and still ended up where we are today 🤦‍♂️
I remember Slot 2
AMD tried the Intel thing too, by dropping support for past-generation CPUs on later AM4 boards. Only after public outcry did they scrap that. Wouldn't put it past them to try it again on AM5.
Are there a lot of people wanting to plug Zen 1 chips into B550 motherboards? Usually it's the other way around: upgrading the chip in an old motherboard.
It can happen if the old motherboard failed, which was more likely than the CPU failing.
There was talk of not providing firmware updates for old chipsets to support new-gen CPUs as well, which is relevant to the cases you mentioned.
As a person that generally buys either mid-tier stuff or the flagship products from a couple years ago, it got pretty fucking ridiculous to have to figure out which socket made sense for any given Intel chip. The apparently arbitrary naming convention didn't help.
It wasn’t arbitrary, they named them after the number of pins. Which is fine but kinda confusing for your average consumer
Which is a pretty arbitrary naming convention, since the number of pins in a socket doesn't really tell you anything, especially when that naming convention does NOT get applied to the processors that plug into them.
deleted by creator
They really segmented that market in the worst possible way: 2 cores and 4 cores only, whether you could use VMs or overclock, and so on. Add windoze eating up every +5%/year gain.
I remember buying the 2600 (maybe the X) and it was so fast.
The 2600k was exceptionally good and was relevant well past the normal upgrade timeframes.
Really it only got left behind because of its 4C/8T limit as everything started supporting lots of threads instead of just a couple, and just being a 2nd Generation i7.
Past me made the accidentally more financially prudent move of opting for the i7-4790k over the i5-4690k which ultimately lasted me nearly a decade. At the time the advice was of course “4 cores is all you need, don’t waste the money on an i7” but those 4 extra threads made all the difference in the longevity of that PC
Coincidentally, that’s the exact cpu I use in my server! And it runs pretty damn well.
At this point the only “issue” with it is power usage versus processing capability. Newer chips can do the same with less power.
Yeahhh, iirc it uses slightly less power than my main cpu for significantly less performance
All of the exploits against Intel processors didn't help either. Not only was it a bad look, but the fixes reduced the speed of those processors, making them a noticeably worse deal for the money after all.
Meltdown and Spectre? Those also applied to AMD CPUs as well, just to a lesser degree (or rather, they had their own flavor of similar vulnerabilities). I think they even recently found a similar one for ARM chips…
Only one of them affected AMD, I forget which. But Intel knew about the vulnerabilities and chose not to fix the hardware ahead of their release.
Yea that definitely sounds like Intel… Though it’s still worth pointing out that one of them was a novel way to spy on program memory that affects many CPU types and not really indicative of a dropped ball. (outside of shipping with known vulnerabilities, anyways)
… The power stuff from the 12th/13th gens or whatever, though… ouch, massive dropped ball.
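To make the "spying on program memory" part concrete: a minimal sketch of the Spectre v1 bounds-check-bypass gadget. The array names follow the original paper's example, and the training and cache-timing halves of the attack are omitted:

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

uint8_t array1[16];
size_t  array1_size = 16;
uint8_t array2[256 * 512];   /* probe array: one cache line per byte value */

uint8_t victim(size_t x) {
    /* An attacker trains the branch predictor with in-bounds x values,
     * then passes an out-of-bounds x. The check is speculatively
     * bypassed ... */
    if (x < array1_size) {
        /* ... so the secret byte at array1[x] transiently indexes the
         * probe array, leaving a cache footprint the attacker recovers
         * afterwards by timing loads of array2. */
        return array2[array1[x] * 512];
    }
    return 0;
}

int main(void) {
    printf("%u\n", victim(5));   /* in-bounds call just to exercise the gadget */
    return 0;
}
```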
Even the 6-core Phenom IIs from 2010 were great value.
But to be fair, Sandy Bridge ended up aging a lot better than those Phenom IIs or Bulldozer/Piledriver.