I am no fanboy. I have read the reviews, watched the benchmarks, it boggles my mind....I am just looking for some anecdotal statements and people's thoughts on the state of play in the market right now...Not really the specifics of CPU performance, just yet anyway. I am just torn as to what platform to go for. I remember in the past, in the days of the Athlon, how AMD CPUs needed some absolutely horrible chipsets to run, remember OPTi chipsets...*shudder*.....It has put me off going to AMD for a long time....They seemed like the poor man's CPU.
Right now, the best silicon and the best architecture is in the AMD camp.
It's quite simple, really, if you zoom out a bit.
Intel is still riding another iteration of the Core architecture that was awesome from Sandy Bridge onwards but has slowly gotten long in the tooth. Intel also lost its node advantage over that same time frame, and now needs to push TDPs to the moon to compete, resulting in bursty and hot chips.
AMD has built an enterprise chip that is perfect for economies of scale, giving them a far stronger long-term trajectory. Zen is going to last, and as much as Intel can adopt chiplets too, AMD can simply slot an "E-core CCX" in anywhere they want and glue it to whatever other combo of chiplets they fancy. Intel's chiplet attempts so far have not hit mass production.
The blue giant is leagues behind.
This does NOT mean, however, that you can't find an Intel chip competitive with any given AMD Zen chip; they exist. But in terms of overall development and future iterations, Zen has much more breath left in it even today, and we still don't know what Intel is planning to combat that at all, apart from twisting their old stuff up even further - they felt the need to push a 241W peak scenario in 12th gen chips, go figure.
And then there is the subject of GPUs. Yes, now is the absolute worst time ever to buy in: early adopter tech (RT), prices way out of bounds, and we are late in this generation too. Architecturally, AMD has a strong showing. Nvidia has a lead in RT at the expense of everything else, such as TDP, chip/die size and VRAM capacity, which implies the cards won't last quite as long before you have to compromise on settings.
On top of that, RT is far from mainstream yet and consoles barely use it. It remains a big question mark which direction the winning architecture for the tech should lean. But in a time of highly priced silicon per square mm, efficiency is king, and brute-forcing real-time lighting is not efficient. Dedicating hardware to it therefore seems like a fool's errand - already Nvidia's focus has shifted towards a heavy push on their DLSS tech, and I dare say the market data shows much higher demand for DLSS to improve performance than for RT effects that make you pay even more to hit 60 FPS, in a gaming world where high refresh rates have lately been a big selling point (courtesy of fast CPUs and proper multi-threading in games).