
Intel's Pat Gelsinger Exclaims "Intel is Back" AMD is "Over"

Joined
May 17, 2021
Messages
3,005 (2.87/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Let's hope not; we all know how great it was to be a consumer when "AMD was over".
 
Joined
Jul 21, 2016
Messages
137 (0.05/day)
Processor AMD Ryzen 5 5600
Motherboard MSI B450 Tomahawk
Cooling Alpenföhn Brocken 3 140mm
Memory Patriot Viper 4 - DDR4 3400 MHz 2x8 GB
Video Card(s) Radeon RX460 2 GB
Storage Samsung 970 EVO PLUS 500, Samsung 860 500 GB, 2x Western Digital RED 4 TB
Display(s) Dell UltraSharp U2312HM
Case be quiet! Pure Base 500 + Noiseblocker NB-eLoop B12 + 2x ARCTIC P14
Audio Device(s) Creative Sound Blaster ZxR,
Power Supply Seasonic Focus GX-650
Mouse Logitech G305
Keyboard Lenovo USB
Let's wait and see what kind of CPUs they will deliver under 200€, and what VGAs between 150-300€.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Indeed. At the same time, AMD has to prove that they can keep up with Intel.
OEMs' butts need to get kicked for AMD to keep up with Intel.

Having hands-down superior products and merely making a dent in the OEM-controlled market (which is about 80% of the total CPU market) is horrific.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
With the 3000 series Ryzen became decent, with the 5000 series they became good. This is mostly because of going from GloFo 12nm (which is far worse than Intel 14nm in every aspect) to TSMC 7nm.
That is a way oversimplified take. The IPC improvements from Zen/Zen+ to Zen2 are very significant. Sure, some of those are enabled by the increased density of the 7nm node allowing for more transistors and denser circuits, but putting that down to "mostly [the fab node]" is oversimplified to the absurd. Zen2 was far more than Zen/Zen+ shrunk down to 7nm. Zen2 beat Intel outright in IPC, and improved upon Zen+ by ~15%. Zen2 was a one-two punch of a highly efficient node that clocked better than its predecessor and a very noticeable architectural improvement.
Surely he was put into a spotlight where he really doesn't understand what's really going on.
Uh ... he's the friggin' CEO of Intel. "Doesn't understand what's really going on"? You're kidding, right? "Oh, don't look at him, he's just this bumbling, naïve guy who doesn't really understand the situation he's in." I mean, it's well established that people way disproportionately tend to cut people with wealth and power however much slack they might need, but this ... He knows exactly what he's doing, has worked out a plan for what to say and how to say it, has discussed every aspect of how to present the company and its strategy with both PR and engineering leads. While spoken communication is never perfectly according to plan, a C-level executive doesn't enter into an interview unprepared. Assuming anything else is just naïve.
Intel had high availability of chips throughout the whole COVID-19 situation; that's the perk of having their own fabs (which is both good and bad).
Given that they went through a very significant 2+ year production shortage just before this it would have been very, very worrying if that wasn't the case. Has everyone forgotten the Intel CPU drought of 2018-2020? (Though tbh it isn't quite over yet - they're still not delivering low-end SKUs at the rate they used to.) They had just gotten out of the woods on that one through a significant ramp in production capacity + finally making their 10nm process viable. It's hard to imagine a way of being better prepared for a demand spike.
Bob Swan is probably the reason why Intel was stuck on 14nm for so long; no wonder he got booted and Intel went back to its roots.
While executives do have a certain degree of power, this is ... a lot. Saying the CEO was the reason why they couldn't get their lithographic nodes to work properly? Come on. They screwed up the engineering, hit unforeseen technical challenges that they couldn't overcome in a reasonable time frame, and had tons and tons of engineering issues. Could an executive have alleviated this through, for example, lowering the requirements for the new node? Sure, and that might have helped (though there's no way to know if it would have - silicon lithography is still immensely difficult). But saying their delays were Bob Swan's fault is quite the leap.
I never said the 6000 series were bad, only that the lower end is bad and priced too high. Availability is bad because AMD does not focus on GPUs. All their current chips use TSMC 7nm and they can't focus on all chips.

And no, 6700XT is not faster than 3070 and 3080 Ti beats 6900XT too overall, especially at 4K.

As you can see here: https://www.techpowerup.com/review/gainward-geforce-rtx-3080-ti-phantom-gs/28.html
That link shows the 6900XT beating the 3080 Ti at 1080p (100% v. 99%), matching it (99% v. 99%) at 1440p, and falling slightly behind at 2160p (93% v. 98%). That a factory OC'd 3080 Ti with a 1-2% overall performance increase beats stock 6900 XT performance doesn't change that. So, saying "the 3080 Ti beats the 6900 XT overall" is just not true. In reality, they perform the same, except at 2160p where the 3080 Ti wins out by a slight margin (but likely not noticeable overall - you'd typically need a >10% margin for that). You could argue that 2160p is the most important resolution for a GPU that expensive, which is at least partially true (though high refresh rate 1440p is much more common in terms of monitors), but your conclusion is still quite skewed. At best it is a minor victory, though more reasonably it's an effective tie, with the differences being sufficiently minor to not really matter.
DLSS and FSR useless? Haha, sounds more like you're clueless.
I agree there, DLSS and FSR are quite brilliant additions to how GPUs work - though they need to become more widely available and universally compatible.
 
Joined
Nov 18, 2010
Messages
7,108 (1.46/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
You're kidding, right? "Oh, don't look at him, he's just this bumbling, naïve guy who doesn't really understand the situation he's in."

Did you even understand what I meant by writing that? TheLostSwede did; you did not.

The whole post really isn't about anything, just nitpicking again.
 
Joined
Aug 30, 2006
Messages
7,195 (1.12/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
I'm still waiting to see good performance at low power consumption without having to dig deep into my pockets.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
That is a way oversimplified take. The IPC improvements from Zen/Zen+ to Zen2 are very significant. Sure, some of those are enabled by the increased density of the 7nm node allowing for more transistors and denser circuits, but putting that down to "mostly [the fab node]" is oversimplified to the absurd. Zen2 was far more than Zen/Zen+ shrunk down to 7nm. Zen2 beat Intel outright in IPC, and improved upon Zen+ by ~15%. Zen2 was a one-two punch of a highly efficient node that clocked better than its predecessor and a very noticeable architectural improvement.

Uh ... he's the friggin' CEO of Intel. "Doesn't understand what's really going on"? You're kidding, right? "Oh, don't look at him, he's just this bumbling, naïve guy who doesn't really understand the situation he's in." I mean, it's well established that people way disproportionately tend to cut people with wealth and power however much slack they might need, but this ... He knows exactly what he's doing, has worked out a plan for what to say and how to say it, has discussed every aspect of how to present the company and its strategy with both PR and engineering leads. While spoken communication is never perfectly according to plan, a C-level executive doesn't enter into an interview unprepared. Assuming anything else is just naïve.

Given that they went through a very significant 2+ year production shortage just before this it would have been very, very worrying if that wasn't the case. Has everyone forgotten the Intel CPU drought of 2018-2020? (Though tbh it isn't quite over yet - they're still not delivering low-end SKUs at the rate they used to.) They had just gotten out of the woods on that one through a significant ramp in production capacity + finally making their 10nm process viable. It's hard to imagine a way of being better prepared for a demand spike.

While executives do have a certain degree of power, this is ... a lot. Saying the CEO was the reason why they couldn't get their lithographic nodes to work properly? Come on. They screwed up the engineering, hit unforeseen technical challenges that they couldn't overcome in a reasonable time frame, and had tons and tons of engineering issues. Could an executive have alleviated this through, for example, lowering the requirements for the new node? Sure, and that might have helped (though there's no way to know if it would have - silicon lithography is still immensely difficult). But saying their delays were Bob Swan's fault is quite the leap.

That link shows the 6900XT beating the 3080 Ti at 1080p (100% v. 99%), matching it (99% v. 99%) at 1440p, and falling slightly behind at 2160p (93% v. 98%). That a factory OC'd 3080 Ti with a 1-2% overall performance increase beats stock 6900 XT performance doesn't change that. So, saying "the 3080 Ti beats the 6900 XT overall" is just not true. In reality, they perform the same, except at 2160p where the 3080 Ti wins out by a slight margin (but likely not noticeable overall - you'd typically need a >10% margin for that). You could argue that 2160p is the most important resolution for a GPU that expensive, which is at least partially true (though high refresh rate 1440p is much more common in terms of monitors), but your conclusion is still quite skewed. At best it is a minor victory, though more reasonably it's an effective tie, with the differences being sufficiently minor to not really matter.

I agree there, DLSS and FSR are quite brilliant additions to how GPUs work - though they need to become more widely available and universally compatible.

The 3080 Ti beating the 6900XT at 1080p, are you serious? By 1%? How many are buying a 1000-1200 dollar GPU (MSRP, though more like 1500-3000 dollars in reality now) to be mostly CPU bound at 1080p? Those cards are aimed at 4K/UHD, and the 3080 Ti wins, even without use of DLSS, which tons of newer and demanding games have built in. The 3080 Ti, 3090 and 6900XT score pretty much identically at 1080p because of the CPU bottleneck... You buy high-end cards for high-res gaming, and AMD lags at 4K/UHD or higher. It's no surprise, as AMD skimped on bus width and bandwidth in general this time.

256-bit for high-end (using regular GDDR6) simply is not that great for high-res gaming. Nvidia uses 320-384 bit for high-end PLUS GDDR6X, for pretty much twice the bandwidth.
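As a rough sanity check on that claim - a back-of-the-envelope calculation in Python, assuming the reference memory specs (256-bit GDDR6 at 16 Gbps on the 6900 XT, 384-bit GDDR6X at 19 Gbps on the 3080 Ti):

Code:
# Peak memory bandwidth in GB/s = bus width (bits) / 8 * data rate (Gbps)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

rx_6900_xt = bandwidth_gbs(256, 16.0)    # GDDR6  -> 512 GB/s
rtx_3080_ti = bandwidth_gbs(384, 19.0)   # GDDR6X -> 912 GB/s
print(rtx_3080_ti / rx_6900_xt)          # -> ~1.78x

So "pretty much twice" is roughly right (~1.78x), though raw bandwidth isn't the whole story: the 6900 XT's Infinity Cache offsets part of that deficit, mostly at lower resolutions.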

The 3080 Ti beats the 6900XT by 5% at 4K/UHD without using DLSS, which boosts performance by ~75% on top of that. As long as DLSS has way better adoption than FSR, this is the reality. However, FSR works on Nvidia cards too, so...

However, the regular 3080 is only 4% slower than the 6900XT at 4K, with a 300 dollar lower price tag and the option of DLSS.


Yeah, a bad CEO can easily disrupt progress in a company. Seen that many times. It's his responsibility. It's not an accident that most Intel top guys have been replaced in the last 1-2 years and Intel is moving forward again, at a rapid pace.

And yeah, the performance increase from Ryzen 1000/2000 to 3000/4000 is mostly from clockspeed. Way higher clockspeeds.

The first Ryzens barely hit 4.1 on average; many 1000-series chips can't even do 4 GHz stable.
However, many 5000-series models can hit 4.8 on all cores, some even more. That's a big difference, and that clock speed came from one thing: an improved node.
 
Last edited:

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
This smugness is good, as it motivates AMD to come out with "Zen 4" sooner, rather than dragging its feet with that "Zen 3+" chip. While it's a nice swansong for AM4, "Zen 3+" is a step in the wrong direction, as it will be held back by older DDR4 memory and PCIe Gen 4, while enthusiasts are drawn to "Alder Lake" and DDR5. Rather, let AMD take its time and come out with AM5 and "Zen 4" with DDR5 in Q1 2022.

I've heard the damnedest bullshit from a bullshit orifice that smells good. It says that the next-gen CCD built on N5 will have eight "Zen 4" cores, and two optically-shrunk "Zen 1" cores serving as E cores. At 5 nm and running around 4 GHz, "Zen 1" will beat the perf/Watt curve of "Gracemont," and it's not rocket science for AMD to get these to idle at 5 W, and develop power-gating for the "Zen 4" P cores.
 
Joined
May 8, 2018
Messages
1,495 (0.69/day)
Location
London, UK
Actually, if this news is true then this is good - let them fight to the death, hehe. Competition drives innovation and performance.
 
Joined
Jun 19, 2010
Messages
397 (0.08/day)
Location
Germany (Euregio)
Processor Ryzen 5600X
Video Card(s) RTX 3050
Software Win11
TPU and others will test these claims.
Tech-interested people will remember years of newsworthy anticompetitive behavior, but most people do not know that the software ecosystem Intel is so proud of is widely steered by big software suppliers, some of them quite Intel-tame, and a good bunch of them effectively use a "best on Intel" approach that supports all of Intel's anticompetitive behaviors, like the compiler stunts that slowed things down on non-Intel CPUs on purpose.
So a good bunch of the ecosystem is negatively influenced by Intel, directly and indirectly.

Such things are at least awkward coming from a leader with 80% market share. If they were as confident in their products as they say, such tactics would not be needed.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
This smugness is good, as it motivates AMD to come out with "Zen 4" sooner, rather than dragging its feet with that "Zen 3+" chip. While it's a nice swansong for AM4, "Zen 3+" is a step in the wrong direction, as it will be held back by older DDR4 memory and PCIe Gen 4, while enthusiasts are drawn to "Alder Lake" and DDR5. Rather, let AMD take its time and come out with AM5 and "Zen 4" with DDR5 in Q1 2022.

I've heard the damnedest bullshit from a bullshit orifice that smells good. It says that the next-gen CCD built on N5 will have eight "Zen 4" cores, and two optically-shrunk "Zen 1" cores serving as E cores. At 5 nm and running around 4 GHz, "Zen 1" will beat the perf/Watt curve of "Gracemont," and it's not rocket science for AMD to get these to idle at 5 W, and develop power-gating for the "Zen 4" P cores.

AMD can't do 5nm chips in Q1 2022. Apple uses it and TSMC will always prioritize Apple over AMD. End of 2022, maybe.

Also, considering how dependent on memory and timings Zen is, jumping on DDR5 right now would be a bad decision. 4800 MHz at CL40? That would probably decrease performance A LOT.
Intel is generally not affected much by timings and clock speed in comparison. There's barely any difference between cheap and expensive RAM on Intel platforms (3200/CL16 being the cheap solution).
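For what it's worth, the first-word latency math behind that worry is straightforward - a quick sketch in Python, using the usual approximation (CAS cycles divided by the memory clock, which is half the transfer rate):

Code:
# First-word (CAS) latency in ns = cycles / memory clock (MHz) * 1000,
# where the memory clock is half the transfer rate (DDR = double data rate).
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    return cas_cycles / (transfer_rate_mts / 2) * 1000

print(cas_latency_ns(3200, 16))  # DDR4-3200 CL16 -> 10.0 ns
print(cas_latency_ns(4800, 40))  # DDR5-4800 CL40 -> ~16.7 ns

So launch-spec DDR5 gives up roughly two-thirds more CAS latency than cheap DDR4, which is exactly where a latency-sensitive design like Zen would hurt, even if the added bandwidth helps elsewhere.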

AMD pretty much needs "Zen3+" on 6nm (+ 3D Cache) to close the gap, or they will probably have nothing to launch in 2022. They need to use 6nm ASAP in some form, to take load off the 7nm line, which all their chips are currently on. That's why Radeon 6000 suffers greatly in terms of market share. CPUs and APUs get priority. Just look at the Steam HW Survey if you are in doubt. Ampere completely dominates RDNA2 in terms of market share, even though Ampere is superior for mining and a LOT of Ampere cards are used for mining, not gaming.

Radeon 6000, especially the 6700XT series and up, is fairly decent, but a lot of the speed comes from a clock speed advantage (vs Samsung 8nm). Nvidia aims to use TSMC 5nm for the RTX 4000 series in 2H 2022, meaning it will probably be a big jump in perf, considering +25% clock speed on top of more cores across the board.

If RDNA3 really is MCM, it will have issues. How will frametimes be? Will we see the issues that were, and somewhat still are, present on mGPU rigs? Who knows. Perf numbers will probably be great, but how the actual gaming experience turns out is what I look forward to seeing the most.
 
Last edited:
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Did you even understand what I meant by writing that? TheLostSwede did; you did not.
I don't think so. You're - very generously - positing that the CEO of Intel likely made some overblown claims (presumably, that he didn't quite mean, or that weren't intended to come out like that) because he
was put into a spotlight where he really doesn't understand what's really going on.
Which, again, is extremely generous and IMO a rather unreasonable assumption. I see no reason to give him the benefit of any doubt here - this is a PR move. A calculated, thought-through PR move, likely mostly targeting investors and finance people, as they tend to gobble up this type of silly macho posturing. I see no reason to think otherwise.
The 3080 Ti beating the 6900XT at 1080p, are you serious? By 1%? How many are buying a 1000-1200 dollar GPU (MSRP, though more like 1500-3000 dollars in reality now) to be mostly CPU bound at 1080p? Those cards are aimed at 4K/UHD, and the 3080 Ti wins, even without use of DLSS, which tons of newer and demanding games have built in. The 3080 Ti, 3090 and 6900XT score pretty much identically at 1080p because of the CPU bottleneck... You buy high-end cards for high-res gaming, and AMD lags at 4K/UHD or higher. It's no surprise, as AMD skimped on bus width and bandwidth in general this time.
... sigh. I said they are tied at everything except 2160p, with the difference there being minor. I also specifically brought up that you can argue for 2160p being of particular interest for this GPU. So yes, I follow you there - I did say it before you, after all. But I also pointed out that 1440p high refresh rate is likely more common still (there are just barely any proper 2160p gaming monitors on the market, after all). In which case they would perform the same (varying between individual titles, of course).
256-bit for high-end (using regular GDDR6) simply is not that great for high-res gaming. Nvidia uses 320-384 bit for high-end PLUS GDDR6X, for pretty much twice the bandwidth.
Relevance? We're talking about overall performance here, not technical details as to why performance differs. Nobody here is disputing that the Nvidia GPUs are slightly faster at higher resolutions.
The 3080 Ti beats the 6900XT by 5% at 4K/UHD without using DLSS, which boosts performance by ~75% on top of that. As long as DLSS has way better adoption than FSR, this is the reality. However, FSR works on Nvidia cards too, so...
Yep, DLSS is good, and improves performance. So does FSR. And FSR being newer means less adoption - once it's been on the market a while we'll see how this plays out. My money's on FSR gaining traction faster than DLSS due to its openness, universal compatibility and ease of implementation, but I don't think DLSS will die off entirely either. But now you're suddenly adding a lot of caveats to what was previously a statement that
3080 Ti beats 6900XT too overall, especially at 4K.
So, either you're changing your tune (by adding further caveats), or you're admitting that it isn't as simple as you first said. Either way, your statement strongly implies that the 3080 Ti beats the 6900 XT at lower-than-2160p resolutions (if not, then it wouldn't be "especially at 4k"), which your own source showed is just not true.
However, the regular 3080 is only 4% slower than the 6900XT at 4K, with a 300 dollar lower price tag and the option of DLSS.
Sure. But then, all high end/flagship GPUs are terrible value. The 6800 XT, 6800, and 3070 are much better value propositions too. The point being: that logic works both ways, not just the one way you're using it. The 3080 is great value for 2160p, but otherwise relatively unremarkable (beyond being a crazy powerful GPU overall, of course). Ignoring the pricing clusterf*ck that is current reality, the best value GPUs are, in rough order, 3060, 6600 XT, 3060 Ti (very close, essentially tied), 6800, 3070 (again, essentially tied), and then we start getting into a royal mess of far too many SKUs to make sense of. Arguments like what you're saying here can be made at literally every step down this ladder.
Yeah, a bad CEO can easily disrupt progress in a company. Seen that many times. It's his responsibility. It's not an accident that most Intel top guys have been replaced in the last 1-2 years and Intel is moving forward again, at a rapid pace.
"Disrupting progress in a company" is not the same as "probably [being] the reason why Intel was stuck on 14nm for so long". You're conflating overall strategy with specific technical issues. These do of course overlap - which I exemplified in my previous post - but you're assigning a far too simplistic blame for a highly complex situation. The world's best CEO can't fix your engineering problems (unless they're also a brilliant engineer who happens to have the correct skills and temporarily steps down from their CEO position to work as an engineer, which ... yeah, I don't think that happens often). There is a relation between executives and overall progress, but the link between that and the specific constituent parts of that progress is complicated, tenuous, and extremely difficult to pin down.
And yeah, the performance increase from Ryzen 1000/2000 to 3000/4000 is mostly from clockspeed. Way higher clockspeeds.

The first Ryzens barely hit 4.1 on average; many 1000-series chips can't even do 4 GHz stable.
However, many 5000-series models can hit 4.8 on all cores, some even more. That's a big difference, and that clock speed came from one thing: an improved node.
Again: yes, I said as much. But you're completely ignoring the major IPC improvements that happened at the same time! As I said (and linked to), Zen2 beat Zen+ by ~15% in independent testing using industry-standard methods. Zen3 delivered a further 19% IPC increase according to the same testing. That means that, regardless of the clock speed afforded by the node, 5000-series CPUs outperform 2000-series CPUs by nearly 37%. That is a major performance increase. Saying that the improved overall performance is only down to clock speeds is wrong. Period. It is down to clock speeds and architectural improvements. The node enables both in some way, but this is not a 1:1 relation - the architecture also needs to be built to reach those clock speeds, and tuned to hit those IPC numbers. Saying this is all down to the node is an oversimplification.
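To make the arithmetic explicit: those two measured gains compound rather than add, which is where the ~37% comes from:

Code:
gain = 1.15 * 1.19          # Zen+ -> Zen2 (~15%), then Zen2 -> Zen3 (~19%)
print(f"{gain - 1:.1%}")    # -> 36.8%, i.e. the "nearly 37%" above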
AMD can't do 5nm chips in Q1 2022. Apple uses it and TSMC will always prioritize Apple over AMD.
Apple is hardly the only TSMC 5nm customer at this point. Huawei was working with them even last year, and while this leaked roadmap is clearly no longer up to date (the Snapdragon 875 never materialized, instead we got the Samsung 5nm 888; HiSilicon got stomped down even further by trade embargoes), it shows that there are plenty of TSMC 5nm customers in the 2021-2022 time frame.
Also, considering how dependent on memory and timings Zen is, jumping on DDR5 right now would probably be a bad decision. 4800 MHz at CL40? That would probably decrease performance A LOT.
Intel is generally not affected much by timings and speed in comparison.
We'll see - Intel CPUs have also become more RAM speed/timing sensitive lately (mostly due to higher core counts putting more pressure on the interconnects and keeping cores fed). They still aren't as sensitive as AMD, but we have no idea how this will develop in the future. I would also guess that both upcoming architectures will support both DDR4 and DDR5, with motherboards supporting either being available. I wouldn't expect an all-out DDR5 change until the generation after these.
AMD pretty much needs "Zen3+" on 6nm to close the gap, or they will probably have nothing to launch in 2022.
Close what gap? Intel still has a slight IPC deficit with their 11th gen, but they clock higher, so everything mostly evens out. Until we see reviews we have no idea how these upcoming chips (from either company) will perform.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
I don't think so. You're - very generously - positing that the CEO of Intel likely made some overblown claims (presumably, that he didn't quite mean, or that weren't intended to come out like that) because he

Which, again, is extremely generous and IMO a rather unreasonable assumption. I see no reason to give him the benefit of any doubt here - this is a PR move. A calculated, thought-through PR move, likely mostly targeting investors and finance people, as they tend to gobble up this type of silly macho posturing. I see no reason to think otherwise.

... sigh. I said they are tied at everything except 2160p, with the difference there being minor. I also specifically brought up that you can argue for 2160p being of particular interest for this GPU. So yes, I follow you there - I did say it before you, after all. But I also pointed out that 1440p high refresh rate is likely more common still (there are just barely any proper 2160p gaming monitors on the market, after all). In which case they would perform the same (varying between individual titles, of course).

Relevance? We're talking about overall performance here, not technical details as to why performance differs. Nobody here is disputing that the Nvidia GPUs are slightly faster at higher resolutions.

Yep, DLSS is good, and improves performance. So does FSR. And FSR being newer means less adoption - once it's been on the market a while we'll see how this plays out. My money's on FSR gaining traction faster than DLSS due to its openness, universal compatibility and ease of implementation, but I don't think DLSS will die off entirely either. But now you're suddenly adding a lot of caveats to what was previously a statement that

So, either you're changing your tune (by adding further caveats), or you're admitting that it isn't as simple as you first said. Either way, your statement strongly implies that the 3080 Ti beats the 6900 XT at lower-than-2160p resolutions (if not, then it wouldn't be "especially at 4k"), which your own source showed is just not true.

Sure. But then, all high end/flagship GPUs are terrible value. The 6800 XT, 6800, and 3070 are much better value propositions too. The point being: that logic works both ways, not just the one way you're using it. The 3080 is great value for 2160p, but otherwise relatively unremarkable (beyond being a crazy powerful GPU overall, of course). Ignoring the pricing clusterf*ck that is current reality, the best value GPUs are, in rough order, 3060, 6600 XT, 3060 Ti (very close, essentially tied), 6800, 3070 (again, essentially tied), and then we start getting into a royal mess of far too many SKUs to make sense of. Arguments like what you're saying here can be made at literally every step down this ladder.

"Disrupting progress in a company" is not the same as "probably [being] the reason why Intel was stuck on 14nm for so long". You're conflating overall strategy with specific technical issues. These do of course overlap - which I exemplified in my previous post - but you're assigning a far too simplistic blame for a highly complex situation. The world's best CEO can't fix your engineering problems (unless they're also a brilliant engineer who happens to have the correct skills and temporarily steps down from their CEO position to work as an engineer, which ... yeah, I don't think that happens often). There is a relation between executives and overall progress, but the link between that and the specific constituent parts of that progress is complicated, tenuous, and extremely difficult to pin down.

Again: yes, I said as much. But you're completely ignoring the major IPC improvements that happened at the same time! As I said (and linked to), Zen2 beat Zen+ by ~15% in independent testing using industry-standard methods. Zen3 delivered a further 19% IPC increase according to the same testing. That means that, regardless of the clock speed afforded by the node, 5000-series CPUs outperform 2000-series CPUs by nearly 37%. That is a major performance increase. Saying that the improved overall performance is only down to clock speeds is wrong. Period. It is down to clock speeds and architectural improvements. The node enables both in some way, but this is not a 1:1 relation - the architecture also needs to be built to reach those clock speeds, and tuned to hit those IPC numbers. Saying this is all down to the node is an oversimplification.

Apple is hardly the only TSMC 5nm customer at this point. Huawei was working with them even last year, and while this leaked roadmap is clearly no longer up to date (the Snapdragon 875 never materialized, instead we got the Samsung 5nm 888; HiSilicon got stomped down even further by trade embargoes), it shows that there are plenty of TSMC 5nm customers in the 2021-2022 time frame.

We'll see - Intel CPUs have also become more RAM speed/timing sensitive lately (mostly due to higher core counts putting more pressure on the interconnects and keeping cores fed). They still aren't as sensitive as AMD, but we have no idea how this will develop in the future. I would also guess that both upcoming architectures will support both DDR4 and DDR5, with motherboards supporting either being available. I wouldn't expect an all-out DDR5 change until the generation after these.

Close what gap? Intel still has a slight IPC deficit with their 11th gen, but they clock higher, so everything mostly evens out. Until we see reviews we have no idea how these upcoming chips (from either company) will perform.

The gap when Alder Lake comes out next month.

Yes, DLSS is good and I have enjoyed its benefits for a year now. Especially when I use my 4K/UHD OLED for gaming, it easily allows for 100+ fps in most games with high settings using quality mode, perfect for my 120 Hz native OLED TV.

FSR might be good, it might not. Adoption is low right now, and there are several examples of why open is a bad thing too. A LOT of games with injected FSR deliver WORSE performance and WORSE image quality than not using FSR at all. Right now DLSS is clearly superior. I have tried both in several games, and this is also what tests are concluding.

I'm simply saying that no sane person is buying top-end graphics cards to game at 1080p, which looks horrible, and you will be CPU bound in most cases anyway.

3080 Ti and 6900XT are essentially even at 1440p, which is the lowest resolution I'm looking at today. I would never accept or recommend 1080p in 2021, unless you are building a budget rig maybe. 1440p IPS at 144 Hz or more is so cheap right now that 1080p is dead to me.

AMD will face serious competition in all markets in the following years, that's all I'm saying. And without using the best possible nodes at TSMC, I can't really see them being able to compete on perf; however, they can (and should) compete on value.

Radeon 6600XT and 6700XT have bad value/perf compared to RTX 3060 and 3060 Ti/3070. The Nvidia cards sold waaaaay better.

I can't wait to see how Intel's dGPUs will perform. If they go with aggressive pricing (which they should), they could easily gobble up some market share in the low- to mid-end market - AMD's prime segment - and considering Nvidia will soon hit 85% dGPU market share, this could turn out messy. AMD might be at sub-10% market share in the dGPU segment in 1-2 years unless they fix their pricing and availability issues. Not many are going to pay Nvidia prices for AMD GPUs. That's just reality.
 
Last edited:
Joined
Dec 10, 2019
Messages
16 (0.01/day)
Location
Tasmania
Processor Ryzen 2400GE
Motherboard MSI B450I Gaming Plus AC
Cooling Noctua NH-L9a-AM4
Memory Corsair Vengeance LPX 16GB 3000MHz CL15 DDR4, clocked to 3200
Storage ADATA XPG SX8200 Pro NVMe 1TB, Drive tank with 18TB of assorted drives for media etc
Display(s) Philips BDM3270
Case Jonsbo UMX3 black
Power Supply Seasonic SS-250SUB
Mouse Logitech M585
Keyboard MSI Vigor GK50
Software Everything imaginable except games
The real question is, would Intel have made the advances they have recently if AMD had not been eating into their market share? No. If Intel had no real competition, we would still be paying exorbitant prices for sub-par CPUs and APUs with rubbish embedded graphics. That alone makes me want to stick with AMD for all future purchases; they pushed Intel to get off their butts and do better instead of just ripping off PC buyers like they had been for the last decade. AMD deserves the sales, Intel doesn't.
 
Joined
Nov 6, 2016
Messages
1,561 (0.58/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
It was just a matter of time - AMD did well considering all.

HOWEVER, without Intel being stuck at 14nm _and_ AMD using TSMC 7nm, AMD would never have been able to do what they did. Ryzen 1000 and 2000 on GloFo were nothing special at all; however, they delivered an alternative to Intel, which even the most hardcore AMD fanboys had switched to, because FX CPUs were pretty much garbage.

With the 3000 series Ryzen became decent, with the 5000 series they became good. This is mostly because of going from GloFo 12nm (which is far worse than Intel 14nm in every aspect) to TSMC 7nm.

AMD has beaten Intel and even Nvidia before, but Intel and Nvidia always came back and took the crown, which resulted in AMD lowering prices and going back to the drawing board. Just look at how AMD priced the 5000 series compared to the 1000/2000 and somewhat the 3000 series. AMD started milking just like Intel did for ~10 years. AMD dropped the non-X models and generally priced all the chips way higher than before.

I own 2 Ryzen machines (HTPC + NAS/server) and 1 Intel (gaming rig), so you can stop the fanboy BS. I even own two consoles, so I actually have 4 AMD chips in my home... However, I also have 3 Intel laptops :p So I guess it's 50/50

Arguments like this look at AMD and Intel in a vacuum. Yes, Intel was on 14nm, but Intel also has an R&D budget literally 6.5x that of AMD's and an annual revenue literally 8x that of AMD's, so there's literally ZERO excuse for Intel not beating AMD, and it makes AMD beating Intel even more impressive. If anybody can name another company, from any industry, that has bested their competitor while having an R&D budget 6.5x smaller and an annual revenue 8x smaller, please provide that example, because if it even exists at all, I'd be surprised.

That being said, consumers should be rooting for AMD right now, not Intel. AMD has absolutely ZERO guarantees of staying in the consumer x86 space, and is literally just a few consecutive generations of defeats away from exiting that space. With respect to a long-term outlook, it would have been in everybody's interest to have AMD best Intel for another 3-5 years, time enough to gain penetration in the enterprise x86 market as well as in consumer mobile....the two most lucrative x86 markets. PC enthusiasts are so fickle, and the majority of non-enthusiasts literally don't even know of AMD's existence and, just out of habit, buy an Intel system whenever they buy a new laptop, regardless of AMD's benefits. So we seriously better hope AMD isn't done.....I think we can all remember the mid-2010s, four-core stagnation, and 4% gen-over-gen performance gains, and if we ever return to that, I'll be done with Windows PCs
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Arguments like this look at AMD and Intel in a vacuum. Yes, Intel was on 14nm, but Intel also has an R&D budget literally 6.5x that of AMD's and an annual revenue literally 8x that of AMD's, so there's literally ZERO excuse for Intel not beating AMD, and it makes AMD beating Intel even more impressive. If anybody can name another company, from any industry, that has bested their competitor while having an R&D budget 6.5x smaller and an annual revenue 8x smaller, please provide that example, because if it even exists at all, I'd be surprised.

That being said, consumers should be rooting for AMD right now, not Intel. AMD has absolutely ZERO guarantees of staying in the consumer x86 space, and is literally just a few consecutive generations of defeats away from exiting that space. With respect to a long-term outlook, it would have been in everybody's interest to have AMD best Intel for another 3-5 years, time enough to gain penetration in the enterprise x86 market as well as in consumer mobile....the two most lucrative x86 markets. PC enthusiasts are so fickle, and the majority of non-enthusiasts literally don't even know of AMD's existence and, just out of habit, buy an Intel system whenever they buy a new laptop, regardless of AMD's benefits. So we seriously better hope AMD isn't done.....I think we can all remember the mid-2010s, four-core stagnation, and 4% gen-over-gen performance gains, and if we ever return to that, I'll be done with Windows PCs

Again, because of a sleeping CEO, who was fired as a result. Complete stagnation.

Intel should have been reaching out to TSMC long before Pat Gelsinger did. Intel secured TSMC 3nm quickly after Pat joined Intel as CEO - he is changing Intel fast.

Bob Swan was probably on vacation for the majority of his time at Intel, like most former Intel leaders, who have now also been booted. Intel dominated AMD so hard that Intel did not have to improve at all, and this is 100% on the CEO, who was asleep too. He did not lead the company, which is his job. He thought AMD was beaten and gone. Meanwhile, AMD hired Jim Keller, the man behind Zen, and came back slowly over 3-4 years, without Intel improving much in those 3-4 years.

If AMD had been dominating Intel for 10 years, you would have seen the same thing. AMD would have turned from the good guy into the bad guy really fast, and the milking would have been the same. Monopoly = milking. That's why both players are important, and I can't wait till we get Intel as a 3rd player in the GPU market.

AMD has raised prices a lot of times throughout history. However, Intel and Nvidia always came back and made them reduce the prices again.
 
Last edited:

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,001 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
This smugness is good, as it motivates AMD to come out with "Zen 4" sooner, rather than dragging its feet with that "Zen 3+" chip. While it's a nice swansong for AM4, "Zen 3+" is a step in the wrong direction, as it will be held back by older DDR4 memory and PCIe Gen 4, while enthusiasts are drawn to "Alder Lake" and DDR5. Rather, let AMD take its time and come out with AM5 and "Zen 4" with DDR5 in Q1 2022.
Eh? How is PCIe 4.0 holding anything back? Intel's first stab at PCIe 5.0 is only for the x16 slot, unless a board maker turns that into an x8 slot and shares the remaining bandwidth with something else. Also, I doubt we'll see consumer PCIe 5.0 SSDs any time soon, but I guess I could be wrong here.
There's no way AMD is going to have something on a new socket ready for Q1 next year, unless you want a repeat of the Ryzen launch, where we got meh motherboards, buggy UEFIs and AGESAs and everything else that wasn't so great about that platform. I'd rather AMD take their time and put out a near problem-free platform later next year.

As for DDR5, I hope you have very deep pockets if you want something better than 4800MHz modules.

AMD can't do 5nm chips in Q1 2022.
Source? I mean, sorry, but you really have to back up claims like this.
You and I can't possibly know what agreements AMD has with TSMC, so maybe stick with things like "highly unlikely" if that's your opinion in this matter, instead of making unsubstantiated claims.

@Valantar - Personally, I feel the interview answers are clumsy, almost a bit unprofessional, but I guess it's possible that it's meant to be some kind of business strategy to show that Intel is back and bolder than ever. Either way, some of it rubbed me the wrong way, as it almost comes across like no one is good enough to compete with Intel and no one will ever be able to properly compete with Intel, as they have the largest market share, and if its partners can't understand that, then they should get a good talking-to to make sure they understand.
 
Last edited:
Joined
Jul 16, 2014
Messages
8,115 (2.29/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
There is nothing better, or funnier, than a diatribe by a shill that doesn't name sources for its imaginary facts. That doesn't mean they aren't true; I mean, embellishment is a matter of opinion. All this drama from Intel is telling AMD to put up or shut up in a roundabout way: "hey, we got better clocks, nyah nyah". Making a mountain out of production delays at the foundry doesn't mean it's Intel's or AMD's fault unless there is more than a minor correction needed. Foundry delays can't always be controlled; just ask the 10nm crowd. :p
 
Last edited:

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Eh? How is PCIe 4.0 holding anything back? Intel's first stab at PCIe 5.0 is only for the x16 slot, unless a board maker turns that into an x8 slot and shares the remaining bandwidth with something else. Also, I doubt we'll see consumer PCIe 5.0 SSDs any time soon, but I guess I could be wrong here.
There's no way AMD is going to have something on a new socket ready for Q1 next year, unless you want a repeat of the Ryzen launch, where we got meh motherboards, buggy UEFIs and AGESAs and everything else that wasn't so great about that platform. I'd rather AMD take their time and put out a near problem-free platform later next year.

As for DDR5, I hope you have very deep pockets if you want something better than 4800MHz modules.


Source? I mean, sorry, but you really have to back up claims like this.
You and I can't possibly know what agreements AMD has with TSMC, so maybe stick with things like "highly unlikely" if that's your opinion in this matter, instead of making unsubstantiated claims.

@Valantar - Personally, I feel the interview answers are clumsy, almost a bit unprofessional, but I guess it's possible that it's meant to be some kind of business strategy to show that Intel is back and bolder than ever. Either way, some of it rubbed me the wrong way, as it almost comes across like no one is good enough to compete with Intel and no one will ever be able to properly compete with Intel, as they have the largest market share, and if its partners can't understand that, then they should get a good talking-to to make sure they understand.

PCIe 4.0 is not holding anything back. Even PCIe 3.0 is not really holding anything back (besides NVMe x4 in synthetic benches).
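For reference, the per-lane math - a quick sketch assuming the standard transfer rates (8/16/32 GT/s for PCIe 3.0/4.0/5.0) and 128b/130b encoding:

Code:
# Usable PCIe bandwidth in GB/s = transfer rate (GT/s) * 128/130 encoding / 8 bits * lanes
def pcie_gbs(rate_gts, lanes):
    return rate_gts * (128 / 130) / 8 * lanes

print(pcie_gbs(8, 4))    # PCIe 3.0 x4  -> ~3.9 GB/s (only the fastest NVMe drives saturate this)
print(pcie_gbs(16, 4))   # PCIe 4.0 x4  -> ~7.9 GB/s
print(pcie_gbs(16, 16))  # PCIe 4.0 x16 -> ~31.5 GB/s (no current GPU comes close to needing it)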

Apple owns the 5nm node; their 2020 and 2021 chips are using it. Apple gets priority, and this is nothing new - there have been tons of news stories about this. Apple has NOT been lacking availability of chips throughout the whole of COVID-19. Why? Because TSMC prioritizes them. Obviously.

Nvidia will use TSMC 5nm in late 2022, and I don't see AMD being able to use it before Nvidia; that's why I wrote late 2022.

It's pretty much a fact that we will see a refresh from AMD next, instead of a full new release. 6nm is ready (it's an optimized 7nm), but 5nm is not. That's why they have been rambling about 3D cache all the time. You will see a refresh of the 5000 series on the 6nm node with added 3D Cache.

Several AMD chips are rumoured to be on the 6nm node for next year: console APUs, GPUs and CPUs. They are spreading out from 7nm to be able to make more chips. AMD will occupy 7nm + 6nm at the same time, instead of only 7nm.

It's not really claims, more like logic. AMD can't be using TSMC's most advanced node; it would ruin their pricing. However, they don't really need the most advanced node either. They used TSMC 7nm for Zen 2 and Zen 3 while Intel was using 14nm. Now Intel is using 10nm+ (aka Intel 7), and TSMC 6nm is going to be fine. If AMD can't compete unless they have a serious node advantage (think 14nm vs 7nm), then their design is flawed anyway and the focus will shift to performance/value instead.

7nm (and 6nm) is way cheaper than 5nm/5nm+, and probably has better yields too, so AMD is in no rush for 5nm yet.

They will probably first need 5nm when Meteor Lake hits in late 2022 on Intel 4 (aka 7nm).
 
Last edited:

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,001 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
Apple owns the 5nm node; their 2020 and 2021 chips are using it.
Nvidia will use TSMC 5nm in late 2022, and I don't see AMD being able to use it before Nvidia; that's why I wrote late 2022.
Sorry, I didn't ask for your opinion once again, I asked for proof.
Do you work at TSMC and are you involved in how nodes are being allocated? Or maybe you're in the pricing department?
If not, then this is just an opinion about a matter that you don't have any insight into, hence why you can't make statements like this.
Rumours are rumours, but in as much as I don't expect to see a 5nm part from AMD when they announce a new CPU, I don't have enough insight into things to claim it as a fact, like you do.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Sorry, I didn't ask for your opinion once again, I asked for proof.
Do you work at TSMC and are you involved in how nodes are being allocated? Or maybe you're in the pricing department?
If not, then this is just an opinion about a matter that you don't have any insight into, hence why you can't make statements like this.
Rumours are rumours, but in as much as I don't expect to see a 5nm part from AMD when they announce a new CPU, I don't have enough insight into things to claim it as a fact, like you do.
Proof of unreleased products? I know the business; I'm just using common sense. You will see in 3-6 months, when AMD puts out CPUs using TSMC 6nm. Think of this post when it happens.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,001 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
Proof of unreleased products? I know the business; I'm just using common sense. You will see in 3-6 months, when AMD puts out CPUs using TSMC 6nm. Think of this post when it happens.
No, proof of your claims that something is impossible due to X.
Again, I'm simply suggesting you change your language a bit about things that you can't possibly know, rather than claiming things to be a fact, when in fact you have no more insight into things than anyone else in this forum and possibly less.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
No, proof of your claim that something is impossible due to X.
Again, I'm simply suggesting you change your language a bit about things you can't possibly know, rather than stating them as fact, when you have no more insight than anyone else on this forum, and possibly less.
You can believe what you want, I'm not forcing you. AMD's next CPU launch will use 6nm TSMC and it will still be on AM4. You heard it here first. No need to answer, just wait; I'd say 3-6 months.
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
It was just a matter of time - AMD did well considering all.

HOWEVER, without Intel being stuck at 14nm _and_ AMD using TSMC 7nm, AMD would never have been able to do what they did. Ryzen 1000 and 2000 on GloFo were nothing special at all, but they delivered an alternative to Intel, which even the most hardcore AMD fanboys had switched to because the FX CPUs were pretty much garbage.

With the 3000 series Ryzen became decent, and with the 5000 series they became good. This is mostly because of going from GloFo 12nm (which is far worse than Intel 14nm in every aspect) to TSMC 7nm.

AMD have beaten Intel and even Nvidia before, but Intel and Nvidia always came back and took the crown, which resulted in AMD lowering prices and going back to the drawing board. Just look at how AMD priced the 5000 series compared to the 1000/2000 and, to some extent, the 3000 series. AMD started milking just like Intel did for ~10 years: they dropped the non-X models and generally priced all the chips way higher than before.

I own 2 Ryzen machines (HTPC + NAS/server) and 1 Intel (gaming rig), so you can stop the fanboy BS. I even own two consoles, so I actually have 4 AMD chips in my home... However, I also have 3 Intel laptops :p So I guess it's 50/50
You rant about AMD in an Intel thread and decry fanboy BS, right.

As for Pat: OT, but go you, bro. I just hope your mouth isn't writing checks your CPUs can't cash.

But it does sound a bit "Poor Volta" to me. I hope this isn't just hype, but reviews please, or GTFO.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The gap when Alder Lake comes out next month.
Do you have access to a review, or have a CPU in hand? Otherwise you're making baseless assumptions. We have no idea how these CPUs will perform. They'll likely be faster, but by how much, and in which scenarios? Without knowing this, speculation is useless.
Yes, DLSS is good and I have enjoyed its benefits for a year now. Especially when I use my 4K/UHD OLED for gaming, it easily allows for 100+ FPS in most games with high settings using quality mode, perfect for my 120 Hz native OLED TV.
Good for you? Most people don't play PC games on their TVs (though it is definitely gaining popularity). And as I said, DLSS is great, and I think smart upscaling tech is a major part of the future of real-time 3D graphics. But closed-off, single-vendor stuff doesn't tend to last.
FSR might be good, it might not. Adoption is low right now, and there are several examples of why open is a bad thing too. A LOT of games with injected FSR deliver WORSE performance and WORSE image quality than not using FSR at all. Right now DLSS is clearly superior. I have tried both in several games, and this is also what tests are concluding.
Poorly executed amateur injections don't say anything about the quality of FSR; they say something about the skill and/or methods of those implementing it. I, at least, expect game developers to be reasonably competent. And I never said DLSS wasn't superior - it tends to win out slightly in quality, framerate, or both. But that ultimately doesn't matter if FSR is more broadly adopted due to ease of implementation or other factors. We'll see how this plays out.
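To put rough numbers on why either upscaler boosts framerates so much: "Quality" mode in both DLSS and FSR renders at roughly 67% scale per axis before upscaling to native output. A minimal Python sketch of the pixel math (the mode scale factors are the commonly cited approximate values, not exact spec):

```python
# Pixel math behind upscaler framerate gains: rendering at a fraction of
# native resolution per axis shades only scale^2 of the native pixel
# count before the upscaler reconstructs the rest. Scale factors are the
# commonly cited approximate DLSS/FSR mode values.

def rendered_fraction(scale: float) -> float:
    """Fraction of native pixels actually shaded at a given per-axis scale."""
    return scale * scale

native_w, native_h = 3840, 2160  # 4K/UHD output
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for mode, scale in modes.items():
    print(f"{mode:11s}: ~{int(native_w * scale)}x{int(native_h * scale)} "
          f"internal = {rendered_fraction(scale):.0%} of native pixels")

# Quality mode shades ~44% of a 4K frame, which is where the jump from
# ~60 FPS native to "100+ easily" plausibly comes from.
```

How well the upscaler hides that rendering deficit is exactly the image-quality argument above.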
I'm simply saying that no sane person is buying top-end graphics cards to game at 1080p, which looks horrible, and you will be CPU-bound in most cases anyway.
Yes, but who has said that here? Nobody. So, can you please stop with the straw man arguments?
3080 Ti and 6900XT are essentially even at 1440p, which is the lowest resolution I'm looking at today. I would never accept or recommend 1080p in 2021, unless you are building a budget rig maybe. 1440p IPS at 144 Hz or more is so cheap right now that 1080p is dead to me.
There are still plenty of people buying reasonably high end GPUs for 1080p gaming - those with 360Hz displays, for example. That's also a small group, but I'd wager it's bigger than those with 2160p120/144 displays. I don't think that's reasonable either, but then my tastes aren't universal. I would assume yours aren't either - that's generally not how tastes work, after all.
AMD will face serious competition in all markets in the coming years, that's all I'm saying. And without using the best possible nodes at TSMC, I can't really see them being able to compete on performance; however, they can (and should) compete on value.
They're going to have access to the 5nm node when they have products ready for it - fab orders are placed well in advance, after all, and products are taped out long before they are put into mass production.
Radeon 6600 XT and 6700 XT have bad value/perf compared to the RTX 3060 and 3060 Ti/3070. The Nvidia cards sold waaaaay better.
Lolwut? 6600 XT outperforms the 3060 in pretty much everything, and at MSRP is very evenly matched with the Ti (it's a bit cheaper and a bit weaker). The 6700 XT is a bit of an odd duck in terms of value, that's true - the 6800 makes it look bad, and IMO it should have been a $399 card. But I don't control that, so ... *shrug*

As for who sold better - so what? Nvidia has a huge market share and mindshare advantage. Given their rough 80/20 market split, it would be shocking if the 3060/3060 Ti didn't outsell similarly priced AMD GPUs by ~4x. That's what the market share tells us to expect, after all.
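For the back-of-envelope version of both points, here's a minimal Python sketch. The launch MSRPs are the real list prices; the relative-performance index is a rough placeholder assumption (normalize against whatever review data you trust), so treat the output as illustrative only:

```python
# Illustrative value/perf and sales-ratio math for the argument above.
# MSRPs are launch list prices in USD; the performance index is a rough
# placeholder assumption, NOT review data.

cards = {
    # name: (launch MSRP, assumed relative performance index)
    "RTX 3060":    (329, 100),
    "RX 6600 XT":  (379, 113),
    "RTX 3060 Ti": (399, 124),
    "RX 6700 XT":  (479, 140),
}

for name, (msrp, perf) in cards.items():
    print(f"{name:11s}: {perf / msrp * 100:.1f} perf points per $100")

# The market-share point: with a rough 80/20 dGPU split, equal appeal at
# equal price would still have Nvidia outselling AMD by about 4x.
nvidia_share, amd_share = 0.80, 0.20
print(f"Expected sales ratio from share alone: ~{nvidia_share / amd_share:.0f}x")
```

On those placeholder numbers the 6600 XT and 3060 Ti land within a few percent of each other in perf per dollar, which is the "very evenly matched" point.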
I can't wait to see how Intel's dGPUs will perform; if they go with aggressive pricing (which they should), they could easily gobble up some market share in the low- to mid-end market - AMD's prime segment - and considering Nvidia will soon hit 85% dGPU market share, this could turn out messy. AMD might be at sub-10% market share in the dGPU segment in 1-2 years unless they fix their pricing and availability issues. Not many are going to pay Nvidia prices for AMD GPUs. That's just reality.
Well, that's the mindshare thing I mentioned. AMD is struggling with an undeserved bad image for driver issues and a lack of features (their current drivers are no more buggy than Nvidia's, and they generally have a wider feature set than Nvidia, though the 5700 XT did have some serious issues that screwed up a years-long track record of well-maintained AMD drivers), something that takes a very long time to rectify, especially when you're a much smaller player in the market.

Going for value/price is one way forward, but also one that risks reinforcing that stigma among users ("Nvidia is best, AMD is cheap"), which is ultimately counterproductive. They're clearly playing a long game to position themselves as equal to Nvidia, which they largely are, despite their much smaller size and budgets. IMO, that's the only way towards growing or even maintaining their share in a competitive market. Positioning yourself as a cheaper also-ran is not a recipe for success.

How Intel will play into this I have absolutely no idea about - that depends on how their GPUs perform, how bad their drivers and support are, and how much they're willing to put into marketing. I'm guessing the start will be a combined bombardment of ads + terrible drivers, but how it develops from there is anyone's guess.

You can believe what you want, I'm not forcing you. AMD's next CPU launch will use 6nm TSMC and it will still be on AM4. You heard it here first. No need to answer, just wait; I'd say 3-6 months.
AMD's next consumer CPUs will be Zen3 with V-cache. For AM4. This is not news. AMD confirmed this months ago.
 