
Intel's Pat Gelsinger Exclaims "Intel is Back" AMD is "Over"

las

Joined
Nov 14, 2012
Messages
1,141 (0.31/day)
System Name Obsolete
Processor i9-9900K @ 5.2 GHz
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) MSI 3080 Ventus 3X @ 2+ GHz
Storage WD Black SN850 1TB + 32TB NAS (RAID-Z2)
Display(s) 27" 1440p/240Hz/IPS + 65" 4K/120Hz/OLED
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX w/ Upgraded Op-Amps
Power Supply Corsair RM1000x
Mouse Logitech G Pro Wireless
Keyboard Corsair K60 Pro (MX Low Profile Speed)
Software Windows 10 Pro x64
Do you have access to a review, or have a CPU in hand? Otherwise you're making baseless assumptions. We have no idea how these CPUs will perform. They'll likely be faster, but by how much, and in which scenarios? Without knowing this, speculation is useless.

Good for you? Most people don't play PC games on their TVs (though it is definitely gaining popularity). And as I said, DLSS is great, and I think smart upscaling tech is a major part of the future of real-time 3D graphics. But closed-off, single-vendor stuff doesn't tend to last.

Poorly executed amateur injections do not say anything about the quality of FSR; they say something about the skill and/or methods of those implementing it. I, at least, expect game developers to be reasonably competent. And I never said DLSS wasn't superior: it tends to win out slightly in quality, framerate, or both. But that ultimately doesn't matter if FSR is more broadly adopted due to ease of implementation or other factors. We'll see how this plays out.

Yes, but who has said that here? Nobody. So, can you please stop with the straw man arguments?

There are still plenty of people buying reasonably high end GPUs for 1080p gaming - those with 360Hz displays, for example. That's also a small group, but I'd wager it's bigger than those with 2160p120/144 displays. I don't think that's reasonable either, but then my tastes aren't universal. I would assume yours aren't either - that's generally not how tastes work, after all.

They're going to have access to the 5nm node when they have products ready for it - fab orders are placed well in advance, after all, and products are taped out long before they are put into mass production.

Lolwut? The 6600 XT outperforms the 3060 in pretty much everything, and at MSRP it's very evenly matched with the Ti (a bit cheaper and a bit weaker). The 6700 XT is a bit of an odd duck in terms of value, that's true; the 6800 makes it look bad, and IMO it should have been a $399 card. But I don't control that, so ... *shrug*

As for who sold better: so what? Nvidia has a huge market-share and mindshare advantage. Given their rough 80/20 market split, it would be shocking if the 3060/3060 Ti didn't outsell similarly priced AMD GPUs by ~4x. That's what the market share tells us to expect, after all.
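As a back-of-the-envelope sketch of that expectation (illustrative only; the 80/20 split is the rough figure cited above, not exact sales data):

```python
# If the discrete-GPU market splits roughly 80/20 Nvidia/AMD, two equally
# attractive products would be expected to sell at about that same ratio.
nvidia_share, amd_share = 0.80, 0.20  # rough split from the post, not exact figures
expected_ratio = nvidia_share / amd_share
print(f"Expected sales ratio at parity: ~{expected_ratio:.0f}x")  # ~4x
```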

Well, that's the mindshare thing I mentioned. AMD is struggling with an undeserved bad image for driver issues and a lack of features (their current drivers are no more buggy than Nvidia's, and they generally have a wider featureset than Nvidia, though the 5700 XT did have some serious issues that marred AMD's years-long track record of well-maintained drivers), something that takes a very long time to rectify, especially when you're a much smaller player in the market. Going for value/price is one way forward, but also one that risks reinforcing this stigma among users ("Nvidia is best, AMD is cheap"), which is ultimately counterproductive.

They're clearly playing a long game to position themselves as Nvidia's equal, which they largely are, despite their much smaller size and budgets. IMO, that's the only way to grow or even maintain their share in a competitive market. Positioning yourself as a cheaper also-ran is not a recipe for success. How Intel will play into this I have absolutely no idea; that depends on how their GPUs perform, how bad their drivers and support are, and how much they're willing to put into marketing. I'm guessing the start of it will be a combined bombardment of ads + terrible drivers, but how it develops from there is anyone's guess.


AMD's next consumer CPUs will be Zen3 with V-cache. For AM4. This is not news. AMD confirmed this months ago.

If you buy a 6900 XT or 3080 Ti/3090 for 1080p gaming, even though you are running 360 Hz, then you are clueless about how gaming PCs work. You will be CPU (and somewhat RAM) bound anyway, and a card like the 6700 XT/3060 Ti will deliver pretty much the same performance in the esports titles that people buying these monitors play anyway. Going 1080p/360Hz to max out AAA games is downright stupid, because the ~80% pixel increase going to 1440p will deliver far more impressive visuals while still delivering very high fps.
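For reference, the pixel-count math behind that "80%" figure (assuming the standard 16:9 resolutions):

```python
# Pixel counts for the two resolutions discussed (standard 16:9 assumed).
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_1440p = 2560 * 1440  # 3,686,400
increase = pixels_1440p / pixels_1080p - 1
print(f"1440p renders {increase:.0%} more pixels than 1080p")  # ~78%
```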

Most "pro gamers" are NOT using high-end GPUs for 1080p and below. Games like CS:GO require pretty much NOTHING from the GPU. Streamers generally use higher-end cards, but they tend to run 1440p and above too.

I'm running 1440p at 240 Hz and I'm CPU bound in most cases too, unless I'm maxing out demanding AAA games. Even in Warzone I'm pretty much at 200+ fps at all times, with GPU usage dropping. BF2042 later today should probably require more; I still expect 120 fps minimum, but I will aim for 200+ as usual, using custom settings while retaining 95% of Ultra IQ. The game has DLSS though, so that will be tested.

There have been plenty of Alder Lake leaks showing big leaps in several benchmarks. I don't deny them, like some people do. That said, I'm not interested in buying Alder Lake at all (maybe for a laptop, not for a desktop).
 
Last edited:
Joined
Feb 23, 2019
Messages
4,551 (3.29/day)
Location
Poland
Processor Ryzen 7 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling BeQuiet Dark Rock 4
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) Acer XB273GP
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Logitech G400 | SteelSeries Rival 300
Keyboard MK Typist (Kailh Box White)
Pat needs a wizard robe, and then he can cosplay as a D&D warlock.

As for AMD being over, I will wait until we see benchmarks and growth in Intel's market share.
 
Joined
Jul 7, 2019
Messages
483 (0.39/day)
I don't know; every time someone in the tech world smack-talks the competition before releasing a product, with finalized (pre-independent-review) benchmarks against current rival equivalents, they and their product end up being an embarrassment. Raja and his "Poor Volta" is one of the more memorable examples. I like the current AMD, where they don't brag until they have the final (pre-independent) benchmarks ready to throw up on the screen, with bars that push out to two screens or more, compared against their nearest competitors. That includes the fact that some of those graphs admit they lose in a scenario or two.

At the same time, Intel does have the money to trash talk, even if it's just to hype investors. They also have their own fabs, as blessed/cursed as that can be, and can maintain stock better unless/until AMD can buy out the majority of TSMC's node production similar to how Apple can and does. And naturally, if it doesn't quite work, they can just ramp up the power reqs again until it's competitive in everything except power efficiency.
 

las

Pat needs a wizard robe, and then he can cosplay as a D&D warlock.

As for AMD being over, I will wait until we see benchmarks and growth in Intel's market share.

AMD won't be over; this has happened many times before. AMD will just lower prices and focus on better performance for the money. 90% of the performance at 75% of the price will still sell. It's not like AMD has been far ahead of Intel in terms of performance anyway, considering they had a huge node advantage with 7nm vs 14nm. Mostly it's the power draw. Performance-wise in gaming, emulation, and regular workloads there's not really much difference between Ryzen 5000 and even an Intel 8700K (especially when overclocked to 5 GHz or more, like most will do); some tests Ryzen 5000 wins, others the 8700K wins. Performance per watt is the biggest difference, if you care about that for desktop PCs. 150 watts or 200 watts, big difference? Personally, I could not care less. GPUs today can peak at 500-600 watts and some people point out a 50-watt increase on a CPU? Come on. Even a cheap 240mm AIO or dual-tower air cooler will keep an Intel chip at 5 GHz without any problem. A lot of people are even pushing 5.2-5.4 GHz for 24/7 usage.
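The "90% performance at 75% the price" line works out like this (purely illustrative numbers from the post, not benchmarks):

```python
# Illustrative: relative performance per dollar for a part offering 90% of the
# performance at 75% of the price of its competitor (figures from the post).
rel_perf, rel_price = 0.90, 0.75
value_ratio = rel_perf / rel_price
print(f"Performance per dollar vs. competitor: {value_ratio:.1f}x")  # 1.2x
```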

My 9900K consumes something like 100-150 watts in gaming, locked at 5.2 GHz. I could not care less if it can hit 225+ watts in synthetic burn-ins with AVX2.

It's not exactly like a Ryzen 5800X/5900X/5950X running at 4.7 or even 4.8 GHz is "cool" in comparison. They still need decent cooling.
 
Last edited:
Joined
Mar 16, 2017
Messages
1,310 (0.63/day)
Location
Tanagra
This is also the same company that provided chips to Apple for years, but now that that partnership is over, they are actively campaigning against Apple by promoting touchscreens and such.

The leaks of Alder Lake have me skeptical. The little cores are only found on expensive SKUs, suggesting they are a core-war gimmick; Windows 11 will likely be needed to get the best results; and more than anything, I would like to see reviews showing that runaway power consumption is a thing of the past. Intel has a lot to prove, IMO. When they were dominant, they didn't talk about "the competition," so the fact that they are having some hard feelings suggests this will be an RX 580 moment: designs pushed to the limit to get some performance credibility back. I guess we'll find out.
 

las

This is also the same company that provided chips to Apple for years, but now that that partnership is over, they are actively campaigning against Apple by promoting touchscreens and such.

The leaks of Alder Lake have me skeptical. The little cores are only found on expensive SKUs, suggesting they are a core-war gimmick; Windows 11 will likely be needed to get the best results; and more than anything, I would like to see reviews showing that runaway power consumption is a thing of the past. Intel has a lot to prove, IMO. When they were dominant, they didn't talk about "the competition," so the fact that they are having some hard feelings suggests this will be an RX 580 moment: designs pushed to the limit to get some performance credibility back. I guess we'll find out.

Apple's M1 has very little to do with Intel chips, though. ARM vs x86.
It's not like Apple uses ARM chips in their higher-end stuff. However, ARM in the MacBook Air makes perfect sense (and in some of the other lower-end stuff too).
ARM is cheap and "fast enough" for most things while consuming very little power. This way Apple can spend the money on the SCREEN, which matters more for most users, and still sell the MacBook Air at a pretty low price. Perfect for students, for example, with 15-18 hours of battery life. You can use it for 2 days without charging...
ARM chips are heavily dependent on optimization and careful programming to yield their best performance, and that is why it works so well for Apple, since their ecosystem is more closed and devs generally know which models they are coding for.
 
Last edited:
Joined
Feb 3, 2017
Messages
3,320 (1.56/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Clickbait.

These two are not equal statements:
AMD is "Over"
"this period of time when people could say, "Hey, [AMD] is leading," that's over."

Let's wait and see what kind of CPUs they will deliver under 200€ and VGAs between 150-300€.
Looking at the current generation, in the sub-200€ CPU market Intel is the only player in town :)
And the 11400F is nothing to scoff at.

That is a way oversimplified take. The IPC improvements from Zen/Zen+ to Zen2 are very significant. Sure, some of those are enabled by the increased density of the 7nm node allowing for more transistors and denser circuits, but putting that down to "mostly [the fab node]" is oversimplified to the absurd. Zen2 was far more than Zen/Zen+ shrunk down to 7nm. Zen2 beat Intel outright in IPC, and improved upon Zen+ by ~15%. Zen2 was a one-two punch of a highly efficient node that clocked better than its predecessor and a very noticeable architectural improvement.
While you are correct, "oversimplified to the absurd" is also quite an exaggeration. Rocket Lake is probably a pretty good example of the "mostly [the fab node]" problem: it sits smack in the middle of Zen2 and Zen3 when it comes to IPC, let down primarily by power consumption and a smaller cache. The first is directly related to the fab node used, and the second is usually related to the available transistor budget, which is itself usually tied to the fab node.
 
Last edited:
Joined
Nov 11, 2004
Messages
12,779 (1.94/day)
Location
Sweden
System Name Overlord Mk MXX
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte X570 Aorus Master
Cooling Noctua NH-D15 SE AM4
Memory 32GB Viper Steel 3600 DDR4 @ 3800MHz 16-19-16-19-36
Video Card(s) Gigabyte RTX 2080 Gaming OC 8G
Storage 1TB WD Black NVMe (2018), 2TB Viper VPN100, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Focus 2 Solid
Audio Device(s) Corsair Virtuoso SE
Power Supply Seasonic Focus GX 750W
Mouse Logitech G502 Lightspeed
Keyboard Svive Triton Pro
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/33u9si
Clickbait.

These two are not equal statements:
Not really; in a different part of the interview he says that the good job AMD has been doing is over once Alder Lake is out. Maybe I misinterpreted that, but it sounds like he's saying AMD no longer has a chance against Intel, i.e. they're out.
 
Joined
Feb 3, 2017
The real question is, would Intel have made the advances they have recently if AMD had not been eating into their market share? No. If Intel had no real competition, we would still be paying exorbitant prices for sub-par CPUs and APUs with rubbish embedded graphics. That alone makes me want to stick with AMD for all future purchases: they pushed Intel to get off their butts and do better instead of just ripping off PC buyers, like they had been for the last decade. AMD deserves the sales; Intel doesn't.
You mean Intel's game plan was to stay on 14nm and Skylake for 6 years? I doubt it.
 
Joined
Mar 2, 2011
Messages
1,226 (0.29/day)
Location
Omaha, NE
System Name Graphics Card Free...
Processor Ryzen 5 5600G
Motherboard MSI B450 Gaming Plus MAX Wifi
Cooling Cryorig M9a w/ BeQuiet! PureWings 2 ~ 92mm
Memory Corsair Dominator Platinum DDR4 3200 ~ 16GB(2x8GB)
Storage Samsung EVO 870 SSD - 1TB
Display(s) AOC 24G2
Case Cardboard...
Power Supply eVGA SuperNova 550w G3
Mouse Logitech t400 Zone Touch Mouse
Keyboard IBM Model "M" Keyboard
Software Manjaro ~ KDE Plasma
Benchmark Scores She's a Runner!
Back when Conroe was released no silly comments were necessary. I was a member of the largest AMD site on the web when the e6600 hit the streets.

The website was a ghost town literally overnight...

From my perspective... Intel has come up with nothing more than incremental improvements since first gen, milking the public (its cash cow) with every new generation, and people keep coming back for more. I simply can't understand it myself... other than burning through the world's resources for nothing, what does it accomplish? What are 8-10% IPC improvements netting you at the end of the day?

From my perspective...this extreme acceleration of production is destroying our children's future and it's these companies who are doing this who are crying about climate change the loudest. It makes zero sense.

Unless there is an alternate explanation...and I personally believe there is. My thesis would start with taxation and control. What about yours?

Oh... veered off topic. Apologies.

Where is Conroe 2 Intel? Where? When?

Best,

Liquid Cool

P.S. Phanbuey. If I was running China...NOW is the best time to attack Taiwan. They won't get a better opportunity after the mid-term elections. They have a window here and I believe someone is going to mis-step. Israel-Iran seems plausible as well. Not to mention Russia and....ok, sorry, going off topic. I'll stop here.
 
Last edited:
Joined
Jul 9, 2015
Messages
3,334 (1.23/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores The 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Jul 16, 2014
Messages
7,821 (2.55/day)
Location
SE Michigan
System Name Dumbass
Processor AMD FX-9370
Motherboard ASUS SABERTOOTH 990FX R2.0 +SB950
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 16gb DDR3 2400
Video Card(s) GreenTeam 1080 Gaming X 8GB
Storage Samsung EVO 500gb & 1Tb, 2tb HDD
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Logitech G604
Keyboard Logitech G910 Orion Spark
Software windows 10
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
Not really; in a different part of the interview he says that the good job AMD has been doing is over once Alder Lake is out. Maybe I misinterpreted that, but it sounds like he's saying AMD no longer has a chance against Intel, i.e. they're out.
That's how I read it. Trying to show confidence in his products, while typically doing an Intel on AMD (the PR-inflated benchmarks).
 
Joined
Sep 4, 2019
Messages
121 (0.10/day)
I never said the 6000 series was bad, only that the lower end is bad and priced too high.

The 6600 XT is the only current-gen card that is regularly available for less than $500. Not to mention it unquestionably outperforms the 3060, which is not only less available but also street-prices for more. Don't buy into the media BS. Thousands of gamers are buying and enjoying their 6600 XTs.
 
Joined
Apr 16, 2019
Messages
632 (0.47/day)
Gelsinger doesn't bullshit, so this likely means that what I've been saying for a while now (that AMD's good months are coming to an end) is about to come true.
 
Joined
Mar 16, 2017
What is the issue with that???
It just reeks of hard feelings. A CPU company is bashing a "lifestyle" company using justifications not related to the CPU company's own product merits, but rather platform merits. Intel is making cases that MS should be making, or rather has already made. Up until Apple dropped Intel, Macs not having touchscreens was not an issue to Intel, but now that tired comparison emerges yet again. I still think there's a bigger collaboration between Intel and MS on Windows 11, to the point that Intel is even making the same arguments for Windows over macOS that MS has. Intel needs MS, and needs Windows to be interesting again. Windows 10 isn't the last version of Windows after all, and what a surprise that Windows 11 launches just before Intel's first big.LITTLE design, one that needs a new scheduler.
 
Joined
Jun 17, 2018
Messages
38 (0.02/day)
System Name RYZEN
Processor 5900X @ 4.7Ghz EK Evo Supremacy RBG
Motherboard Gigabyte Auros Elite X570
Cooling Aplhacool Monsta 360
Memory Gskill Royals RGB @ 3800Mhz 16GB
Video Card(s) RX 6800XT @ 2500 mhz core 2100mhz mem. EK Vega WaterBlock
Display(s) Acer 49 144hz Curve 1080P Monitor
Case Thermaltake G21 Dual Tempered Glass SPCC
Audio Device(s) steel series siberia elite prism
Power Supply EVGA G2 1600Watts Gold
Mouse tt sports level 10 mouse
Keyboard Logitech 710+ Mechanical keyboard
Software Windows 10
It was just a matter of time; AMD did well, considering everything.

HOWEVER, without Intel being stuck at 14nm _and_ AMD using TSMC 7nm, AMD would never have been able to do what they did. Ryzen 1000 and 2000 on GloFo were nothing special at all, but they delivered an alternative to Intel that even the most hardcore AMD fanboys flocked to, because the FX CPUs were pretty much garbage.

With the 3000 series Ryzen became decent; with the 5000 series it became good. This is mostly down to going from GloFo 12nm (which is far worse than Intel 14nm in every aspect) to TSMC 7nm.

AMD has beaten Intel and even Nvidia before, but Intel and Nvidia always came back and took the crown, which resulted in AMD lowering prices and going back to the drawing board. Just look at how AMD priced the 5000 series compared to the 1000/2000 and somewhat the 3000 series. AMD started milking just like Intel did for ~10 years: they dropped the non-X models and generally priced all the chips way higher than before.

I own 2 Ryzen machines (HTPC + NAS/server) and 1 Intel (gaming rig), so you can stop the fanboy BS. I even own two consoles, so I actually have 4 AMD chips in my home... However, I also have 3 Intel laptops :p So I guess it's 50/50.
You fanboy in your post all day long, but sleep well believing your bias isn't showing.
 
Joined
May 2, 2017
Messages
7,761 (3.79/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
If you buy a 6900 XT or 3080 Ti/3090 for 1080p gaming, even though you are running 360 Hz, then you are clueless about how gaming PCs work. You will be CPU (and somewhat RAM) bound anyway, and a card like the 6700 XT/3060 Ti will deliver pretty much the same performance in the esports titles that people buying these monitors play anyway. Going 1080p/360Hz to max out AAA games is downright stupid, because the ~80% pixel increase going to 1440p will deliver far more impressive visuals while still delivering very high fps.
In case you missed it, I wasn't arguing for 1080p360 with a 6900 XT or 3080 Ti being sensible, I just said it happens.
Most "pro gamers" are NOT using high-end GPUs for 1080p and below. Games like CS:GO require pretty much NOTHING from the GPU. Streamers generally use higher-end cards, but they tend to run 1440p and above too.
No "pro" gamers play below 1080p these days. And I'm fully aware of how low the GPU requirements for CS:GO and the like are. Again, I just said that it happens. I'm stating facts, not voicing opinions about those facts.
I'm running 1440p at 240 Hz and I'm CPU bound in most cases too, unless I'm maxing out demanding AAA games. Even in Warzone I'm pretty much at 200+ fps at all times, with GPU usage dropping. BF2042 later today should probably require more; I still expect 120 fps minimum, but I will aim for 200+ as usual, using custom settings while retaining 95% of Ultra IQ. The game has DLSS though, so that will be tested.
Yes. And? Once again, with feeling: I never said this was smart, I just said people do it.

You seem to be under the impression that I was arguing that doing these things is smart, sensible, something like that. Please rid yourself of that impression, as I haven't said anything of the sort. Whether or not these decisions are sensible (I would argue they typically aren't; most high-end GPU purchases are driven by technofetishism and an inability to make rational buying decisions, or at best make sense only under some very specific conditions for that person), this happens. I haven't commented beyond that. I'm just saying that a) these GPUs sell well, and b) there are very few 2160p120+ displays out there, while 1440p144+ displays have been taking off in the past year or two, meaning most people buying high-end GPUs are likely using one of those. And as you say, that leaves you CPU bound. Whatever the rationale may be, an absolute consideration of performance/$ is not the basis for these decisions, and whether the GPUs only really make sense at 2160p is rather immaterial given the reality that 2160p gaming is rare.
There have been plenty of Alder Lake leaks showing big leaps in several benchmarks. I don't deny them, like some people do. That said, I'm not interested in buying Alder Lake at all (maybe for a laptop, not for a desktop).
Leaks are leaks. They might be accurate, or they might be complete fabrications. We have had leaks promising major gains, and others showing lacklustre results (which I assume you've missed, since you haven't mentioned them and they go against what you're saying). As such, I take the sensible approach of reserving judgement until we have actually reliable benchmarks to go off of. For now, we have no way of knowing anything concrete.
 
Joined
Dec 26, 2006
Messages
2,144 (0.37/day)
Location
Northern Ontario Canada
System Name Just another PC
Processor Ryzen 1700
Motherboard Gigabyte GA-AX370-K3
Cooling Noctua NH-C12P SE14
Memory DDR4-2133 2x16GB
Video Card(s) Asus Tuf - AMD RX 6800
Storage 960 EVO 500GB OS, 1TB SSD Steam & 2TB WD Blue SSD Storage
Display(s) LG 27UL550-W
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos 8200
Keyboard Corsair Strafe with browns
Software W10 Pro x64
Benchmark Scores Starts when push power button!!
Indeed. At the same time, AMD has to prove that they can keep up with Intel.
When it comes to chip production they can’t. Unfortunately.

Gelsinger continues with "We have 80 percent market share."

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when you have only one 'real' competitor with no fabs.

If there were a dozen capable companies making x86 for the mass market and you had 80% market share, that would be something to brag about.
 
Joined
Jun 22, 2014
Messages
408 (0.13/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling Deepcool Castle 280X / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 3090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) LG 34GK950F / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS Ultra Bookshelf , dual 3000 Micro / Marantz AVR, SVS Prime Pinnacle, dual SB3000
Power Supply Silverstone SX1000 / Silverstone SX800
VR HMD Quest 2 + Airlink, Elite Strap w/ VR Cover face & head strap pads
Joined
Nov 11, 2004
When it comes to chip production they can’t. Unfortunately.
Well, they don't have fabs any more so...
Gelsinger continues with "We have 80 percent market share."

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when you have only one 'real' competitor with no fabs.

If there were a dozen capable companies making x86 for the mass market and you had 80% market share, that would be something to brag about.
Intel obviously doesn't want any more competition than they already have; to them, it seems, competition is a bad thing.
If it weren't, they wouldn't be doing so many scummy things in the "channel". I mean, we know for a fact that they have been, and most likely still are, or they wouldn't have been fined for it.
 
Last edited:
Joined
Feb 11, 2009
Messages
4,294 (0.85/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Ctrl+F "leadership", there we go, that's our one-note Pat back at it again.
 
Joined
Apr 24, 2008
Messages
1,744 (0.33/day)
Processor RyZen R9 3950X
Motherboard ASRock X570 Taichi
Cooling Coolermaster Master Liquid ML240L RGB
Memory 64GB DDR4 3200 (4x16GB)
Video Card(s) RTX 3050
Storage Samsung 2TB SSD
Display(s) Asus VE276Q, VE278Q and VK278Q triple 27” 1920x1080
Case Zulman MS800
Audio Device(s) On Board
Power Supply Seasonic 650W
VR HMD Oculus Rift, Oculus Quest V1, Oculus Quest 2
Software Windows 11 64bit
Big talk indeed...

If true, fine, but if the market dominance and performance comes at “drag them across concrete” prices then I’m not too sure I care.
 
Joined
Apr 16, 2019
When it comes to chip production they can’t. Unfortunately.

Gelsinger continues with "We have 80 percent market share.

Maybe give out a few more x86 licenses and see how long that lasts.

Not hard when have only 1’real’ competitor with no fabs.

if there was a dozen capable companies making x86 to mass market and you had 80% market share that would be something to brag about.
Oh, these poor, poor team red unfortunates with no fabs! One might almost think they never had them, rather than having gotten rid of them because that's what they deemed better at the time.
And while not quite a dozen, there were several more companies making x86 chips back in the 90s, and Intel's share was still just as high...
 
Joined
Oct 23, 2020
Messages
671 (0.86/day)
Location
Austria
System Name nope
Processor I3 10100F
Motherboard ATM Gigabyte h410
Cooling Arctic 12 passive
Memory ATM Gskill 1x 8GB NT Series (No Heatspreader bling bling garbage, just Black DIMMS)
Video Card(s) Sapphire HD7770 and EVGA GTX 470 and Zotac GTX 960
Storage 120GB OS SSD, 240GB M2 Sata, 240GB M2 NVME, 300GB HDD, 500GB HDD
Display(s) Nec EA 241 WM
Case Coolermaster whatever
Audio Device(s) Onkyo on TV and Mi Bluetooth on Screen
Power Supply Super Flower Leadx 550W
Mouse Steelseries Rival Fnatic
Keyboard Logitech K270 Wireless
Software Deepin, BSD and 10 LTSC
Sure, for consumers AMD prices too high; a consumer can't buy anything under 150€ from this company, okay, apart from a 3000G for 100€ and a 1200 for 128€.

After their high rise they will kiss the floor again.

AMD lived for a long time off budget and mid-range consumers only, and since the 3000 series / Renoir they have treated them worse than any company has in the past.
 
Last edited:
Joined
Jan 29, 2021
Messages
1,143 (1.68/day)
Location
Alaska USA
If Intel can't hit back at AMD after a year, I don't know what to say about them, to be honest. At least based on rumours and leaks so far, everything has been pointing to a good improvement from a single-core perspective. But the limited number of performance cores will still cost them some performance, in my opinion, at least until official reviews confirm whether this is really the case. I believe some of Alder Lake's advantage comes from the DDR5 it launches with, where a benchmark/app is bandwidth-sensitive.

I feel Pat is missing the pricing factor that would help win back market share and stop the bleeding. There are a few leaks on pricing, but some of them look quite sketchy, so it's best to wait for Intel's MSRPs to determine whether they will be able to win back more users. There are no bad products, only bad prices, and the high cost of entry for Alder Lake may actually deter anyone other than hardcore enthusiasts from adopting it early.
B660 boards and locked CPUs will be where it's at.
 