
GeForce RTX 40 SUPER Custom Model Pricing Leaks Out

Joined
May 11, 2018
Messages
1,015 (0.46/day)
In a few years, Intel will probably be back in the lead with 20A/18A and open for business. I don't think TSMC can retain their lead much longer. Maybe AMD can use Intel for their chips then :laugh:

This doesn't make much sense. Right now everyone is paying for TSMC's very expensive process upgrades, upgrades whose costs are allegedly rising astronomically with every new node. So how can anyone just leapfrog to a cutting-edge process without working through the intermediate steps?

I believe it would take a severe industry crisis to end TSMC's supremacy. But that's not so unimaginable: the reasons could be political (China wouldn't even have to blockade or invade Taiwan, just stop exporting crucial materials and components), natural disasters, or simply the market's response to excessive cost.
 
Joined
Oct 6, 2021
Messages
1,567 (1.59/day)
This is exactly why. Making dGPUs just eats away at AMD's TSMC output, and they earn more from CPUs and APUs.

GPU development and production are still important for AMD, but they don't need high-end offerings for what they do: iGPUs, APUs (including console APUs), etc. The halo segment simply isn't important to them. They said as much officially when they positioned the 7900XTX as a 4080 counter, not a 4090 counter. They left the enthusiast market.

High-end dGPUs are a niche market for AMD and probably always will be. Name one high-end AMD offering that sold well in recent years.
AMD has always sold mostly low- to mid-range GPUs. Research and development is very costly, and high-end GPUs make little sense for AMD. This is why they want to go MCM: so they can scale their offerings much more easily, without ramping up costs like crazy at the high end.


I bet AMD's goal with Radeon 8000 is to stay on 5nm for GPUs while moving to 3nm for CPUs (Zen 5) as fast as possible, once Apple is done with it. Cheap GPUs with good-enough raster performance are what's needed to drive AMD's GPU market share forward again, along with FSR and AFMF improvements.

AMD can't afford to go 3nm too soon; it's too costly. Nvidia will be able to. Going from 4/5nm to 3nm will also mean price increases for Nvidia, probably around 50% more per wafer.

However, Nvidia rules the gaming GPU market while not even focusing on it. They have full focus on AI and enterprise, and this won't change for years. They even scaled back gaming GPU production to make AI/enterprise chips. I don't think we'll see a flood of 4000 SUPER cards at release because of this.


TSMC increased production costs a lot over the last few years, on top of inflation. It's not only Nvidia raising prices; look at AMD's prices today as well. They are generally not cheap, mostly because TSMC wants its cut. TSMC knows AMD relies 100% on TSMC. Remember how poorly Ryzen did before TSMC? GloFo 12nm was trash compared to even Intel 14nm.

In a few years, Intel will probably be back in the lead with 20A/18A and open for business. I don't think TSMC can retain their lead much longer. Maybe AMD can use Intel for their chips then :laugh:

But yeah, process improvements plus inflation, shipping, and higher development and production costs are what is driving up prices. This is true in all markets, really. Expect hardware to get more and more expensive, especially at the high end.

I predict the RTX 5090 will be $1999, but I would not be surprised if it's $2499. AMD has nothing to counter it, just like with the 4090. AMD could barely counter the 3090/3090 Ti even though Nvidia used a cheap, mediocre node (Samsung 8nm, closer to TSMC 10nm in reality), and Nvidia still won. Superior architecture is the reason.

AMD probably paid twice as much per 7nm wafer as Nvidia did for Samsung 8nm, if not more. Nvidia did not need the best node to beat AMD.
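Those wafer-price figures are guesses, but the cost-per-die arithmetic behind the argument is easy to sketch. Below is a minimal Python example using the common dies-per-wafer approximation; the wafer prices and die sizes are illustrative assumptions (real contract pricing is not public), not confirmed figures.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic approximation for usable dies on a round wafer,
    discounting edge loss. Ignores defect yield entirely."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Illustrative assumptions only -- real contract prices are not public.
scenarios = {
    "Navi 21 (~520 mm2) on TSMC 7nm at ~$9000/wafer":  (520, 9000),
    "GA102 (~628 mm2) on Samsung 8nm at ~$5000/wafer": (628, 5000),
}

for name, (area_mm2, wafer_usd) in scenarios.items():
    dies = dies_per_wafer(area_mm2)
    print(f"{name}: {dies} dies, ~${wafer_usd / dies:.0f} per die")
```

Under these made-up numbers the smaller Navi die still costs more per die than the much larger GA102, simply because the 7nm wafer itself costs that much more; node cost and die size both feed into the final card price.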
You're right, the last one to compete for the top was the 6950XT. It managed to tie or beat the 3090 and even the 3090 Ti, consumed less power, and also cost a lot less, but the 3090/3090 Ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution where a single chip (GCD) can be scaled from entry level to high end, as happens in their CPU line. That would greatly reduce development costs.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Before the Nvidia cult we had decent pricing.

280 $430
480 $500
580 $500
680 $500
780 $500
980 $550
1080 $600 - cult forms
2080 $700
3080 $800 (8960 CUDA version)
4080 $1200

The media didn’t do us any favors either by not dispelling the AMD bad driver myth that was perpetuated by non-AMD card owners trying to elevate Nvidia.

Yeah, the 7900XTX was priced at 999 while having higher power draw and inferior features compared to the 4080. Like all AMD hardware though, the price drops over time. Nvidia holds its pricing like Apple does, which means second-hand prices are better. I sold my 3090 for 1000 dollars shortly before picking up a 4090 at launch.

The main reasons prices went up are inflation, production costs, shipping costs, and TSMC demanding a higher and higher cut.

If AMD wants to be the good guy, they can drop the 7900XTX to 799 and the 7900XT to 599; they would probably lose money by doing that, though. They don't care much about GPUs.

With the 4080 SUPER coming in at 1000, the 7900XTX should go sub-800 ASAP. Even that might be too much as long as its feature set is clearly inferior.
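As a sanity check on the MSRP list above, here is a minimal sketch that restates those launch prices in 2023 dollars. The MSRPs are taken exactly as claimed in the list, and the cumulative CPI multipliers are rounded, illustrative estimates of US inflation, not official figures.

```python
# MSRPs as claimed in the list above, with approximate launch years.
claimed = [
    ("GTX 280",  2008,  430),
    ("GTX 480",  2010,  500),
    ("GTX 680",  2012,  500),
    ("GTX 980",  2014,  550),
    ("GTX 1080", 2016,  600),
    ("RTX 2080", 2018,  700),
    ("RTX 3080", 2020,  800),
    ("RTX 4080", 2022, 1200),
]

# Rounded cumulative US CPI factor from launch year to 2023 (illustrative).
cpi_to_2023 = {2008: 1.41, 2010: 1.39, 2012: 1.32, 2014: 1.28,
               2016: 1.26, 2018: 1.21, 2020: 1.18, 2022: 1.04}

for card, year, usd in claimed:
    adjusted = usd * cpi_to_2023[year]
    print(f"{card}: ${usd} in {year} is roughly ${adjusted:.0f} in 2023 dollars")
```

Even with generous rounding, everything up to the 3080 lands in roughly the $600-950 range in 2023 dollars, while the 4080's $1200 sits clearly above that trend line; inflation alone doesn't explain the jump.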
 
Last edited:
Joined
Feb 18, 2005
Messages
5,408 (0.77/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
The media didn’t do us any favors either by not dispelling the AMD bad driver myth that was perpetuated by non-AMD card owners trying to elevate Nvidia.
Such a myth that W1zz called it out in his review and there is a 20-page thread on these forums titled "How many of you Radeon 5700 owners have ditched your cards over the drivers"; similar threads can be found all over the internet.
Such a myth that Radeon drivers' idle and multi-monitor power consumption have been consistently broken on every new GPU release since Vega, or four generations. Read any of W1zz's launch day reviews of these cards.

Take your historical revisionism and shove it.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
You're right, the last one to compete for the top was the 6950XT. It managed to tie or beat the 3090 and even the 3090 Ti, consumed less power, and also cost a lot less, but the 3090/3090 Ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution where a single chip (GCD) can be scaled from entry level to high end, as happens in their CPU line. That would greatly reduce development costs.
I actually had a 6900XT with an OC that performed just like an OC'ed 6950XT. Remember, it's the exact same chip. It's not the first time AMD has refreshed an identical chip with slightly higher clocks.

The 6900XT is not really close to the 3090 Ti. The 6950XT did not beat the 3090 Ti overall; it mostly delivered 3090 performance, looking at overall results.


The 6950XT performed maybe 5-6% better than the 6900XT, stock for stock.

However, the big problem with the 6800 and 6900 series was power spiking. Look at the 20ms spikes; that's what destroys PSUs over time or makes systems reboot. AMD fixed this with the 7000 series.
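To see why millisecond-scale spikes matter even when the average draw looks fine, here is a minimal back-of-the-envelope sketch. The PSU rating, the ~1.35x transient trip point, and the load figures are illustrative assumptions, not measured values.

```python
# Illustrative assumptions: a 650 W PSU whose over-power protection
# trips at roughly 1.35x rated load on short transients.
psu_rated_w = 650
trip_w = psu_rated_w * 1.35                 # ~878 W transient trip point

rest_of_system_w = 250                      # CPU, fans, drives under load
gpu_average_w = 330                         # a 6900 XT-class average draw
gpu_spike_w = 650                           # reported ~20 ms excursions

steady = rest_of_system_w + gpu_average_w   # 580 W: fine on paper
transient = rest_of_system_w + gpu_spike_w  # 900 W for ~20 ms

print(f"steady {steady} W, transient {transient} W, trips at ~{trip_w:.0f} W")
print("likely shutdown/reboot" if transient > trip_w
      else "within margin, but repeated spikes still stress the PSU")
```

That's how a PSU that comfortably covers the average load can still be rebooted by a card whose 20 ms excursions reach roughly double its average draw.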

The 3090 Ti, however, had crazy power peaks and used way too much power. It was one of the worst GPU purchases in recent memory, because the 4090 landed about 6 months later with almost twice the performance at much lower power and a 1599 price tag instead of 1999 dollars.
 
Last edited:
Joined
Dec 12, 2016
Messages
1,388 (0.51/day)
You're right, the last one to compete for the top was the 6950XT. It managed to tie or beat the 3090 and even the 3090 Ti, consumed less power, and also cost a lot less, but the 3090/3090 Ti still sold many times more.

I think they will only return to compete for the top when there is an MCM solution where a single chip (GCD) can be scaled from entry level to high end, as happens in their CPU line. That would greatly reduce development costs.
Yeah, and the 6950XT was released a long, long time… oh wait. That was a year and a half ago. Lol!

I think Las doesn’t realize that AMD ‘skipped’ the high end occasionally (Sea Islands, Polaris, RDNA1) but competed for the high end for the most part. AMD could ‘skip’ the high end again next gen, but market forces determine business, and rarely does someone leave a market segment. It does happen, though; Intel is facing that rare tough choice right now.

Such a myth that W1zz called it out in his review and there is a 20-page thread on these forums titled "How many of you Radeon 5700 owners have ditched your cards over the drivers"; similar threads can be found all over the internet.
Such a myth that Radeon drivers' idle and multi-monitor power consumption have been consistently broken on every new GPU release since Vega, or four generations. Read any of W1zz's launch day reviews of these cards.

Take your historical revisionism and shove it.
You cannot win this argument, as a simple internet search confirms that no company is immune to driver problems and no company is ahead when it comes to good/bad drivers. Here is one such forum thread among many:

https://www.reddit.com/r/computers/comments/1761ayr

Drivers have bugs; it's always been that way. What some people confuse are driver features, such as supersampling. Some companies implement their features in a superior way; Nvidia definitely has better features. But that doesn't mean AMD drivers are bad.
 
Last edited:
Joined
Oct 6, 2021
Messages
1,567 (1.59/day)
I actually had a 6900XT with OC that performed just like 6950XT OC'ed. Remember its the exact same chip. Not the first time AMD refreshes an identical chip with slightly higher clocks.

6900XT is not close to 3090 Ti really. 6950XT did not beat 3090 Ti overall and mostly delivered 3090 performance, if you looked at overall performance that is.


6950XT performed like 5-6% better than 6900XT stock for stock.

However the big problem with 6800 and 6900 series was power spiking, look at 20ms spikes, this is what destroys PSUs over time or make the systems reboot. AMD fixed this with 7000 series.

However 3090 Ti was crazy peaked and used way too much power. One of the worst GPU purchases in recent memory, because 4090 landed like 6 months later with almost twice the performance at much much lower power usage with a 1599 price tag instead of 1999 dollars.
(benchmark charts attached)


It's the same design, but the 6950XT chips were better binned. It's a 4-6% difference, which some reviewers consider within a ~5% margin of error, and depending on the game selection it could win or lose the battle for the top. But the main point is that the 3090 Ti cost $2000 and still sold much more than the 6950XT. Nvidia has a "mind share" equivalent to Apple's, to the point that serious problems such as drivers killing GPUs, the infamous 3.5GB of VRAM, and similar abominations barely dent it.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
(benchmark charts attached)

It's the same design, but the 6950XT chips were better binned. It's a 4-6% difference, which some reviewers consider within a ~5% margin of error, and depending on the game selection it could win or lose the battle for the top. But the main point is that the 3090 Ti cost $2000 and still sold much more than the 6950XT. Nvidia has a "mind share" equivalent to Apple's, to the point that serious problems such as drivers killing GPUs, the infamous 3.5GB of VRAM, and similar abominations barely dent it.

We can all cherry-pick reviews, but 3090 Ti custom cards generally beat 6950XT custom cards, and the power spiking on the 6950XT was still a big problem that only got worse; some 6950XTs spiked to 650-700 watts.

The 6950XT was nothing but an overclocked 6900XT, since the chip is 100% identical.


Nvidia did not have drivers killing GPUs; those cards were defective from the beginning, and AMD cards died from Diablo 4 and other games too. More Nvidia cards died because more people use Nvidia.

AMD gimped tons of GPUs with cut-down PCIe lanes as well.

AMD has released tons of bad GPUs, just as Nvidia has, especially in the lower-end segment: https://www.techspot.com/review/2398-amd-radeon-6500-xt/
 
Joined
Dec 25, 2020
Messages
5,074 (4.00/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Nvidia has a "mind share" equivalent to Apple's, to the point that serious problems such as drivers killing GPUs, the infamous 3.5GB of VRAM, and similar abominations barely dent it.

If the only dirt you've got is a decade-old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even before that, it's little wonder they've amassed a mindshare equivalent to Apple's. Personally, I've been bitten by AMD bugs more times than I can count. I vehemently disagree with most of their recent business directives. I gave them a timeout. I still very much love AMD, but I just need some time off, like any heated relationship I suppose.
 
Joined
Feb 18, 2005
Messages
5,408 (0.77/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
You cannot win this argument as a simple internet search conforms that no company is immune to driver problems and no company is ahead when it comes to good/bad drivers. Here is one such forum among many:

https://www.reddit.com/r/computers/comments/1761ayr

Drivers have bugs. Its always been that way. Now what some people are confusing are driver features such as super sampling. Some companies have better features implemented in a superior way. Nvidia definitely has better features. But that doesn’t mean AMD drivers are bad.
You were the one who claimed that AMD drivers aren't bad; I provided evidence that they are. Trying to change the subject to "oh look, NVIDIA drivers are also bad" is irrelevant whataboutism that doesn't detract from the untruth of your claims.

If the only dirt you've got is a decade old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even prior to that, it's little wonder that they've amassed a mindshare equivalent to Apple. Personally I've been bitten by the AMD bugs more times than I could count. I vehemently disagree with most of their business directives of late. I gave them a timeout. I still very much love AMD, but I just need some time off - like any heated relationship I suppose.
The "3.5GB VRAM" thing is a guaranteed way to detect someone who's run out of actual points to criticize NVIDIA on, and knows it.
 
Last edited by a moderator:
Joined
Mar 29, 2023
Messages
1,044 (2.34/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
MSI are smoking their socks if they think that people are going to pay a 10% premium over the Founders Edition of the already-expensive 4080S.

You shouldn't be pointing the finger at MSI... it's Nvidia screwing over their AIB "partners".

But the Founders Edition is hardly available outside 'murica anyway, so for most of the world the cheapest partner cards are the real price.
 
Joined
Feb 20, 2019
Messages
7,536 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Oof, +$100 over MSRP for a Ventus? I paid +$10 for a Ventus 1660 Super and it doesn't even have fan stop or a power limit increase, though it's a competent card with good thermals. The 4070 Ventus (also no power limit increase) is listed at MSRP, so +$100 is crazy.
That *is* crazy.

Ventus is the base model:
  • Single BIOS.
  • Basic VRM, the minimum acceptable design for stock speeds.
  • The cheapest cooler they can make, with the minimum acceptable heatpipe count and only the essential PCB components contacted.
  • No RGB LED.
  • The lowest-clocked model in MSI's range, usually within 1-2% of Nvidia's reference board base clocks.
  • Default power limits with no headroom.
  • Shorter warranties than their premium models (in territories where it's legal to offer shorter warranties).
...and now, to make things worse than just the stupid price hike, the Ventus comes with the shitty 12VHPWR connector that's riddled with issues, so there's NO REASON to buy it over the FE.
 
Joined
Dec 25, 2020
Messages
5,074 (4.00/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
That *is* crazy.

Ventus is the base model:
  • Single BIOS.
  • Basic VRM, the minimum acceptable design for stock speeds.
  • The cheapest cooler they can make, with the minimum acceptable heatpipe count and only the essential PCB components contacted.
  • No RGB LED.
  • The lowest-clocked model in MSI's range, usually within 1-2% of Nvidia's reference board base clocks.
  • Default power limits with no headroom.
  • Shorter warranties than their premium models (in territories where it's legal to offer shorter warranties).
...and now, to make things worse than just the stupid price hike, the Ventus comes with the shitty 12VHPWR connector that's riddled with issues, so there's NO REASON to buy it over the FE.

The cards have been updated with 12V-2x6 connectors that have shorter sense pins. They are not hazardous. The reason to buy one over the FE is... that it's available and the FE isn't, unless you're in the privileged markets Nvidia bothers to sell them in.
 

bug

Joined
May 22, 2015
Messages
13,358 (4.03/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I think Las doesn’t realize that AMD ‘skipped’ the high end occasionally (Sea Islands, Polaris, RDNA1) but competed for the high end for the most part. AMD could ‘skip’ the high end again next gen, but market forces determine business, and rarely does someone leave a market segment. It does happen, though; Intel is facing that rare tough choice right now.
Just like sour grapes: when AMD falls behind, they claim they're not interested in building big chips or competing at the high end anymore. When they launch a new architecture, they promptly forget about that. Rinse and repeat.

I have nothing personal against AMD (quite the opposite). It's just that their GPUs have been trailing Nvidia for too long. And that's the lipstick their marketing puts on the pig.
 
Joined
Apr 12, 2013
Messages
6,874 (1.68/day)
And that was before the new Jensen's law, where you get worse price/performance with new releases, not better.
Tbf to Nvidia, although only on a technicality, the "artificial" pent-up demand has come from people wanting to make money from thin air or cat videos :laugh:
like any heated relationship I suppose.
Yes if only this was a one off episode of love(?) island :shadedshu:
 
Joined
Oct 6, 2021
Messages
1,567 (1.59/day)
We can all cherry-pick reviews, but 3090 Ti custom cards generally beat 6950XT custom cards, and the power spiking on the 6950XT was still a big problem that only got worse; some 6950XTs spiked to 650-700 watts.

The 6950XT was nothing but an overclocked 6900XT, since the chip is 100% identical.


Nvidia did not have drivers killing GPUs; those cards were defective from the beginning, and AMD cards died from Diablo 4 and other games too. More Nvidia cards died because more people use Nvidia.

AMD gimped tons of GPUs with cut-down PCIe lanes as well.

AMD has released tons of bad GPUs, just as Nvidia has, especially in the lower-end segment: https://www.techspot.com/review/2398-amd-radeon-6500-xt/
To make such a statement, I assume you analyzed the driver code and the long list of reported cases of GPUs dying shortly after a driver release, a driver Nvidia quietly took offline after the first cases appeared. This happened not once but twice, as I recall.

None of you can maintain any argumentative cohesion, because at some point you start scrambling to prove which brand is worse and drag the discussion into irrelevant topics. I completely agree that AMD shouldn't have released such horrible low-end GPUs.

However, my point is that Nvidia's problems get downplayed and ignored, not that AMD doesn't have or hasn't had problems.

If the only dirt you've got is a decade-old GPU's "misfortune" of being relatively inefficiently designed, or a "killer driver" released even before that, it's little wonder they've amassed a mindshare equivalent to Apple's. Personally, I've been bitten by AMD bugs more times than I can count. I vehemently disagree with most of their recent business directives. I gave them a timeout. I still very much love AMD, but I just need some time off, like any heated relationship I suppose.
Mindshare is fascinating. It starts with offering products with some real difference in quality over the competition; that takes root in people, who recommend the brand to friends and family and speak well of it, the most effective marketing possible. Once that's achieved, at some point you may no longer offer any advantage, and in fact your products may have deficits and cost more, but you have gained a group of loyal followers who simply buy brand X or Y without even checking competing products.

In short, it's not about having the best products; it's about making people believe that you do. Something that starts rational becomes emotional.

As a casual gamer, I've honestly never had serious problems with AMD drivers, but I'm not pointing fingers or calling anyone who says they have a fanboy; software will always have problems. Even though I've had serious problems with Nvidia in the past, I don't spend my time constantly saying their software is garbage.
 
Last edited by a moderator:
Joined
Feb 20, 2019
Messages
7,536 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The cards have been updated with 12V-2x6 connectors that have shorter sense pins. They are not hazardous. The reason to buy one over the FE is... that it's available and the FE isn't, unless you're in the privileged markets Nvidia bothers to sell them in.
The cards may have been updated, but all the power supplies and cables manufactured in the last three years haven't magically changed to the latest 2024 standard. Even if you buy a brand-new PSU today, you won't necessarily know when it was manufactured until you open the packaging and check the date near the serial number. Realistically, most things I buy have sat on a warehouse shelf somewhere for 3-12 months, even if they were brand-new stock to the retailer last week.

I don't trust it. Not because I am an expert in the field, but because people who are (like Der8auer, who works as a consultant for CableMod actually designing cables they don't want to melt) have numerous criticisms of it. If the people whose job it is to make safe cables don't trust the connector, why should I? Unlike Nvidia, who have damage-control and PR reasons to cover up their mistake, CableMod/Der8auer have nothing to lose by being honest.
 
Last edited:
Joined
Jan 29, 2021
Messages
1,778 (1.44/day)
Location
Alaska USA
Looking forward to the reviews. I think the 4070 Ti Super 16GB has a chance of being a real hit for those who game at 1440P.
 
Joined
Feb 18, 2005
Messages
5,408 (0.77/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
However, my point is that Nvidia's problems get downplayed and ignored, not that AMD doesn't have or hasn't had problems.
No, you're not; you're bringing up problems that NVIDIA had in the past and has solved. NVIDIA's problems aren't downplayed; they're forgotten, because the company generally fixes them as soon as possible. Meanwhile, AMD consistently makes the same dumb mistakes over and over again. That leads to a perception of quality from NVIDIA and a perception of the opposite from AMD, and AMD does nothing to address that perception; they just raise their prices to match NVIDIA, then wonder why nobody wants to buy AMD.

And you, now as almost always, don't add anything to the discussion; you just show up to talk badly about people and label one or another a fanboy, as if that were your only reason for existing.
Do you know when the GTX 970 was released? A decade ago. It sold incredibly well because the "3.5GB" issue that AMD fanboys love to talk about simply was not an issue for people who actually owned the card and used it for gaming. Is it a negative point? Yes. Is it a negative point that ever practically mattered to users? No. Did NVIDIA ever repeat that design? No. So bringing it up as if it somehow matters... just stop. Please stop embarrassing yourselves.

In contrast, we have Vega, which released in 2017 with W1zz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
The RDNA 2 review in 2020 noted that AMD had fixed everything except media playback.
Then RDNA 3 in 2022: back to high multi-monitor and media-playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
 
Joined
Jul 20, 2020
Messages
884 (0.62/day)
System Name Gamey #1 / #2
Processor Ryzen 7 5800X3D / Core i7-9700F
Motherboard Asrock B450M P4 / Asrock B360M P4
Cooling IDCool SE-226-XT / CM Hyper 212
Memory 32GB 3200 CL16 / 32GB 2666 CL14
Video Card(s) PC 6800 XT / Soyo RTX 2060 Super
Storage 4TB Team MP34 / 512G Tosh RD400+2TB WD3Dblu
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / CM N200
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / Corsair CX550M
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
AMD isn't really committed to their graphics division. Sure, they're investing and keeping a close second place, but they're not really innovating, and not competing for bigger market share...

The 7700 XT through 7900 XTX are chiplet-based GPUs; that alone is a big innovation not seen in any other consumer GPU. That's not to say AMD should be content with that and not improve other aspects of their GPUs, like low-demand power consumption, but ignoring what AMD has actually innovated on undermines your other criticisms.

With such a change to GPU hardware design, it's no surprise that some things turned out worse than people hoped, like top-end core frequencies/performance (which AMD touted and then didn't deliver) and low-demand power use, but hopefully for them it will be a case of design-and-iterate for the next generation.

Nvidia took no such chances with the 4000 series because they didn't need to: moving from Samsung's crap "8" to TSMC's 5nm process meant they would clearly take the performance lead with a 600 mm² die as well as the smaller ones. We'll see what Blackwell brings, but moving to TSMC 3nm will likely require no significant hardware innovation, as the increased density from the node shrink will be enough to compete with AMD unless AMD can make GPU chiplets 2.0 work significantly better than 1.0.
 
Joined
Feb 18, 2005
Messages
5,408 (0.77/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
We'll see what Blackwell brings, but moving to TSMC 3nm will likely require no significant hardware innovation, as the increased density from the node shrink will be enough to compete with AMD unless AMD can make GPU chiplets 2.0 work significantly better than 1.0.
I'm willing to give AMD another generation, or even two, on the GPU-chiplet R&D road, because NVIDIA (and Apple) are staring at 3nm with no clear path beyond. If AMD can get GPU chiplets right, they will be in the same driving seat they were in with Zen vs Intel... the problem is whether the GPU division can last long enough to get there.
 
Joined
Oct 6, 2021
Messages
1,567 (1.59/day)
No, you're not; you're bringing up problems that NVIDIA had in the past and has solved. NVIDIA's problems aren't downplayed; they're forgotten, because the company generally fixes them as soon as possible. Meanwhile, AMD consistently makes the same dumb mistakes over and over again. That leads to a perception of quality from NVIDIA and a perception of the opposite from AMD, and AMD does nothing to address that perception; they just raise their prices to match NVIDIA, then wonder why nobody wants to buy AMD.


Do you know when the GTX 970 was released? A decade ago. It sold incredibly well because the "3.5GB" issue that AMD fanboys love to talk about simply was not an issue for people who actually owned the card and used it for gaming. Is it a negative point? Yes. Is it a negative point that ever practically mattered to users? No. Did NVIDIA ever repeat that design? No. So bringing it up as if it somehow matters... just stop. Please stop embarrassing yourselves.

In contrast, we have Vega, which released in 2017 with W1zz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
The RDNA 2 review in 2020 noted that AMD had fixed everything except media playback.
Then RDNA 3 in 2022: back to high multi-monitor and media-playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
The critical issue, at least for me, lies in the lack of transparency at the GTX 970's launch, not in its performance. Consumers trusted the information the company provided, expecting the advertised 4 GB of memory to be fully accessible and of uniform performance. The discovery that 0.5 GB of that memory was significantly slower than the rest was a real letdown.

Lying about or omitting the characteristics of any product is not a small problem; no wonder this happened: "NVIDIA settled in a 2015 class-action lawsuit against it, for misrepresenting the amount of memory on GeForce GTX 970 graphics cards. The company has agreed to pay every buyer of the card USD $30 (per card), and also cover the legal fees of the class, amounting to $1.3 million. The company, however, did not specify how much money it has set aside for the payout, and whether it will compensate only those buyers who constitute the class (i.e. buyers in the U.S., since that's as far as the court's jurisdiction can reach), or the thousands of GTX 970 buyers worldwide." NVIDIA Settles Class-Action Lawsuit Over GTX 970 Memory | TechPowerUp

You assume a lot of things without knowing. When I abandoned Nvidia in the Maxwell gen (980), the driver had a problem with a specific game, Mass Effect, which prevented me from playing. I waited a year for a fix that never came. To my surprise, googling out of pure curiosity, I discovered that to this day the problem has not been corrected.

I'm not like you, jumping to make excuses for Nvidia; I criticize what deserves criticism, regardless of the flag. For me, Vega was a waste of money, just like all consumer HBM GPUs. They should have skipped it and focused all resources on bringing RDNA forward and preparing the architecture transition, leaving the software side in a better state for launch. Vega was only good as an iGPU.

If you don't want to have problems, it's better to buy a console; the chances of running into bugs that prevent you from playing are lower.
 
Last edited:
Joined
Apr 6, 2021
Messages
1,131 (0.97/day)
Location
Bavaria ⌬ Germany
System Name ✨ Lenovo M700 [Tiny]
Cooling ⚠️ 78,08% N² ⌬ 20,95% O² ⌬ 0,93% Ar ⌬ 0,04% CO²
Audio Device(s) ◐◑ AKG K702 ⌬ FiiO E10K Olympus 2
Mouse ✌️ Corsair M65 RGB Elite [Black] ⌬ Endgame Gear MPC-890 Cordura
Keyboard ⌨ Turtle Beach Impact 500
Just like sour grapes: when AMD falls behind, they claim they're not interested in building big chips or competing at the high end anymore. When they launch a new architecture, they promptly forget about that. Rinse and repeat.

I have nothing personal against AMD (quite the opposite). It's just that their GPUs have been trailing Nvidia for too long. And that's the lipstick their marketing puts on the pig.
Tbh, I think AMD didn't exit the high end, they just sugar-coated the obvious. :D Nvidia is so far ahead with the 4090 that there is no way to catch up. And I bet that isn't even the max Nvidia could go.

In contrast, we have Vega, which released in 2017 with W1zz's review noting high power draw.
Then RDNA in 2019, with his review noting high multi-monitor/idle power draw and driver instability.
The RDNA 2 review in 2020 noted that AMD had fixed everything except media playback.
Then RDNA 3 in 2022: back to high multi-monitor and media-playback power consumption.

This isn't imaginary. This is a trend of AMD failing to get something incredibly simple and basic correct. Consumers look at that and go "if this company can't get bloody power consumption right, what else can't they get right? What other crap is lurking that might bite me later down the line?" Then they look at NVIDIA's products and say "shit, that's expensive, but I'd rather pay through my ass and not have to worry about the product causing me problems", and they buy NVIDIA.
Tbh, both companies have produced enough turds to fill a dump. ;) Most of the "AMD has bad drivers" talk is nothing more than a sour taste left over from the Vega(?) generation, which had massive problems. Nowadays, looking at gaming support, AMD isn't trailing Nvidia; I'd rather say they're leading, since AMD also dominates the console market. In fact, I see more Nvidia users than AMD users posting driver problems for newly released games. But there are problems, and they need to be called out, which sadly often doesn't happen with so-called tech reviewers. They fail to inform the average Joe about issues, so I mainly blame them for why stuff doesn't get fixed.


It all comes down to what is more important to you:
AMD: best bang for buck, better multi-monitor support, better Linux support, better GPU software, better day-1 drivers, fine-wine driver progress
Nvidia: way better power efficiency, better frame generation, way more (non-gaming) software features, expensive, better resale value
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.36/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
As I see it, Nvidia right now doesn't even need to sell gaming cards; AIB partners do. But during the crypto craze they could cash in, because miners needed gaming cards, any cards at any price, when they were desperate enough.

The AI craze is different. Almost nobody needs gaming cards for it, except where there are restrictions on selling AI accelerators, and even that demand is very limited.

So I'm a bit sorry for all the AIB partners now. I bet we'll even see prices go much higher than MSRP, because the cards will be produced in very small numbers; Nvidia has already said it's dedicating production to AI.

And no gamer will buy the overpriced cards, and the AI people don't need them.

And suddenly EVGA's decision in September 2022 will make perfect sense.
The 4090 really does sell like hotcakes, which is why its price went up, so this doesn't affect all "gaming" cards, even though the 4090 is kind of a hybrid here.

Gamers can be thankful that the lower-tier GPUs are not interesting for AI, or you would see the same thing happen as during the mining craze. 90-95% of PC gamers buy in the sub-1000-dollar segment anyway.

EVGA stopped making GPUs because their cards got worse and worse over the years. Their designs were lacking in their final years in the GPU market, with several issues across VRM, PCB and design in general; they were not selling a lot of cards and could not afford to stay competitive, so they pulled the plug and focused on other areas. EVGA is not doing well right now. I would not be surprised if the company is sold or closes down in a few years. Selling OEM PSUs and cheap stuff like mice and keyboards is not going to work well; their mice and keyboards are not even great, and their PSUs have also dropped in quality since they entered that market with the first Super Flower designs.

Stopping GPU sales was most likely the first nail in the coffin for EVGA. I don't see them surviving long without it; it was what they were known for.

AIBs earn tons of money right now. I know for sure, since I work in B2B in the hardware sector. Numbers are climbing, not dropping, and they will go up much more in '24 and '25.

This doesn't make much sense. Right now everyone is paying for TSMC's very expensive process upgrades, upgrades whose costs are allegedly rising astronomically with every new node. So how can anyone just leapfrog to a cutting-edge process without working through the intermediate steps?

I believe it would take a severe industry crisis to end TSMC's supremacy. But that's not so unimaginable: the reasons could be political (China wouldn't even have to blockade or invade Taiwan, just stop exporting crucial materials and components), natural disasters, or simply the market's response to excessive cost.
Intel has been building fabs for years to regain the lead. Pat Gelsinger is turning Intel around as fast as he can.

Intel 4 this year (Meteor Lake), Intel 20A in Q4 (Arrow Lake), and then 18A next year.

TSMC has hit a wall as well and is struggling to go below 3nm right now (a node Apple has priority on).

However, Apple doesn't want chip production in Asia only, which is why they pushed TSMC to build more fabs outside of Asia.

Apple would jump to Intel for sure if Intel regains the lead. Both are US companies.

It makes perfect sense if you've read the news and official statements from the last few years. Intel has always come back eventually.

Also, TSMC has been pushing prices up as well. They will be forced to cut prices when Intel regains the lead, or at least reaches process parity.
 
Last edited: