
Is Nvidia's GPP really dead?

Joined
Jul 13, 2016
Messages
2,832 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I have to wonder whether Nvidia really canceled its GPP (GeForce Partner Program) or is merely now running it on the DL, because ASUS, MSI, and Gigabyte only have Nvidia variants of their flagship video card lines (ROG, Suprim, and Master), and I can't find anywhere that these product lines have been announced for the AMD side. On the contrary, the 4090 Master was announced a month before the 4090 even launched, so you'd expect that if they were going to release them for the AMD side, they'd have said something or listed them on their websites by now.

Such a specific omission from each vendor's lineup, exactly in line with the requirements of the canceled GPP, definitely raises suspicion. Maybe they need more time to release those variants for AMD, but in any case people should keep an eye on this, as we may be seeing the return of the GPP, only this time in the dark.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
It's on the DL because of said heatsink designs (a green heatsink is not good on a red card). GPP never truly died.

And the Asus Noctua exclusivity on an ngreedia-only card...

It's why I buy cards from companies that don't make mobos, ASRock being the exception, and Sapphire hasn't released a mobo since socket AM3...
 
Joined
Dec 6, 2018
Messages
342 (0.17/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
And the Asus Noctua exclusivity on an ngreedia-only card...
Finders keepers. I guess "ngreedia" was faster than AMD with this excellent partnership. But AMD is free to partner up with Arctic or Deepcool or some other great brands; why don't they do it?
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Finders keepers. I guess "ngreedia" was faster than AMD with this excellent partnership. But AMD is free to partner up with Arctic or Deepcool or some other great brands; why don't they do it?
No desire to fill the back pockets of the few to make such deals happen?!

Nvidia didn't found ROG, Aorus, etc., but they're definitely trying to ring-fence them.

It looks to me like GPP is not only ongoing.

Paid shitstorms, perhaps; I note der8auer always has an Nvidia card in the background of everything, for example, and he's not alone.

Anti-consumer is anti-consumer, regardless of what GPU you own.
 
Joined
Jan 5, 2006
Messages
17,794 (2.66/day)
System Name AlderLake / Laptop
Processor Intel i7 12700K P-Cores @ 5Ghz / Intel i3 7100U
Motherboard Gigabyte Z690 Aorus Master / HP 83A3 (U3E1)
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans / Fan
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MHz CL36 / 8GB DDR4 HyperX CL13
Video Card(s) MSI RTX 2070 Super Gaming X Trio / Intel HD620
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2 / Samsung 256GB M.2 SSD
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p / 14" 1080p IPS Glossy
Case Be quiet! Silent Base 600 - Window / HP Pavilion
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W / Powerbrick
Mouse Logitech MX Anywhere 2 Laser wireless / Logitech M330 wireless
Keyboard RAPOO E9270P Black 5GHz wireless / HP backlit
Software Windows 11 / Windows 10
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
partner up with Arctic

Like this Powercolor ATi X1950 Pro GPU that I had in the past, with a super quiet and cool Arctic cooler. :cool:

Screenshot_20230123_174547.png
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I won't chalk it up to any conspiracy; I'd sooner point to AMD's inability to create a high-end GPU that can perform to expectations. Radeon has never been this far behind GeForce before. As an example I will use the AD104: NVIDIA is basically selling low-end GPU designs in the premium segment; the 4070 Ti, which was originally the 4080 12 GB, is basically an RTX 4060 project.

Everything from the memory amount, bus width, ROP count, and GPU die size to the bill of materials points squarely at a midrange product, especially its size relative to the full-die configuration of the largest processor of the same architecture. It makes do with a relatively paltry 64 clusters, while the full AD102 is supposed to have up to 144 clusters (and currently ships with 142 clusters in the RTX 6000 Ada configuration, or 128 clusters in the RTX 4090).
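Put side by side, the cluster counts quoted in this post show just how cut-down AD104 is relative to the full AD102. A quick sketch (the counts are the figures above, taken as given):

```python
# Cluster (SM) counts as quoted above: full AD102 silicon has 144,
# the RTX 6000 Ada ships 142, the RTX 4090 ships 128, and the
# AD104-based RTX 4070 Ti has 64.
FULL_AD102 = 144

configs = {
    "RTX 6000 Ada (AD102)": 142,
    "RTX 4090 (AD102)": 128,
    "RTX 4070 Ti (AD104)": 64,
}

for name, clusters in configs.items():
    share = clusters / FULL_AD102
    print(f"{name}: {clusters} clusters = {share:.0%} of the full die")
```

The 4070 Ti ends up at roughly 44% of the full-die configuration, which is the kind of fraction that historically went to 60-class cards.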



Yet the 4070 Ti, with all of its drawbacks, a third-tier ASIC, and a 192-bit memory interface, is enough to hold its own against the (on paper) much better RX 7900 XT and doesn't exactly look ugly next to the 7900 XTX, and therein lies the bloody problem! They have no competition. AIBs are interested in preserving their premium-segment brands for the best products, and AMD DOES NOT deliver here.
 
Joined
Dec 6, 2018
Messages
342 (0.17/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
Anti-consumer is anti-consumer, regardless of what GPU you own.
Nvidia being anti-consumer is an argument that comes up quite often. The only things I expect from a gigacorp are to honor the warranty and to update drivers for as long as possible. I can't imagine anything else a gigacorp could do for me, with the exception of handing out its products for free.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
You keep buying that stuff, they'll double down on it.

So the world is AMD's fault; the leaps of logic, wow.

There's ample proof Nvidia is anti-consumer.

Stick your head back under the covers, ignore it all, and just keep buying. After all:

The more you buy, the more you save.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
You keep buying that stuff, they'll double down on it.

So the world is AMD's fault; the leaps of logic, wow.

Agreed. I personally gave up on upgrading my GPU this generation. I don't agree with the Ada product stack or NVIDIA's shameless upselling of low-end designs in the premium segment, I condemn NVIDIA's predatory and deceitful marketing, and the hardware issues with RDNA 3 products, coupled with low availability and unattractive prices, basically killed all interest for me. I decided to upgrade my processor instead. I didn't exactly need it with a 5950X, but selling it while it retains value would enable me to treat myself to something like a 13900KS, and that is just what I plan on doing.
 
Joined
Dec 6, 2018
Messages
342 (0.17/day)
Location
Hungary
Processor i5-9600K
Motherboard ASUS Prime Z390-A
Cooling Cooler Master Hyper 212 Black Edition PWM
Memory G.Skill DDR4 RipjawsV 3200MHz 16GB kit
Video Card(s) Asus RTX2060 ROG STRIX GAMING
Display(s) Samsung Odyssey G7 27"
Case Cooler Master MasterCase H500
Power Supply SUPER FLOWER Leadex Gold 650W
Mouse BenQ Zowie FK1+-B
Keyboard Cherry KC 1000
Software Win 10
Agreed. I personally gave up on upgrading my GPU this generation. I don't agree with the Ada product stack or NVIDIA's shameless upselling of low-end designs in the premium segment, I condemn NVIDIA's predatory and deceitful marketing, and the hardware issues with RDNA 3 products, coupled with low availability and unattractive prices, basically killed all interest for me. I decided to upgrade my processor instead. I didn't exactly need it with a 5950X, but selling it while it retains value would enable me to treat myself to something like a 13900KS, and that is just what I plan on doing.
I guess you can afford to sit this one out with your 3090. :) Some people have been waiting since the mining boom to upgrade, and they simply have no choice: if they want factory-fresh stuff, it's the 40 series.
 
Joined
Oct 15, 2011
Messages
1,966 (0.43/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
I won't chalk it up to any conspiracy; I'd sooner point to AMD's inability to create a high-end GPU that can perform to expectations. Radeon has never been this far behind GeForce before. As an example I will use the AD104: NVIDIA is basically selling low-end GPU designs in the premium segment; the 4070 Ti, which was originally the 4080 12 GB, is basically an RTX 4060 project.

Everything from the memory amount, bus width, ROP count, and GPU die size to the bill of materials points squarely at a midrange product, especially its size relative to the full-die configuration of the largest processor of the same architecture. It makes do with a relatively paltry 64 clusters, while the full AD102 is supposed to have up to 144 clusters (and currently ships with 142 clusters in the RTX 6000 Ada configuration, or 128 clusters in the RTX 4090).

Yet the 4070 Ti, with all of its drawbacks, a third-tier ASIC, and a 192-bit memory interface, is enough to hold its own against the (on paper) much better RX 7900 XT and doesn't exactly look ugly next to the 7900 XTX, and therein lies the bloody problem! They have no competition. AIBs are interested in preserving their premium-segment brands for the best products, and AMD DOES NOT deliver here.
So, does anyone think it's like 2012-2016 for CPUs, except it's GPUs this time? The funny thing is that even back in "the CPU malaise era" (2012-2016), I didn't think the sky was falling.

I decided to upgrade my processor instead.
That reminds me of '21, LOL, when I did an Intel Comet Lake build and couldn't get another video card. I went for a CPU upgrade because I had already gotten the most card I ever had, back in July 2022.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
So, does anyone think it's like 2012-2016 for CPUs, except it's GPUs this time? The funny thing is that even back in "the CPU malaise era" (2012-2016), I didn't think the sky was falling.

Yeah, though CPUs generate a lot less FOMO: we've been oversupplied with CPU power for a long time now, and a processor has to become exceptionally out of date (currently 13 or more years old) to begin showing signs of software incompatibility. Sandy Bridge is about 12 years old (the 2600K launched in Q1 2011) and has pretty much become the baseline nowadays; aside from the odd piece of software that requires AVX2 (Haswell, 2013), most will do fine with a combination of SSE 4.2 and the original 128-bit AVX, and that is something even an old i5-2500K has covered.

Most budget processors sold today are genuinely powerful enough; an i5-12400 or a Ryzen 5600, for example, has a metric ton of computing power in it. In the GPU realm, whether through artificial gating of features in software and drivers or through genuine innovation, an old GPU will generally not be able to keep up. While Sandy Bridge is usable, a contemporary Fermi or TeraScale 2 graphics processor is a museum piece.
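As a quick illustration of checking those instruction-set baselines on a given machine, here is a minimal, Linux-only sketch that parses /proc/cpuinfo. The flag names (sse4_2, avx, avx2) are the kernel's; this is a sketch, not a robust feature-detection library:

```python
# Read the CPU feature flags the Linux kernel reports for the first core.
# On non-x86 machines there is no "flags" line, so this returns an empty set.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("sse4_2", "avx", "avx2"):
    print(f"{feature}: {'yes' if feature in flags else 'no'}")
```

A Sandy Bridge chip would report sse4_2 and avx but not avx2, which is exactly the baseline described above.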

I guess you can afford to sit this one out with your 3090. :) Some people have been waiting since the mining boom to upgrade, and they simply have no choice: if they want factory-fresh stuff, it's the 40 series.

I can, but I am still frustrated. By the time Blackwell or RDNA 4 ship, I will have used this card for over four years...
 
Joined
Oct 15, 2011
Messages
1,966 (0.43/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
While Sandy Bridge is usable, a contemporary Fermi or TeraScale 2 graphics processor is a museum piece.
Almost like how I look at 56K!
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Yeah, though CPUs generate a lot less FOMO: we've been oversupplied with CPU power for a long time now, and a processor has to become exceptionally out of date (currently 13 or more years old) to begin showing signs of software incompatibility. Sandy Bridge is about 12 years old (the 2600K launched in Q1 2011) and has pretty much become the baseline nowadays; aside from the odd piece of software that requires AVX2 (Haswell, 2013), most will do fine with a combination of SSE 4.2 and the original 128-bit AVX, and that is something even an old i5-2500K has covered.

Most budget processors sold today are genuinely powerful enough; an i5-12400 or a Ryzen 5600, for example, has a metric ton of computing power in it. In the GPU realm, whether through artificial gating of features in software and drivers or through genuine innovation, an old GPU will generally not be able to keep up. While Sandy Bridge is usable, a contemporary Fermi or TeraScale 2 graphics processor is a museum piece.



I can, but I am still frustrated. By the time Blackwell or RDNA 4 ship, I will have used this card for over four years...
Look, the OP questioned Nvidia and GPP. So far you've posted some "it's AMD's fault" comments, and now we're onto old CPUs, which you're also wrong about (some, like the i5 2500K, are stutter-fests in games now) AND off-topic about.

There are other threads for CPU chat.

So, Nvidia GPP, on topic: anything pertinent, or are you just trying to derail a thread?

Seems so, since a 4070 Ti is what you're talking UP, not down, and they're shit IMHO, as were the 3090 and 3090 Ti: too expensive by far for what they offer / offered. But you bought in (3090)??? And now I'm to believe you're pissed at Nvidia for expensive cards, WTAF.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Look, the OP questioned Nvidia and GPP. So far you've posted some "it's AMD's fault" comments, and now we're onto old CPUs, which you're also wrong about (some, like the i5 2500K, are stutter-fests in games now) AND off-topic about.

There are other threads for CPU chat.

So, Nvidia GPP, on topic: anything pertinent, or are you just trying to derail a thread?

Seems so, since a 4070 Ti is what you're talking UP, not down, and they're shit IMHO, as were the 3090 and 3090 Ti: too expensive by far for what they offer / offered. But you bought in (3090)??? And now I'm to believe you're pissed at Nvidia for expensive cards, WTAF.

There must have been some misunderstanding somewhere along the way. I actually agree with you.

The GPP is dead. There is no grand conspiracy about add-in board partners not wanting to deploy Radeon GPUs in their flagship lines. I offered my reasoning as to why I, and I alone, believe that to be the case. I am not derailing any thread; if you read back, all the talk has been rather on-topic...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,037 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
and I can't find anywhere that these product lines have been announced for the AMD side
It's also a question of design support from the GPU vendor, availability of chips, potential margins, existing inventory, etc.
 
Joined
Jul 13, 2016
Messages
2,832 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
For some reason I was not receiving notifications for this thread; anyway, on to some replies.

I won't chalk it up to any conspiracy; I'd sooner point to AMD's inability to create a high-end GPU that can perform to expectations. Radeon has never been this far behind GeForce before. As an example I will use the AD104: NVIDIA is basically selling low-end GPU designs in the premium segment; the 4070 Ti, which was originally the 4080 12 GB, is basically an RTX 4060 project.

The problem with this theory is that ASUS and Gigabyte both had ROG and Master versions of cards further down AMD's 6000 series stack. The flagship card lines have never been restricted to only the top SKUs.

I wouldn't necessarily say Radeon is behind GeForce, or that it's a historic gap. After all, the 5000 series wasn't that long ago, and that series topped out at the midrange. Absolute performance isn't the best measure for determining who is ahead of whom; you want to look at architecture and die size. For a GPU of its size, the 5700 XT was very good, and you could see that once AMD scaled that GPU up, Nvidia would have heavy competition. It's extremely similar to the 7000 series, where the 7900 XTX graphics die is a mere 300 mm². If that same GPU were 600 mm², it would have beaten the 4090 by a large margin; after all, the 4090 is only 17% faster than the current 7900 XTX. The only problem AMD has is power consumption. To me, the 7000 series looks like AMD failed to get enough fabric bandwidth for two GCDs. Almost certainly AMD will either do multiple GCDs next generation or, if it can't, increase the die size; otherwise it doesn't make sense that its top GPU tops out at 300 mm².

At the end of the day, AMD is charging $1000 for the 7900 XTX because it can. In reality that GPU is costing AMD a fraction of what the 6900 XT cost to produce when it launched. After all, AMD recently pointed out the massive cost savings smaller chiplets bring: if a 16-core chiplet-based CPU costs half as much as a monolithic 16-core CPU, imagine the difference for much bigger GPUs.
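The intuition behind "smaller dies are cheaper per good die" can be sketched with a toy Poisson yield model. Every number below (wafer price, defect density, die sizes) is an illustrative assumption for the sake of the argument, not AMD's or TSMC's real figures:

```python
import math

# Toy Poisson yield model: yield = exp(-defect_density * die_area).
# All constants are assumed round numbers, purely for illustration.
WAFER_COST = 17000        # assumed 5 nm wafer price, USD
WAFER_DIAM_MM = 300       # standard 300 mm wafer
DEFECTS_PER_MM2 = 0.0010  # assumed defect density (0.1 per cm^2)

def dies_per_wafer(area_mm2):
    # Crude estimate ignoring edge losses and scribe lines.
    wafer_area = math.pi * (WAFER_DIAM_MM / 2) ** 2
    return int(wafer_area / area_mm2)

def cost_per_good_die(area_mm2, wafer_cost=WAFER_COST):
    yield_frac = math.exp(-DEFECTS_PER_MM2 * area_mm2)
    good_dies = dies_per_wafer(area_mm2) * yield_frac
    return wafer_cost / good_dies

mono = cost_per_good_die(600)     # one big monolithic die
chiplet = cost_per_good_die(300)  # smaller GCD: many more good dies
print(f"600 mm^2 monolithic: ${mono:.0f} per good die")
print(f"300 mm^2 die:        ${chiplet:.0f} per good die")
```

Under these assumptions, two 300 mm² dies come out meaningfully cheaper than one 600 mm² die, because yield falls off exponentially with area; packaging costs, which the model ignores, eat into that advantage, as the reply below points out.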

It's also a question of design support from the GPU vendor, availability of chips, potential margins, existing inventory, etc.

Yes, it could be that AMD doesn't have enough chips to support the new SKUs. I'm willing to wait one to three months for availability to improve, to see if these flagship cards come out on the AMD side.

I'm not sure how much of a factor design support or margins would be. If they had these products for last-gen AMD cards and these are longtime AMD partners, I don't see how those variables could have changed. And if we assume that design support or margins are an issue, it's extremely unlikely it would hit multiple AIBs simultaneously, and specifically only AIBs that also happen to sell Nvidia cards. Sapphire and XFX are not pulling their top models, after all. There are reports of Nvidia squeezing partner margins, but we have not heard the same from AMD's side.

Those flagship products are the AIBs' high-margin products; it doesn't make a lot of sense that they wouldn't also release them on the AMD side. It only makes sense if you consider that it's something similar to the GPP.

Please correct me if I'm missing anything, but the way I see it right now, if we don't see those flagship lines for AMD, it's extremely likely it's down to GPP-like shenanigans.
 
Joined
Aug 20, 2007
Messages
20,773 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Historic gap? lolwut? Did we forget the era of Raja and Poor Vega?
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
GPP was just a formality for something that was happening regardless.

In reality that GPU is costing AMD a fraction of what the 6900 XT cost to produce when it launched.

Simply untrue. Navi 31 is on 5 nm and 6 nm with double the transistor count of Navi 21 on 7 nm; there is no way it's cheaper than Navi 21. It's perhaps cheaper than it would otherwise have been if it were monolithic, but that's it.

It's extremely similar to the 7000 series, where the 7900 XTX graphics die is a mere 300 mm². If that same GPU were 600 mm², it would have beaten the 4090 by a large margin.

Those 300 mm² do not include the L3 cache, but yes, even without it, at 600 mm² and with some extra memory controllers it would likely outperform AD102 by a lot, and that's assuming the same manufacturing process as Navi 31.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
7,526 (2.34/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling PA120+T30┃AXP120x67
Memory 64GB 6000CL30┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Display(s) 43" QN90B / 32" M32Q / 27" S2721DGF
Case Caselabs S3┃Lazer3D HT5
Power Supply Corsair HX1000┃HDPlex
At the end of the day, AMD is charging $1000 for the 7900 XTX because it can. In reality that GPU is costing AMD a fraction of what the 6900 XT cost to produce when it launched. After all, AMD recently pointed out the massive cost savings smaller chiplets bring: if a 16-core chiplet-based CPU costs half as much as a monolithic 16-core CPU, imagine the difference for much bigger GPUs.

I don't know about a fraction of the 6900 XT; the fan-out bridge and IFOP are very different technologies from a capability and cost perspective. The fan-out bridge is not quite EMIB, but applying that logic to Intel, Meteor Lake should be dirt cheap to make, because it's all just chiplets... right?

IFOP is a slow, cheap, and narrow interface, and AMD has had three generations of CPUs to drive its costs even further down. I don't know the exact AM5 changes, but AFAIK there were no major design changes, and from Matisse to Vermeer AMD literally did zero work aside from swapping the CCDs. The IOD was also archaic, didn't change for three years, and the X570 PCH just shared the same [probably lower-quality] die.

Out of necessity, fan-out is orders of magnitude faster than IFOP (and still incurs slight penalties at iso-clock!). I don't think AMD has kept it a secret that it's new packaging tech, and it's only cheap in comparison to making one gigantic die. There are also 5 or 6 connections on every GPU, as opposed to either 1 or 2 IFOP links, and both the GCD and MCD are brand-new designs, on the reasonably new N5 and N6 nodes.

Obviously, I don't believe AMD has no room to drop the XT and XTX prices, but chiplets aren't magic. Honestly, the much smaller GCD has some other benefits: thermal contact for Navi 31 seems consistently reasonable compared to the frequent 20-30°C+ difference between edge and hotspot temps on the 520 mm² Navi 21 (XTX vapor chamber aside), and the MCDs, while not nearly as low-power as the IOD, probably bring some additional thermal benefit from spreading things out a little.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Historic gap? lolwut? Did we forget the era of Raja and Poor Vega?
Both the price level and expectations are different this time around, but isn't the current situation in the high end pretty much exactly the same as at the Vega launch?
Nvidia releases first, and from the specs and numbers we expect a high-end competitor from AMD; what comes out ends up competing with a GPU one step down.
Vega 64 vs. 1080 Ti/1080; 7900 XTX vs. 4090/4080.
 
Joined
Jul 13, 2016
Messages
2,832 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Simply untrue. Navi 31 is on 5nm and 6nm with double the transistor count of Navi 21 on 7nm; there is no way it's cheaper than Navi 21. It's perhaps cheaper than it would otherwise have been as a monolithic die, but that's it.

TPU published an article not that long ago showing the cost of 5nm right now (not the launch price two years ago, which was very expensive) being only approx. 30% higher than 7nm. I have a comment on that article comparing costs to past nodes using the very chart published in it, and the cost increase for 5nm is smaller than average. Consider also that past nodes with large wafer price increases didn't bring gigantic price increases for consumers: going from 28nm to 14nm was a 100% increase in wafer pricing, yet video card pricing over that period increased only a bit above the rate of inflation.

When I said what I said, there was significant data supporting it.

FYI, 6nm is TSMC's refined 7nm, the same deal as 4nm being refined 5nm. TSMC's own website confirms this.

I don't know about "a fraction of the 6900 XT's cost"; the fan-out bridge and IFOP are very different technologies from both a capability and a cost perspective. The fan-out bridge is not quite EMIB, but by that logic Meteor Lake should be dirt cheap for Intel to make, because it's all just chiplets......right?

IFOP is a slow, cheap, narrow interface, and AMD has had 3 generations of CPUs to drive its costs even further down. I don't know the exact AM5 changes, but afaik there were no major design changes, and from Matisse to Vermeer AMD literally did zero work aside from swapping the CCDs. The IOD was also archaic and didn't change for 3 years, and the X570 PCH just shared the same [probably lower-quality] die.

Out of necessity, fan-out is orders of magnitude faster than IFOP (and still incurs slight penalties at iso-clock!). AMD hasn't kept it a secret that this is new packaging tech, and it's only cheap in comparison to making a gigantic die. There are also 5 or 6 fan-out connections on every GPU, as opposed to 1 or 2 IFOP links, and both the GCD and MCD are brand-new designs on reasonably new nodes (N5 and N6).

Obviously, I don't believe AMD has no room to drop the XT and XTX prices, but chiplets aren't magic. Honestly, the much smaller GCD has some other benefits: thermal contact on Navi 31 seems consistently reasonable compared to the frequent 20-30°C+ gap between edge and hotspot temps on the 520mm² Navi 21 (XTX vapor-chamber issue aside). The MCDs aren't nearly as low-power as the IOD, so there are probably some additional thermal benefits from spreading things out a little.

The primary cost saving from chiplets is that you break a large die into smaller ones (with secondary savings from reusing chiplets across multiple product lines). This dramatically decreases the amount of wafer you waste (there are many other benefits, but we're focusing on cost here). That said, breaking the die down too far increases cost and introduces additional difficulties in design and manufacturing. If you are already getting close to 100% yield with 90mm² chiplets, splitting further provides no cost savings, because the wafer wasted on bad dies is already extremely minimal.
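The diminishing returns from splitting an already-small die can be sketched with the standard Poisson defect-yield model, yield ≈ e^(−area × defect density). The defect density here is an illustrative assumption (in the ballpark of publicly reported leading-edge figures), not a vendor number:

```python
import math

D0 = 0.1  # defects per cm^2; illustrative assumption, not a vendor figure

def poisson_yield(die_area_mm2: float, d0_per_cm2: float = D0) -> float:
    """Fraction of dies with zero defects, assuming defects land
    randomly on the wafer (simple Poisson yield model)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-area_cm2 * d0_per_cm2)

for area in (520, 300, 90, 45):
    print(f"{area:>4} mm^2 die -> {poisson_yield(area):.1%} yield")
```

With these numbers a 520mm² die yields around 60%, a 90mm² chiplet over 90%, and halving it again to 45mm² only buys a few more points — which is the diminishing-returns argument above.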

As stated above, 6nm is a refined 7nm: https://www.anandtech.com/show/1422...echnology-7-nm-with-higher-transistor-density

"An evolution of TSMC's 7nm node, N6 will continue to use the same design rules, making it easier for companies to get started on the new process."

TSMC advertises 6nm as cost effective and easy to transition to for existing 7nm products.

In the chip-design world, I'd say chiplets are as close to magic as you are going to get: increased yield, potential for much lower costs, modularity and scalability, exceptional binning ability, higher bandwidth potential, and lower latency potential (which the University of Toronto demonstrated with an active interposer).
 
Joined
Aug 20, 2007
Messages
20,773 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
TPU published an article not that long ago showing the cost of 5nm right now (not the launch price two years ago, which was very expensive) being only approx. 30% higher than 7nm. I have a comment on that article comparing costs to past nodes using the very chart published in it, and the cost increase for 5nm is smaller than average. Consider also that past nodes with large wafer price increases didn't bring gigantic price increases for consumers: going from 28nm to 14nm was a 100% increase in wafer pricing, yet video card pricing over that period increased only a bit above the rate of inflation.

Even if you assume those newer nodes cost exactly the same as the old 7nm, which just can't be true, you can't get around the fact that Navi 31 has twice the number of transistors. In the best-case scenario, they cost about the same:

[image: AMD chiplet vs. monolithic cost comparison chart]
 
Joined
Jul 13, 2016
Messages
2,832 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Even if you assume those newer nodes cost exactly the same as the old 7nm, which just can't be true, you can't get around the fact that Navi 31 has twice the number of transistors. In the best-case scenario, they cost about the same:

View attachment 281541

The newer node doesn't cost the same; as I pointed out, it's about 30% more expensive.

Yield drops roughly exponentially with die size. This comes down to defects, edge loss, and the fact that larger dies waste more material at the edge of the wafer. If a wafer is cut into 520mm² dies, each defect costs you 520mm²; cut into 320mm² dies, each defect costs you only 320mm². Given TSMC's current defect density of roughly 0.11 per cm², that works out to around 78 defects per 300mm wafer (about 707cm²). And that's just the losses from defects; consider what else impacts yield. Wafers are circles and dies are rectangles or squares, so the larger you make each individual square or rectangle, the more wasted wafer you have at the edges. This is separate from edge loss, the increased chance of damaging dies at the rim of the wafer, which can be counteracted by reserving more space along the edge, but that means a fixed amount of wasted wafer space. These three factors all add up big time, especially when you are talking about dies the size of modern high-end GPUs.
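A rough back-of-envelope of the effects above — defect losses plus partial dies lost around the circular edge — using the common dies-per-wafer approximation and the Poisson yield model. The 0.11/cm² defect density is the figure cited above; everything else is simple geometry:

```python
import math

WAFER_D_MM = 300   # standard wafer diameter
D0 = 0.11          # defects/cm^2, the figure cited above

def gross_dies(die_area_mm2: float) -> int:
    """Common dies-per-wafer approximation: wafer area / die area,
    minus a correction for partial dies lost at the circular edge."""
    wafer_area = math.pi * (WAFER_D_MM / 2) ** 2
    edge_loss = math.pi * WAFER_D_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def good_dies(die_area_mm2: float) -> int:
    """Gross dies scaled by Poisson defect yield e^(-area * D0)."""
    y = math.exp(-(die_area_mm2 / 100) * D0)
    return int(gross_dies(die_area_mm2) * y)

for a in (520, 320):
    print(f"{a} mm^2: {gross_dies(a)} gross -> {good_dies(a)} good dies/wafer")
```

The smaller die wins twice: more gross dies fit on the circle, and a higher fraction of them dodge the defects.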

AMD's graphic you provided is a great illustration of this. The 3950X is also a 3-die product (2 CPU chiplets plus the IOD) that weighs in at 272mm² across all 3 dies. If AMD is seeing more than a halving of costs at that small total die size, apply the concepts explained above to coming down from a 520mm² die. A 30% increase in cost per wafer does not compare to the cost savings on big dies when even much smaller dies are paying less than half.

Transistor count isn't a factor directly relevant to cost. You pay a fixed amount per wafer regardless of the number of transistors on each die cut from it. The three factors in the cost of an individual die are 1) wafer cost, 2) die size, and 3) defect density. SRAM (cache, registers, etc.) makes up 70%+ of all transistors on modern CPUs (more in X3D and server products). Cost per transistor might go up, but that's not how prices are determined.
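Those three factors can be folded into one hedged cost sketch. The wafer prices below are made-up placeholders (only their ~30% ratio follows the article cited earlier), and note that transistor count never appears in the formula:

```python
import math

WAFER_D_MM = 300  # standard wafer diameter

def cost_per_good_die(wafer_cost_usd: float, die_area_mm2: float,
                      d0_per_cm2: float) -> float:
    """Cost of one working die from only the three factors named above:
    wafer cost, die size, and defect density."""
    wafer_area = math.pi * (WAFER_D_MM / 2) ** 2
    # gross dies: wafer area / die area, minus partial dies at the round edge
    gross = wafer_area / die_area_mm2 - math.pi * WAFER_D_MM / math.sqrt(2 * die_area_mm2)
    yield_frac = math.exp(-(die_area_mm2 / 100) * d0_per_cm2)  # Poisson model
    return wafer_cost_usd / (gross * yield_frac)

# Hypothetical prices: older-node wafer at $10k, newer node ~30% higher
print(f"520 mm^2 die, cheap node:    ${cost_per_good_die(10_000, 520, 0.09):.0f}")
print(f"300 mm^2 die, pricier node:  ${cost_per_good_die(13_000, 300, 0.09):.0f}")
```

Even with the 30% pricier wafer, the smaller die comes out well under the big one per good die, which is the whole chiplet argument in one division.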
 