
GeForce value over time and 40-series pricing.

Joined
Jul 13, 2016
Messages
1,653 (0.69/day)
Processor Ryzen 7700X
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) EVGA 1080 Ti
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
You may never get an Nvidia card to review ever again unless you play ball. At the prices these things cost, you'd have to be Linus to buy them all for review.
Also, I'm sure those $10,000 Louis Vuitton bags get editorial reviews somewhere, if you don't care about money.

Plus the only way to get a pre-release or launch review up is through Nvidia.

I think at some point both reviewers and consumers are going to make a stand to make change happen.
 
Joined
Feb 18, 2005
Messages
3,976 (0.61/day)
Location
Ikenai borderline!
... despite it being 71.8% more expensive and having a 39.6% smaller die than the 3080
What the actual fuck does die size have to do with the price that NVIDIA has to pay TSMC for GPUs manufactured on a leading-edge node? Nothing, that's what.

I wish people would educate themselves before spouting such stupidity.
 
Joined
Nov 15, 2021
Messages
1,568 (3.48/day)
Location
Knoxville, TN, USA
System Name Work Computer | Laptop | Unfinished Computer
Processor Core i7-6700 | Core i7-7700HQ | Ryzen 5 5600X
Motherboard Dell Q170 | Lenovo ThinkPad | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Some Fans | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 2x8GB Assorted | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | Intel HD 630 + Quadro M620 Mobile | RTX 2080 Ti FE
Storage Crucial BX500 2TB | WD Black SN750 512GB + Intel H20 512GB + 1TB HDD | TBD
Display(s) 3x LG QHD 32" GSM5B96 | Internal 1080p | TBD
Case Dell | Lenovo ThinkPad | Heavily Modified Phanteks P400
Audio Device(s) Half-broken Earbuds | Headphones | TBD
Power Supply Dell TFX Non-standard | Lenovo 225W | EVGA BQ 650W
Mouse Logitech M275 | Monster No-Name $7 Gaming Mouse| TBD
Keyboard Logitech K345 | Lenovo's Incredible Keyboard | TBD
Software Quotesoft, Adobe Acrobat, DaVinci Resolve | GAMES | MOAR GAMES
What the actual fuck does die size have to do with the price that NVIDIA has to pay TSMC for GPUs manufactured on a leading-edge node? Nothing, that's what.

I wish people would educate themselves before spouting such stupidity.
I tend to agree. Smaller die? Yes, but a wafer now costs $20,000 instead of $7,000.
 
Joined
Feb 18, 2005
I tend to agree. Smaller die? Yes, but a wafer now costs $20,000 instead of $7,000.
And NVIDIA doesn't care because its compute customers - its fastest-growing revenue segment - are happy to pay whatever NVIDIA charges for its new GPUs, because the reduction in compute time of said GPUs is worth far more than the asking price. Such is capitalism.
 
Last edited:
Joined
Jan 14, 2019
Messages
4,846 (3.26/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Silent Loop 2 280 mm
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) AMD Radeon RX 6750 XT
Storage 1 TB Crucial P5 Plus, 2 TB Corsair MP600 R2
Display(s) Samsung C24F390, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Cherry MW 8 Advanced
Keyboard MagicForce 68
Software Windows 10 Pro
Benchmark Scores Unigine Superposition 1080p Ultra: 7150, Cinebench R23 multi: 19250, single: 1975.
I find it funny that we're still extrapolating Nvidia Ada prices and spending time discussing trends and stuff when we have AMD heavily discounting RDNA 2 cards right now.

Conclusion: Who cares about the 4080 and 4090? Let them rot on store shelves. ;)
 
Joined
Apr 30, 2020
Messages
587 (0.58/day)
System Name New One
Processor Ryzen 5 5600X
Motherboard MSI MGP Gaming WIFI
Cooling Stock
Memory Corsair Vengeance pro RGB 3200mhz 16Gbs
Video Card(s) EVGA 2080 Ti FTW 3
Storage Western digital Sata SDD 500gb
Display(s) HP X24i
Case Corsair 4000D Airflow
Power Supply Corsair 650M
Mouse Lifeworks gaming mouse lol
Keyboard lifeworks monster rgb light keyboard
Another apologist thread suggesting, via mental gymnastics and only a vague acknowledgement of what the pandemic and mining craze did to pricing over the last few years, that Nvidia is being fair.

My arse; you're talking out of your arse.

Stop propping up shitty ideals to make right that which is absolutely wrong.

The last five years happened.

Don't imply tacit fairness where pure bullshit is enshrined.

Especially when Nvidia's MSRP is such a tangential, irrelevant pile of turd: try finding a 4090 at MSRP.

Few 4080s are being sold at MSRP.

Why is the one that was released in 2018 $100 more than the one that was released in 2020?



There is no sane reason for that pricing; the card is 5 years old.
 
Last edited:
Joined
Mar 10, 2010
Messages
11,044 (2.34/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II/Trig
Processor Amd R5 5600G/ Intel 8750H/3800X
Motherboard Crosshair hero8 impact/Asus/crosshair hero 7
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Sapphire refference Rx vega 64 EK waterblocked/Rtx 2060/GTX 1060
Storage Silicon power 1TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dellshiter
Case Lianli p0-11 dynamic/strix scar2/aero cool shiter
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock /850 watt ?
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506

bug

Joined
May 22, 2015
Messages
11,243 (3.99/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Stop giving Nvidia a pass on its greed during the pandemic. They dedicated a portion of their production just to mining cards,
Which mining cards?
allowed the bulk sale of cards directly to miners,
Nvidia doesn't make video cards. They "make" a small number of reference ones, but that's it. What exactly did they "allow"?
and jacked up MSRP of cards released later like the 3080 Ti to cash in as much possible.
That they did and mining was probably one of the reasons. But you have no proof it was the only reason.

Let's cool off a bit and separate facts from guesses, shall we?
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
5,205 (3.24/day)
Location
Winnipeg, Canada
System Name Street Meat
Processor AMD R7 5900X
Motherboard Asus Strix B550-XE
Cooling Thermalright Frost Commander 140 White
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) EVGA RTX 3070 Ti FTW3 Ultra
Storage WD SN850 1TB, SN750 1TB, SN750 500GB, Asus Hyper M.2, 2x Intel 545S 256GB, 1TB Toshiba spinner
Display(s) LG 50UP7100
Case Fractal Torrent Compact RGB
Audio Device(s) JBL 2.1 Deep Bass
Power Supply EVGA G+ 750, Monster HDP1800
Mouse Zowie EC2 Evo
Keyboard Logitech G213
Software Yes
Benchmark Scores Yes
Don't kid yourself if you think Nvidia doesn't control the flow. There was no shortage from the suppliers for them. I don't buy it. They are big enough to have everything to build cards with well in advance, they have the buying power, they have the presence. They controlled the flow of numbers out the door, they controlled what cards went where, they controlled who got what, which means they controlled the market. Profits up 300%? Can't say I am surprised.
 
Joined
Jul 13, 2016
What the actual fuck does die size have to do with the price that NVIDIA has to pay TSMC for GPUs manufactured on a leading-edge node? Nothing, that's what.

I wish people would educate themselves before spouting such stupidity.

Die size is everything when it comes to the price Nvidia pays.

You want an exact comparison on the latest node? Great, let's compare the 4080 to the 4090.

You are talking about 60% of the die size at 75% of the cost. Mind you that's before you consider that yields increase exponentially as you reduce die size. The 4080 isn't expensive because leading edge nodes are expensive, it's expensive because Nvidia thinks it can get away with it. I'd be surprised if the 4080 had even half the relative cost compared to the 4090 for Nvidia simply because of how much better that smaller chip will yield.
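To put rough numbers on that yield argument, here's a back-of-the-envelope sketch using the textbook Poisson defect model. The die areas are the published AD102/AD103 figures, but the defect density and wafer cost are assumed purely for illustration:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross dies per wafer (standard approximation, ignores scribe lines)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Fraction of dies that are defect-free under a Poisson defect model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

wafer_cost = 17_000  # assumed USD per wafer, illustration only
for name, area in [("AD102 (4090)", 608), ("AD103 (4080)", 379)]:
    good = dies_per_wafer(area) * poisson_yield(area)
    print(f"{name}: ~{good:.0f} good dies, ~${wafer_cost / good:.0f} per good die")
```

With these made-up-but-plausible numbers, the smaller die comes out at less than half the per-die cost of the big one, which is the point: gross die count and yield compound in the small die's favor.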

I tend to agree. Smaller die? Yes, but a wafer now costs $20,000 instead of $7,000.

As pointed out above, even relative to Nvidia products using the exact same node, the pricing doesn't add up.

FYI, the $20,000 figure you are quoting is incorrect. $20,000 is for 3nm, and that's only at launch. Node costs decrease over time, which is why AMD isn't paying anywhere near $16,900 for its 5nm wafers. This is also ignoring yields, which are higher on TSMC's newer nodes than on its 7nm nodes. If the cost per wafer goes up but the yield increases, those two factors can very well offset each other.
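That offsetting effect is easy to sanity-check with a toy cost-per-good-die calculation; every number below is an assumption picked only to show the mechanism:

```python
def cost_per_good_die(wafer_cost, gross_dies, yield_fraction):
    """Wafer cost divided by the number of defect-free dies it produces."""
    return wafer_cost / (gross_dies * yield_fraction)

# Assumed figures: the new node's wafer costs ~89% more, but better
# yield means the cost of each good die rises considerably less.
old_node = cost_per_good_die(9_000, gross_dies=150, yield_fraction=0.70)
new_node = cost_per_good_die(17_000, gross_dies=150, yield_fraction=0.85)

print(f"old node: ${old_node:.0f} per good die")
print(f"new node: ${new_node:.0f} per good die")
print(f"wafer cost up {17_000 / 9_000 - 1:.0%}, good-die cost up {new_node / old_node - 1:.0%}")
```

The wafer price jump always overstates the good-die price jump whenever yield improves; whether it offsets a little or a lot depends on the actual yield numbers, which neither TSMC nor Nvidia publish.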

I have pointed this out elsewhere on the TPU forums (check those threads for more details) as well, but recent wafer price increases are not the largest price increases in GPU history. If 3nm only costs $20,000 at launch, that would be a very tame increase over the price of 5nm. Historically, GPU prices do not increase every time wafer cost increases, and in some cases GPU prices have actually decreased. Wafer cost is only a single variable, yet many defenders of Nvidia's pricing point to it despite this fact and the historical data.

Which mining cards?


Nvidia doesn't make video cards. They "make" a small number of reference ones, but that's it. What exactly did they "allow"?

Nvidia controls the size of branding on the box, how petty it can be with GPU allotment, what its partners can call their cards, etc. We all remember the GPP and how it prompted AIBs to reveal just how controlling Nvidia is of anything with an Nvidia GPU in it. Are you telling me they weren't aware that AIBs were selling cards directly to miners? No chance Nvidia wasn't aware and consenting.

That they did and mining was probably one of the reasons. But you have no proof it was the only reason.

Let's cool off a bit and separate facts from guesses, shall we?

Individually, these actions would have been anti-consumer; together, they prove that Nvidia had the mens rea consistent with catering to miners.

Nvidia may have had more reasons of course but those are incidental to the conversation.
 
Last edited:
Joined
Nov 15, 2021
FYI, the $20,000 figure you are quoting is incorrect. $20,000 is for 3nm, and that's only at launch.
Launch price of the 5nm node was $16K. TSMC has had at least one (I think two) major price increases since, and Nvidia is using a custom 4nm node. I think it is probably around $18K. However, reading into it, it looks like Samsung 8N may well have been between $7K and $9K per wafer.

It was a bit of hyperbole, meant to point out that a smaller die with more transistors isn't necessarily cheaper.

That being said, I did some calculations by actual specifications that consumers "should" care about (shader count, RAM bandwidth, VRAM quantity, TBP, etc) and found that there was no way in heck to assign a value to each of these that could account for the 4080 costing more than around $900.
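As an illustration of what that spec-weighting exercise looks like, here's a toy version. The shader counts, bandwidth, and VRAM figures are the cards' public specs, but the dollar weights are invented for the example and are not the poster's actual calculation:

```python
# Public headline specs: (shaders, memory bandwidth in GB/s, VRAM in GB).
specs = {
    "RTX 4090": (16384, 1008, 24),
    "RTX 4080": (9728, 717, 16),
}
# Hypothetical dollar weights per unit of each spec -- made up for
# illustration, not a real pricing model.
weights = (0.05, 0.30, 8.0)  # $/shader, $/(GB/s), $/GB

value = {card: sum(w * x for w, x in zip(weights, s)) for card, s in specs.items()}
for card, v in value.items():
    print(f"{card}: spec-weighted value ~${v:.0f}")
```

With these weights the 4080 comes out around $830; even rescaling everything so the 4090 lands at its $1,599 MSRP only lifts the 4080 to roughly $1,000, still well under its $1,199 MSRP.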
 
Joined
Jul 13, 2016
Launch price of the 5nm node was $16K. TSMC has had at least one (I think two) major price increases since, and Nvidia is using a custom 4nm node. I think it is probably around $18K. However, reading into it, it looks like Samsung 8N may well have been between $7K and $9K per wafer.

It was a bit of hyperbole, meant to point out that a smaller die with more transistors isn't necessarily cheaper.

That being said, I did some calculations by actual specifications that consumers "should" care about (shader count, RAM bandwidth, VRAM quantity, TBP, etc) and found that there was no way in heck to assign a value to each of these that could account for the 4080 costing more than around $900.

The node Nvidia is using is essentially 5nm+, the same way TSMC's 6nm is just 7nm+. If it does cost more, it's likely by a very small amount; otherwise it wouldn't really be worth it. Akin to TSMC's 6nm (which is design-compatible with 7nm), the density increase is extremely small, with the upside of compatibility with the parent node. In essence, Nvidia could have taped out for 5nm and the design would have been compatible with TSMC's 4nm.

Here's TSMC's page for the 5nm which references 4nm: https://www.tsmc.com/english/dedicatedFoundry/technology/logic/l_5nm

4nm doesn't have its own section, for the reasons stated above and on TSMC's website.

Yes, the 5nm price estimates were in the $16,000 to $17,000 range. Current estimates, over two and a half years later, place it at around $10,000.

As you pointed out, even under the most generous circumstances it's hard to explain the price of the 4080.
 

bug

Joined
May 22, 2015
Messages
11,243 (3.99/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yes, because overpriced, repurposed defective dies moved such large volumes...
Nvidia controls the size of branding on the box, how petty it can be with GPU allotment, what its partners can call their cards, etc. We all remember the GPP and how it prompted AIBs to reveal just how controlling Nvidia is of anything with an Nvidia GPU in it. Are you telling me they weren't aware that AIBs were selling cards directly to miners? No chance Nvidia wasn't aware and consenting.
Of course Nvidia was aware. Consenting? I'm going to go out on a limb and guess you don't have a shred of evidence of that.
Individually, these actions would have been anti-consumer; together, they prove that Nvidia had the mens rea consistent with catering to miners.

Nvidia may have had more reasons of course but those are incidental to the conversation.
All you have proven is that you can pick, choose and misrepresent to make your point. Pretty much the opposite of what I have suggested.
 
Last edited:
Joined
Jul 13, 2016
Yes, because overpriced, repurposed defective dies moved so large volumes...

You are going to need to provide a source for that. I remember rumors suggesting this, but it was never verified that mining GPUs could never have become gaming GPUs. The rumors also claimed that they would ease GPU prices and supply, which never happened.

Of course Nvidia was aware. Consenting? I'm going to go out on a limb and guess you don't have a shred of evidence of that.

All you have proven is that you can pick, choose and misrepresent to make your point. Pretty much the opposite of what I have suggested.

Given that we know Nvidia controls everything tightly, Nvidia being aware of the mass sales of GPUs to miners is consent.

In my last comment I detailed their ability to commit such an act, for which the evidence is strong. Saying I provided no evidence is simply incorrect; proving that a party had the mental state to commit an act is one of the most important things the prosecution has to do in court.

Your logic seems to be that only direct evidence is admissible, but that is not the case even in criminal court, where the bar is higher. In civil court (the closest equivalent), the bar is a preponderance of the evidence (51% likely or greater).

If we demonstrate that Nvidia, which had vast control over its products and partners, did nothing when prices skyrocketed, while also providing evidence that it had the mental state, motive, and past behavior to directly or indirectly allow this to happen, then we've demonstrated, by a preponderance of the evidence, that Nvidia is complicit in the price increases and culpable as a result.
 
Joined
Feb 18, 2005
Historically, GPU prices do not increase every time wafer cost increases, and in some cases GPU prices have actually decreased.
Historically doesn't mean shit because older nodes didn't require vastly expensive and slow EUV machines. Stop comparing apples to oranges.
 
Joined
Jul 13, 2016
Historically doesn't mean shit because older nodes didn't require vastly expensive and slow EUV machines. Stop comparing apples to oranges.

You already made this point earlier, to which I replied with this:
Die size is everything when it comes to the price Nvidia pays.

You want an exact comparison on the latest node? Great, let's compare the 4080 to the 4090.

You are talking about 60% of the die size at 75% of the cost. Mind you that's before you consider that yields increase exponentially as you reduce die size. The 4080 isn't expensive because leading edge nodes are expensive, it's expensive because Nvidia thinks it can get away with it. I'd be surprised if the 4080 had even half the relative cost compared to the 4090 for Nvidia simply because of how much better that smaller chip will yield.

Which you never replied to.

Past node transitions have been vastly more expensive, relatively speaking: https://www.techpowerup.com/301393/...-cpus-gpus-to-be-more-expensive?cp=1#comments

I commented on that very article upon this very topic:
A 25% increase is actually pretty small. Here is that compared to other nodes provided on the chart in this article:

90nm to 40nm -> 30% price increase
40nm to 28nm -> 15% price increase
28nm to 10nm -> 100% price increase
10nm to 7nm -> 66.66% price increase

To say that such a modest increase in the cost of wafers means GPU prices will increase ignores the fact that we've had far more costly wafer price increases in the past, yet GPU prices either did not increase or increased only modestly.

Using wafer pricing to justify increasingly high GPU costs is misleading because it's only a single factor in the cost of a GPU and ignores other factors, like how costs are spread across an increasingly large customer base.

It wasn't until Turing that GPU prices exploded, and that was clearly motivated by greed rather than node costs: Turing used a cheap, mature node, and with Ampere, Nvidia used the much cheaper Samsung 8nm while AMD was selling its cards for less on the more expensive TSMC 7nm.

We've seen GPU price hikes for several generations in a row now; the xx80-class GPU is now more than DOUBLE what it was with the 980.

If GPU prices do increase it's due to greed, plain and simple. Nvidia and AMD have already jacked GPU prices up enough to cover the next 20 years of wafer price increases.

Mind you the numbers above only include those from the provided graphs. In comparison to the cost increases of prior nodes, 5nm is normal and 3nm is below normal.

Again though, wafer pricing is far from the only variable determining card pricing. The larger the GPU market has grown, the more R&D costs are spread out. There's also the fact that TSMC's 5nm yields better than its 7nm, and 3nm might have even higher yields given the need for less multi-patterning. Wafer costs mean even less to AMD, which demonstrated that a chiplet-based 16-core CPU costs less than half as much as the same CPU built monolithically. Those savings only scale up relative to a monolithic design as you increase the total die area. I can only imagine the kind of yield and cost savings AMD is going to get from the 7000 series, given that GPUs are larger than CPUs. Just a few other things to consider in addition to wafer pricing.
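The chiplet point follows from the same yield math. A minimal sketch, with an assumed defect density and an illustrative 400 mm² product (not AMD's actual figures):

```python
import math

def die_yield(area_mm2, d0=0.1):
    """Poisson yield model; d0 is an assumed defect density in defects/cm^2."""
    return math.exp(-d0 * area_mm2 / 100)

total_area = 400  # mm^2 of logic in the product, illustrative

# Monolithic: the whole 400 mm^2 die must come out defect-free.
mono_silicon = total_area / die_yield(total_area)

# Chiplets: four 100 mm^2 dies; a defect scraps only one small die.
chiplet_silicon = 4 * (total_area / 4) / die_yield(total_area / 4)

print(f"monolithic: ~{mono_silicon:.0f} mm^2 of wafer per good product")
print(f"4 chiplets: ~{chiplet_silicon:.0f} mm^2 of wafer per good product")
```

With these assumptions it's roughly 597 vs 442 mm² of wafer per good product, and the gap widens as total area or defect density grows, which is exactly why the savings scale up with die size.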
 
Joined
Nov 15, 2021
I can only imagine the kind of yield and cost savings AMD is going to get from the 7000 series being that GPUs are larger than CPUs.
+1 to this. Even if the main die is quite large, the savings will be considerable, as yields decrease exponentially with die size.
 

bug

Joined
May 22, 2015
You are going to need to provide a source for that. I remember rumors suggesting this, but it was never verified that mining GPUs could never have become gaming GPUs. The rumors also claimed that they would ease GPU prices and supply, which never happened.



Given that we know Nvidia controls everything tightly, Nvidia being aware of the mass sales of GPUs to miners is consent.

In my last comment I detailed their ability to commit such an act, for which the evidence is strong. Saying I provided no evidence is simply incorrect; proving that a party had the mental state to commit an act is one of the most important things the prosecution has to do in court.

Your logic seems to be that only direct evidence is admissible, but that is not the case even in criminal court, where the bar is higher. In civil court (the closest equivalent), the bar is a preponderance of the evidence (51% likely or greater).

If we demonstrate that Nvidia, which had vast control over its products and partners, did nothing when prices skyrocketed, while also providing evidence that it had the mental state, motive, and past behavior to directly or indirectly allow this to happen, then we've demonstrated, by a preponderance of the evidence, that Nvidia is complicit in the price increases and culpable as a result.
You play a fine game of relying on rumors and "facts", I'm gonna leave you to it.
 
Joined
Jul 13, 2016
You play a fine game of relying on rumors and "facts", I'm gonna leave you to it.

I mean, you are replying to my post, which asked you for the source of the rumor you posted.

That's some irony right there. Of course you are going to leave; you can't prove that rumor.
 

bug

Joined
May 22, 2015
I mean, you are replying to my post, which asked you for the source of the rumor you posted.

That's some irony right there. Of course you are going to leave; you can't prove that rumor.
Well, if you must insist...

"Unlike the fully unlocked GeForce RTX 3090 Ti, which uses the same GPU but has all 10752 shaders enabled, NVIDIA has disabled some shading units on the CMP 90HX to reach the product's target shader count." Disabled shaders - you don't do that with working dies.

Furthermore: https://www.techpowerup.com/gpu-specs/?mfgr=NVIDIA&generation=Mining GPUs&sort=name
Mining-oriented SKUs dating back to Pascal. Thus, not something concocted during the past mining craze years.

Want a more direct confirmation CMP did not choke GPU supplies? Here: $24m revenue when a card sells for $8k means a grand total of 3,000 units sold.

As for the rest of your claims, I'm going to draw you a picture and apply Occam's razor.


If I were a miner with $$$ to burn, the most straightforward way for me to get my hands on video cards would be from an AIB partner. Your theory, where Nvidia somehow forces not one but all partners to sell to miners to whom Nvidia itself has no direct relation, is... shall we say, far more far-fetched and therefore far more improbable.
 
Joined
Nov 15, 2021
Unlike the fully unlocked GeForce RTX 3090 Ti, which uses the same GPU but has all 10752 shaders enabled, NVIDIA has disabled some shading units on the CMP 90HX to reach the product's target shader count." Disabled shaders - you don't do that with working dies.
3090 was the same.
 

bug

Joined
May 22, 2015
3090 was the same.
No, it wasn't. The 3090 has 10,496 working shaders; the CMP 90HX only has 6,400. Clearly a salvaged part with almost half the shaders disabled, I'd say.
 
Joined
Nov 15, 2021
No, it wasn't. 3090 has 10,496 working shaders, CMP 90HX only has 6,400. Clearly a salvaged part, with almost half the shaders disabled, I'd say.
I just meant that some were disabled; that doesn't mean "can't make a gaming card."

Heck, they even made a 3070 Ti with the GA102, with only 6,144 shaders.
 