
AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory

Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
You should look up the difference in cost compared to a couple of years ago. Over the last 5 years we have already seen the price per wafer at TSMC quadruple.
Yes, that is true. At the same time chip densities have increased dramatically, allowing for the same performance level at much smaller die sizes. Besides, more performance also necessitates a bigger die, so it's not like this disproportionately hurts the value of low performance GPUs. The same value scaling should apply regardless of wafer costs, though absolute pricing might of course change due to this.
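The tradeoff being described - pricier wafers offset by denser, smaller dies - can be sketched numerically. All numbers below are made up for illustration, and the dies-per-wafer formula is only the standard rough approximation, not foundry data:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Rough dies-per-wafer estimate: usable wafer area minus an edge-loss term."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost_usd: float, die_area_mm2: float,
                 yield_fraction: float = 1.0) -> float:
    """Wafer cost divided by (estimated) good dies on a 300 mm wafer."""
    good = dies_per_wafer(300, die_area_mm2) * yield_fraction
    return wafer_cost_usd / good

# Hypothetical scenario: wafer price quadruples, but density gains halve the die.
old = cost_per_die(3000, 250)   # older node, bigger die
new = cost_per_die(12000, 125)  # pricier node, smaller die
print(f"old ${old:.2f} per die, new ${new:.2f} per die")
```

Even with a 2x shrink, a 4x wafer price still roughly doubles per-die cost in this toy example, which is consistent with both posters agreeing that absolute pricing moves with wafer costs.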
 
Joined
Feb 13, 2012
Messages
522 (0.12/day)
You sound like you're arguing that perf/$ scales linearly across GPU lineups. This has never been the case. Except for the very low end, where value is always terrible, you always get far more bang for your buck in the $150-400 mid-range than anything above. I mean, just look at the data:

Results are similar at other resolutions, though more expensive GPUs "improve" at 4k. As for value increasing as you drop in price: just look at the 5600 XT compared to the 5700 XT in those charts. Same architecture, same generation, same die, yet the cheaper card delivers significantly higher perf/$ than the more expensive one.
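The comparison being made is just frame rate divided by price. A tiny sketch with hypothetical FPS and MSRP numbers (not actual chart data) shows how the cheaper card can win on value while losing on raw performance:

```python
# Illustrative perf-per-dollar comparison; FPS figures are invented, MSRPs are launch prices.
cards = {
    "5600 XT": {"avg_fps": 100, "msrp_usd": 279},
    "5700 XT": {"avg_fps": 125, "msrp_usd": 399},
}

for name, c in cards.items():
    c["fps_per_dollar"] = c["avg_fps"] / c["msrp_usd"]
    print(f"{name}: {c['fps_per_dollar']:.3f} FPS per dollar")
```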

As for your comparison: you're comparing across generations, so any value comparison is inherently skewed to the point of being invalid. If it weren't the case that you got more bang for your buck in a new generation, something would be very wrong. Admittedly things have been wrong in that regard for a couple of generations now, and the 3080 hasn't fixed it either - it has just brought us back closer to where things should be. It is absolutely to be expected that all future GPUs from both manufacturers at lower points in their product stacks will deliver significantly better perf/$ than the 3080. That's the nature of premium products: you pay more for the privilege of having the best. $700 GPUs have never been a good value proposition.

We're literally a week post launch. Demand has been crazy, scalpers with bots have run rampant, and everything is sold out. Give it a while to normalize before you comment on "real-world" prices. And while Nvidia previously pulled the "here's the MSRP, and here's the somehow premium, but also baseline FE card at a premium price" shenanigans, with cards at MSRP being nearly nonexistent, to be fair to them they seem to have stopped doing that this time around.

Right. As if I ever said that. Maybe actually read my post? Nice straw man you've got there.

Again: accounted for, if you had bothered to actually read my post.

Let me refresh your memory:

What I'm saying here is that the SoC TDP accounting for only 50% of the PSU's rating - a rating which might include PSU losses due to efficiency - sounds a bit low. I'm asking you to source a number that you're stating as fact. Remember, you said:

No source, no mention that this is speculation or even anything beyond established fact. Which is what I was asking you to provide.

Unsourced, based on rumors and speculation. The page says as much.

That is at least a correlation, but correlation does not mean that the rumor you are quoting as fact is actually true. This is what you call speculation.

And here is the core of the matter: you are repeating rumors and "what you think is likely" as if it is indisputable fact. It is of course entirely possible that the PS5 SoC has a TDP somewhere around 175W - but you don't have any actual proof of this. So please stop repeating rumors as if they are facts. That is a really bad habit.
Oh, I understood what you said; I was just pointing out that power rating and TDP are related but not the same thing. And I linked the PS4 to show the correlation: a TDP rating being 50% of the complete system power rating is pretty normal. And yes, we are speculating - that's what most of these comments are - so I'm not sure why you are so riled up about it. Even my original post was mostly making educated guesses about what the lineup will likely look like. As for the PS5, most of the specs are basically out, so it's less of a rumor at this point. But we will wait and see.
 
You still didn't understand my point. I'm talking about the MSRP, not the current prices due to low supply; the official price is set higher by nVidia for all markets compared to the US and Canada, sometimes by stupid margins, even in Taiwan, so the price/performance of the card is very poor.
Uhm, that's the opposite of what you just said:
I'm not talking about official prices around the world
But okay, that makes sense to a certain degree. How much of this is down to currency fluctuations, different levels of VAT, and so on? Likely quite a lot. The economic situation around the world has also changed quite dramatically during the past decade, with the US dollar strengthening relative to a lot of currencies, which sadly tends to have knock-on effects for importing industries (i.e. the higher prices impact multiple levels of a business, driving up costs and prices even more). There are also signs indicating a change in attitude: the US has previously been the cash cow, with a lot of other countries being given relatively advantageous pricing for various reasons (such as EU prices with VAT coming very close to US prices without sales tax). This seems to be gone now across most industries. Sadly that means stuff gets more expensive for most of us. But it's also an across-the-board increase - Turing GPUs were certainly more expensive than any other series of GPUs here in Norway and Sweden, as were Pascal before them, etc.

Oh, I understood what you said; I was just pointing out that power rating and TDP are related but not the same thing. And I linked the PS4 to show the correlation: a TDP rating being 50% of the complete system power rating is pretty normal. And yes, we are speculating - that's what most of these comments are - so I'm not sure why you are so riled up about it. Even my original post was mostly making educated guesses about what the lineup will likely look like. As for the PS5, most of the specs are basically out, so it's less of a rumor at this point. But we will wait and see.
All I did was ask for a source for the 175W number in your original post, as a) I hadn't heard it before, and b) it sounded quite low. IMO it still does, and as the number you stated as fact is an unsourced and unsubstantiated rumor I'll hold off until we see the actual power draw of the console. I don't think it'll be 350W by any means, but 175W at those clock speeds sounds too good to be true.
This has been going around Reddit, and basically sums up the rumors:

View attachment 169691

It's a bit of a meme and non-serious, but I think y'all get the gist. There are so many conflicting rumors on Navi that it's hard to take any of these "leaks" seriously.
That looks about right. I think people are reading too much into a lot of things. It may well be that the 192/256-bit numbers are some sort of trickery (most leaks are controlled and on purpose, after all), it might indeed have HBM2e, or it might just have a plain GDDR6 bus. Or some other weirdness, like a cache. Until we have something more concrete on our hands, this is pretty useless. Interesting on paper, but useless without actual information.
 
All I did was ask for a source for the 175W number in your original post, as a) I hadn't heard it before, and b) it sounded quite low. IMO it still does, and as the number you stated as fact is an unsourced and unsubstantiated rumor I'll hold off until we see the actual power draw of the console. I don't think it'll be 350W by any means, but 175W at those clock speeds sounds too good to be true.

Reasonable enough - it's only speculation, so nothing is concrete. I just personally think it's doable. Let's lay down some of the actual facts or published info that we can relate to:
RX 5700 has the same number of cores as the PS5 (2304)
RX 5700 has a TDP of 225 W
RDNA2 is 50% more efficient than RDNA1, according to AMD

Now we can do our speculation/analysis:
If we do the math we can safely assume that 2304 RDNA2 shaders are 50% more efficient than RDNA1 shaders, so that leaves us with 150 W at the same performance as the RX 5700. But in this case it is clocked around 10-15% higher, so that could account for 175 W rather than 150 W. Also, having CPU cores increases overall package power (but not by much, since the CPU is clocked low and relies on SmartShift technology).

Other things to consider:
Clock speed for the PS5 is the turbo speed
TSMC 7nm is more mature
AMD is more likely to clock chips higher and sell them as a higher tier product than to clock them low at the sweet spot of performance/watt (this is in relation to this article and my first comment)
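The estimate above can be written out as arithmetic. The inputs are the rumored and published figures quoted in the thread, and the clocks-to-power step uses the simplest possible linear assumption - the result is speculation, not fact:

```python
rx5700_tdp_w = 225          # published RX 5700 board TDP, per the post above
rdna2_perf_per_watt = 1.5   # AMD's "up to 50%" perf/W claim over RDNA1
clock_uplift = 0.125        # PS5 GPU rumored ~10-15% higher clocks; midpoint used

# Same performance at RDNA2 efficiency:
base_w = rx5700_tdp_w / rdna2_perf_per_watt      # 150 W
# Naive linear clocks-to-power assumption:
gpu_w = base_w * (1 + clock_uplift)              # ~169 W
print(f"estimated PS5 GPU power: {gpu_w:.0f} W")
```

This lands in the neighborhood of the disputed 175 W figure, but only if the linear clock scaling assumption holds.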
 
Joined
Oct 22, 2014
Messages
13,210 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
All these rumours surrounding Navy Flounder sound fishy to me. :p
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.95/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
All these rumours surrounding Navy Flounder sound fishy to me. :p
It sounds like the Kobayashi Maru has set sail for the promised land.
 

Deleted member 185088

Guest
But okay, that makes sense to a certain degree. How much of this is down to currency fluctuations, different levels of VAT, and so on? Likely quite a lot. The economic situation around the world has also changed quite dramatically during the past decade, with the US dollar strengthening relative to a lot of currencies, which sadly tends to have knock-on effects for importing industries (i.e. the higher prices impact multiple levels of a business, driving up costs and prices even more). There are also signs indicating a change in attitude: the US has previously been the cash cow, with a lot of other countries being given relatively advantageous pricing for various reasons (such as EU prices with VAT coming very close to US prices without sales tax). This seems to be gone now across most industries. Sadly that means stuff gets more expensive for most of us. But it's also an across-the-board increase - Turing GPUs were certainly more expensive than any other series of GPUs here in Norway and Sweden, as were Pascal before them, etc.
While I agree with your logic, unfortunately it doesn't hold in all markets, I feel it's nVidia's fault and its partners, given that comparable products don't have the same unjustified price hike, like products from Intel, AMD and consoles/phone makers.
 
Joined
Jul 10, 2015
Messages
748 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
All these rumours surrounding Navy Flounder sound fishy to me. :p

Flounder:
verb (used without object) : to struggle with stumbling or plunging movements (usually followed by about, along, on, through, etc.)
CMO of AMD is on a slippery road

On second thought, maybe it is water cooled. Fury RX.
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
RedGamingTech and Moore's Law is Dead have leaked information about the cache:


Textures are not required for intensive calculations; what a cache is used for is storing values that need to be accessed multiple times for multiple calculations.
It might be misinformation from AMD, but it's not completely crazy, especially if you listen to Mark Cerny's diatribe in the second video.

Yes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
 
Joined
Sep 26, 2012
Messages
860 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
Yes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.

He was pretty reliable about leaks on the 3080 and 3090.

Also, Ampere IS underwhelming.

1\ Performance scales directly in line with power usage from a 2080 Ti.
2\ It draws buckets of power
3\ It fails when it boosts over 2GHz
4\ It was a paper launch
5\ And it isn't even value for money. You can't get any of these cards for MSRP except Founders Editions, and there are so few Founders Editions that it's essentially irrelevant.

The only redeeming factor of Ampere at this point is that they are the fastest cards around. But make no mistake, this is Fermi all over again.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Yes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
You speak like somebody who didn't watch the video. Anyway, I set the video's start time to the point where he discusses the memory management overhaul - that's the relevant part.

As to the quality of the leaks, I'm quite satisfied with the quality of the leaks and analyses Tom puts out. I've known the performance of the Nvidia cards from him for 6 months, and it was right on the money all along.

As for the underwhelming aspect, I'm personally not underwhelmed; it's pretty much what I was expecting from the 3000 series. I agree about the hype though: people who believe this is the biggest generational leap are people who have only known GPUs for 4 years. It's a decent generation that appears huge only because the generation before it was mediocre and expensive.
 
He was pretty reliable about leaks on the 3080 and 3090.

Also, Ampere IS underwhelming.

1\ Performance scales directly in line with power usage from a 2080 Ti.
2\ It draws buckets of power
3\ It fails when it boosts over 2GHz
4\ It was a paper launch
5\ And it isn't even value for money. You can't get any of these cards for MSRP except Founders Editions, and there are so few Founders Editions that it's essentially irrelevant.

The only redeeming factor of Ampere at this point is that they are the fastest cards around. But make no mistake, this is Fermi all over again.

1. Wrong: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/31.html "Around 300 W is only 10% more power draw than the RTX 2080 Ti—while achieving +25% gaming performance."
2. Undisputed - but it puts that power to good use.
3. Anecdotal and uncorroborated. Could easily be PSUs that are not up to scratch.
4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?
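For what it's worth, the arithmetic implied by the review quote in point 1 is easy to check. This is just the ratio of the two quoted numbers; it is not TPU's measured perf/W chart, which is computed differently:

```python
# Quoted review figures: +25% gaming performance at +10% power draw vs the 2080 Ti.
perf_ratio = 1.25
power_ratio = 1.10

# Implied performance-per-watt improvement from those two numbers alone:
efficiency_gain = perf_ratio / power_ratio - 1
print(f"implied perf/W gain: {efficiency_gain:.1%}")
```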
 
While I agree with your logic, unfortunately it doesn't hold in all markets, I feel it's nVidia's fault and its partners, given that comparable products don't have the same unjustified price hike, like products from Intel, AMD and consoles/phone makers.
Well, I guess things change differently depending on your location. Here in the Nordics, prices have increased significantly across the board over the past decade. That's mostly due to the NOK/SEK to USD conversion rate, which made a big jump around 2015 or so, but as I said also due to knock-on effects from this. The same applies to prices in EUR though, as the same USD price jump can be seen there. This largely accounts for the change in practices where previously USD MSRP w/o tax ~= EU MSRP w/tax, simply because the EUR (and closely linked currencies) used to be worth more relative to the USD. That means that GPUs, consoles, phones, whatever - they've all become noticeably more expensive.
Reasonable enough - it's only speculation, so nothing is concrete. I just personally think it's doable. Let's lay down some of the actual facts or published info that we can relate to:
RX 5700 has the same number of cores as the PS5 (2304)
RX 5700 has a TDP of 225 W
RDNA2 is 50% more efficient than RDNA1, according to AMD

Now we can do our speculation/analysis:
If we do the math we can safely assume that 2304 RDNA2 shaders are 50% more efficient than RDNA1 shaders, so that leaves us with 150 W at the same performance as the RX 5700. But in this case it is clocked around 10-15% higher, so that could account for 175 W rather than 150 W. Also, having CPU cores increases overall package power (but not by much, since the CPU is clocked low and relies on SmartShift technology).

Other things to consider:
Clock speed for the PS5 is the turbo speed
TSMC 7nm is more mature
AMD is more likely to clock chips higher and sell them as a higher tier product than to clock them low at the sweet spot of performance/watt (this is in relation to this article and my first comment)
That is of course possible, but remember that power increases superlinearly as clock speeds increase, so a 10-15% increase in clocks never results in a 10-15% increase in power draw - something more along the lines of 25-35% is far more likely. Which is part of why I'm skeptical of this. Sony's rated clocks are as you say peak boost clocks, but they have promised that the console will run at or near those clocks for the vast majority of use cases. That means that you're running a slightly overclocked 4900H or HS (the consoles have the same reduced cache sizes as Renoir IIRC, so let's be generous and say they manage 3.5GHz all-core at 40W) and an overclocked 5700 within the SoC TDP. That leaves X minus 40W for the GPU. Your numbers then mean they would be able to run an overclocked 5700 equivalent at just 135W. If this was approached through a wide-and-slow, more CUs but lower clocks approach (like the XSX), I would be inclined to agree with you that it would be possible given the promised efficiency improvements (up to 50%, though "up to" makes that a very loose promise) and node improvements. But for a chip of the same width with clocks pushed likely as high as they are able to get them? We have plenty of data to go on for AMD GPU implementations like that (5700 XT vs 5700, RX 590 vs 580, etc.), and what that data shows us is that power consumption makes a very significant jump to reach those higher clocks. And while SmartShift will of course help some with power balancing, it won't have that much to work with given the likely 40-50W power envelope of the CPU. Even lightly threaded games are unlikely to drop below 30W of CPU power consumption after all, so even that gives the GPU just 155W to work with.

You're also too optimistic in thinking that 50% perf/W increase is across the board in all cases. The wording was - very deliberately, as this was an investor call - up to 50%. That likely means a worst case vs best case scenario comparison, so something like a 5700 XT compared to the 6000-series equivalent of a 5600 XT. The PS5 GPU with its high clocks does not meet the criteria for being a best case scenario for efficiency. Of course that was stated a while ago, and they might have managed more than 50% best-case-scenario improvements, but that still doesn't mean we're likely to get 50% improvement when clocks are pushed high.
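The superlinear scaling described above can be sketched with the common P ∝ f^k approximation, where the exponent k sits somewhere between 2 and 3 once voltage has to rise with frequency. The exponent choice is an assumption for illustration, not measured data:

```python
def power_scale(clock_ratio: float, exponent: float = 3.0) -> float:
    """Relative power draw under a P proportional to f^k approximation.

    k ~ 3 models the regime where voltage must rise roughly with frequency
    (dynamic power ~ f * V^2); k ~ 2 is a gentler assumption.
    """
    return clock_ratio ** exponent

for uplift in (0.10, 0.15):
    for k in (2.0, 3.0):
        extra = power_scale(1 + uplift, k) - 1
        print(f"+{uplift:.0%} clocks, k={k:.0f} -> +{extra:.0%} power")
```

The k=2 and k=3 cases roughly bracket the 25-35% extra power the post estimates for a 10-15% clock bump.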
All these rumours surrounding Navy Flounder sound fishy to me. :p
It sounds like the Kobayashi Maru has set sail for the promise land.
I hope it doesn't hit a reef and flounder.
Oh right puns and stuff. Uhhh... food, fish, bottom feeding joke or something.... oh right. I hope my joke doesn't fall flat on anyone.
Flounder:
verb (used without object) : to struggle with stumbling or plunging movements (usually followed by about, along, on, through, etc.)
CMO of AMD is on a slippery road
All of those fish code names just made me think of Dr. Seuss.
 
1. Wrong: https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/31.html "Around 300 W is only 10% more power draw than the RTX 2080 Ti—while achieving +25% gaming performance."
2. Undisputed - but it puts that power to good use.
3. Anecdotal and uncorroborated. Could easily be PSUs that are not up to scratch.
4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?

1\ If you are going to quote TechPowerUp, at least use the right graph of performance per watt - https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html. It's 3% more efficient on a 3080, and exactly the same in performance per watt on a 3090, compared to a 2080 Ti.
2\ Linear is linear; it's the fastest GPU, but it's also the most power hungry. Honestly, I expect more from a new architecture + node drop.
3\ Fairly well corroborated between multiple users and now a few in the tech press.
4\ Pull the other one. The tech press knew weeks beforehand that there would be limited availability and reported as such. Lo and behold.
5\ It's pretty obvious that there is fuck all margin between what Nvidia set the MSRP as and what it sells to AIBs for. Hence only Nvidia's cards are at MSRP, and even the cheapest AIB cards are over MSRP.

It's somewhat funny: you blasted in claiming fanbois everywhere, but man, aren't you the biggest one of all.
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?

Unfortunately these points are NV's fault, period; they rushed the launch without having proper stock. It's not the first time a company has done this and had it backfire. Even though they cancelled the bot/scalper orders, there's still no availability... it looks like a paper launch mostly because on eBay you can buy paper drawings of the cards - and those are in stock...
 
1\ If you are going to quote TechPowerUp, at least use the right graph of performance per watt - https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html. It's 3% more efficient on a 3080, and exactly the same in performance per watt on a 3090, compared to a 2080 Ti.
2\ Linear is linear; it's the fastest GPU, but it's also the most power hungry. Honestly, I expect more from a new architecture + node drop.
3\ Fairly well corroborated between multiple users and now a few in the tech press.
4\ Pull the other one. The tech press knew weeks beforehand that there would be limited availability and reported as such. Lo and behold.
5\ It's pretty obvious that there is fuck all margin between what Nvidia set the MSRP as and what it sells to AIBs for. Hence only Nvidia's cards are at MSRP, and even the cheapest AIB cards are over MSRP.

It's somewhat funny: you blasted in claiming fanbois everywhere, but man, aren't you the biggest one of all.
1 and 2: Power usage varies by resolution, and many games are power limited at 1080p for these GPUs, making power draw numbers below 1440p inaccurate. Besides, TPU bases all of their efficiency graphs on one power measurement (!) at 4k, meaning in CPU limited scenarios the actual power usage is likely to be notably lower than what is reported - but we can't know by how much, as this wasn't measured. There is an improvement, but it's most visible at 4k, and at lower resolutions (even when not CPU limited) it's not particularly impressive.
3: We still don't know what is causing this or what is happening. Might be a serious issue, might be a tiny firmware fix, might be PEBCAK.
4:
Not a paper launch, but very high demand, also (dramatically) inflated by scalpers using bots to vacuum up inventory.
5: Margins are absolutely razor thin, though given just how dense and expensive the FE PCB is, it should be entirely possible for AIB partners to make custom PCB cards at decent quality and sell them at MSRP - they just won't have as good components as the FE. This is a bit shitty from Nvidia's side, but it's better than them setting a precedent for MSRP+$100 as a baseline price like they did previously.
He was pretty reliable about leaks on the 3080 and 3090.
As to the quality of the leaks: I'm quite satisfied with the leaks and analyses Tom puts out.
That is besides the point, and simply not how you validate your sources. That channel has time and time again presented rumors and garbage data as fact or "leaks", information that has since been shown to have been entirely wrong. That some of what is presented is accurate does in no way make up for this. As the saying goes, even a stopped clock is correct twice a day. MLID is a fundamentally untrustworthy source of information, and anything and everything presented there should be viewed as very dubious. Some of it might turn out to be true, but it's impossible to know what until we have confirmation from somewhere else. Until we can see a significant portion of time when all data presented turns out to be factually accurate, none of what is presented can be trusted to be true.
Unfortunately these points are NV's fault, period. They rushed the launch without having proper stock; it's not the first time a company has done this and had it backfire. Even after they cancelled the bot/scalper orders there's still no availability... it looks like a paper launch, mostly because on eBay you can buy paper drawings of the cards... and those are in stock...
See the GN video above. Stock levels are according to partners the same as or higher than previous launches. Nvidia certainly isn't blameless for the current situation, but this is not a paper launch.
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
That is besides the point, and simply not how you validate your sources. That channel has time and time again presented rumors and garbage data as fact or "leaks", information that has since been shown to have been entirely wrong. That some of what is presented is accurate does in no way make up for this. As the saying goes, even a stopped clock is correct twice a day. MLID is a fundamentally untrustworthy source of information, and anything and everything presented there should be viewed as very dubious. Some of it might turn out to be true, but it's impossible to know what until we have confirmation from somewhere else. Until we can see a significant portion of time when all data presented turns out to be factually accurate, none of what is presented can be trusted to be true.
I completely disagree with you there, but that's way off-topic. I'm pretty versed with analyzing information and on many points, Tom gets out relevant information better than others and before others, and you can always use your BS filter if you have one. I like his analyses and I'm comfortable with my ability to extract valid information out of them.
In any case, the part about the memory architecture overhaul is simply an extract from Cerney's PS5 discourse, so it's basically open-source information, not a leak.
 
I completely disagree with you there, but that's way off-topic. I'm pretty versed with analyzing information and on many points, Tom gets out relevant information better than others and before others, and you can always use your BS filter if you have one. I like his analyses and I'm comfortable with my ability to extract valid information out of them.
In any case, the part about the memory architecture overhaul is simply an extract from Cerney's PS5 discourse, so it's basically open-source information, not a leak.
Again, that really shouldn't be how you evaluate your sources. A source that has repeatedly presented inaccurate information without sufficiently underscoring the speculative nature of this or without then retracting that information immediately upon it being disproved is inherently untrustworthy. It then doesn't matter if their analyses are generally decent or whatever else they might do: the basis on which they perform their work can't be trusted, thus the work itself can't be trusted. That carries over even into cases where the data they are working from is widely known to be accurate, as it demonstrates an attitude towards thorough and proper treatment of data that is severely lacking. Your "BS filter" doesn't matter, as unless you are prescient (in which case, why the need for watching stuff like that?) you can't know which parts of the basis for the analysis are accurate or not. Not to mention that even the necessity of a "BS filter" when watching any type of analysis seriously underscores the low quality of the analysis. "It's good once you filter out the bad bits" ... well, yeah, but so is drinking sewage.
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
Again, that really shouldn't be how you evaluate your sources. A source that has repeatedly presented inaccurate information without sufficiently underscoring the speculative nature of this or without then retracting that information immediately upon it being disproved is inherently untrustworthy. It then doesn't matter if their analyses are generally decent or whatever else they might do: the basis on which they perform their work can't be trusted, thus the work itself can't be trusted. That carries over even into cases where the data they are working from is widely known to be accurate, as it demonstrates an attitude towards thorough and proper treatment of data that is severely lacking. Your "BS filter" doesn't matter, as unless you are prescient (in which case, why the need for watching stuff like that?) you can't know which parts of the basis for the analysis are accurate or not. Not to mention that even the necessity of a "BS filter" when watching any type of analysis seriously underscores the low quality of the analysis. "It's good once you filter out the bad bits" ... well, yeah, but so is drinking sewage.

The thing is, with trash channels like MLID it's not even filtering good from bad, because there is no good: it's regurgitated from legitimate sources like Gamers Nexus in order to make MLID appear legitimate, and he then abuses that apparent legitimacy to peddle his half-baked bullshit. Result, people who aren't good at discerning trash from quality believe it all and fall back on "but he was right regarding XXX (that he copied from a real source) so he must be right on YYY (nonsense that he crapped out)".

Melding your lies with the mainstream's truth in order to make your lies appear truthful is the oldest trick in the book when it comes to manipulating discourse and public opinion (see: Russia and US elections), and unfortunately most people choose news sources based on whether that source agrees with their worldview, rather than how trustworthy said source is. They also have a penchant for doubling down and defending "their" news source when the credibility of said source is brought into question (instead of holding it accountable), or handwaving the source's inaccuracy away with excuses such as "everyone gets it wrong now and then". Except the dodgy sources get it wrong time and time again.

Make no mistake though, MLID is laughing all the way to the bank with every cent of ad revenue he gets from every chump who watches his reddit clickbait videos. Anyone who wants to reward a liar for his lies, that's your business - but don't expect me to do the same.
 
D

Deleted member 185088

Guest
Yes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
Ampere is underwhelming because it's expensive; the xx80 should've been $600 and not come with only 10GB of VRAM. Nvidia misled people by comparing it to Turing, which was a disaster price-wise.
But I totally agree his analyses are messy and biased. I'm still waiting for updated PS5 hardware...
 
Wow, you guys are so sincere in your opinions, it almost convinces me to consider my sources.
The only problem is, your posts reek of bias and ad hominem arguments. So thank you, but no thank you, I value Tom's leaks much more than your opinions, so we'll agree to disagree on this one.
Anyway, back on topic: there are some rumors that AMD has overhauled their memory architecture, but nobody's sure about it, although the people that gave this information also supplied RedGamingTech with real photos of the cards. We'll know more about it in a few weeks.
 
Wow, you guys are so sincere in your opinions, it almost convinces me to consider my sources.
The only problem is, your posts reek of bias and ad hominem arguments. So thank you, but no thank you, I value Tom's leaks much more than your opinions, so we'll agree to disagree on this one.

They also have a penchant for doubling down and defending "their" news source

You don't have to be that predictable, y'know?
 