
Leaked Benchmark shows Possible NVIDIA MX450 with GDDR6 Memory

Joined
Mar 31, 2020
Messages
1,519 (1.03/day)
A new listing has been spotted in the 3DMark results browser for what could be the NVIDIA MX450 laptop GPU. The MX450 is expected to be based on the TU117, the same GPU as the GTX 1650, according to speculation from @_rogame. The leaked benchmark shows the MX450 with a core clock of 540 MHz and 2 GB of GDDR6 memory. The memory is listed at 2505 MHz, which implies an effective data rate of 10 Gbit/s. It is interesting to see the shift to GDDR6 across NVIDIA's product stack, likely due to a shortage of GDDR5 or simply because GDDR6 is now cheaper.
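As a quick sanity check of the leaked numbers (a sketch: the 4x multiplier between the reported GDDR6 clock and the effective per-pin data rate is standard, but the 64-bit bus is an assumption based on earlier MX-series parts, not something from the leak):

```python
# Back-of-envelope check of the leaked MX450 memory figures.
# The 64-bit bus width is hypothetical; only the 2505 MHz clock is from the leak.

reported_clock_mhz = 2505
effective_gbps = reported_clock_mhz * 4 / 1000  # per-pin data rate in Gbit/s
print(f"Effective data rate: {effective_gbps:.2f} Gbit/s")  # ~10 Gbit/s

bus_width_bits = 64  # assumed, in line with previous MX-series parts
bandwidth_gbs = effective_gbps * bus_width_bits / 8
print(f"Bandwidth on a {bus_width_bits}-bit bus: {bandwidth_gbs:.1f} GB/s")
```

That lines up with the ~10 Gbit/s figure in the listing.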

The TU117 GPU found in the GTX 1650 GDDR6 has proven itself to be a solid 1080p gaming option. The chip is manufactured on TSMC's 12 nm process and features 1024 shading units, 64 texture mapping units, and 32 ROPs. The MX450 should provide a significant boost over integrated graphics at a TDP of 25 W, and will sit below the GTX 1650 Mobile due to its reduced RAM and tighter power/thermal constraints.



 
Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 3600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 470 Nitro+ 4GB
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (gateron milky yellow)
Software W10
It is interesting to see the shift to GDDR6 across NVIDIA's product stack, likely due to a shortage of GDDR5 or simply because GDDR6 is now cheaper.
GDDR6 also uses less power, which is essential for devices that will feature this GPU.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,055 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
This looks a lot more interesting than the MX350.
 
Joined
Nov 18, 2010
Messages
7,124 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
GDDR6 also uses less power, which is essential for devices that will feature this GPU.

Incorrect, it uses more power. GTX 1650 GDDR6 models have to clock their GPUs lower than the GDDR5 models to fit within the same power budget.
 
Incorrect, it uses more power. GTX 1650 GDDR6 models have to clock their GPUs lower than the GDDR5 models to fit within the same power budget.
Doesn't GDDR6 operate at a lower voltage? Unless the current through the chips is cranked up over GDDR5.
 
Doesn't GDDR6 operate at a lower voltage? Unless the current through the chips is cranked up over GDDR5.

Correct, but GDDR6 operating frequencies are far higher than GDDR5's. For example, the GTX 1650 GDDR6 uses 12 Gbps chips vs the 8 Gbps on the GDDR5 model; that 50% extra bandwidth has to come from somewhere, and GDDR6 is really just "GDDR5 version 2", so power usage will scale with frequency in the same manner.
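The bandwidth gap between the two 1650 variants is easy to verify from the quoted per-pin rates and the card's 128-bit memory bus:

```python
# Memory bandwidth for the two GTX 1650 variants, both on a 128-bit bus.
def bandwidth_gbs(data_rate_gbps, bus_bits=128):
    """Total bandwidth in GB/s from per-pin rate (Gbit/s) and bus width."""
    return data_rate_gbps * bus_bits / 8

gddr5 = bandwidth_gbs(8)   # GDDR5 model: 128 GB/s
gddr6 = bandwidth_gbs(12)  # GDDR6 model: 192 GB/s
print(f"GDDR5: {gddr5} GB/s, GDDR6: {gddr6} GB/s, gain: {gddr6 / gddr5 - 1:.0%}")
```

Which is where the 50% figure comes from.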
 
Joined
Apr 12, 2013
Messages
6,743 (1.68/day)
Incorrect, it uses more power. GTX 1650 GDDR6 models have to clock their GPUs lower than the GDDR5 models to fit within the same power budget.
And how did you measure GDDR6 power consumption from that? While early models may have been power hungry, it's also true that they deliver a lot more bits/W than comparable GDDR5 solutions!
 
Joined
Nov 24, 2017
Messages
853 (0.36/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
Now people will have this instead of the MX350 with their Renoir/Tiger Lake CPU + 4 GB RAM + 1 TB HDD laptop. Nice.
 
And how did you measure GDDR6 power consumption from that? While early models may have been power hungry, it's also true that they deliver a lot more bits/W than comparable GDDR5 solutions!

There's no good reason I can think of for NVIDIA to clock the GDDR6 models' GPUs lower, except for keeping the TBP similar to the GDDR5 models so that the same boards and coolers can be used.
 
That doesn't support what you claimed, i.e. that GDDR6 uses more power; it's conjecture at best.

Correct, but GDDR6 operating frequencies are far higher than GDDR5's. For example, the GTX 1650 GDDR6 uses 12 Gbps chips vs the 8 Gbps on the GDDR5 model; that 50% extra bandwidth has to come from somewhere, and GDDR6 is really just "GDDR5 version 2", so power usage will scale with frequency in the same manner.
You do know that GDDR6 is based on DDR4, right, just like GDDR5 was based on DDR3? You don't suppose they're lying about regular desktop memory power numbers as well?
 
That doesn't support what you claimed, i.e. that GDDR6 uses more power; it's conjecture at best.

Yes, it's conjecture. But if you have any other explanation for why NVIDIA would clock their GPUs slower on GDDR6 models, please feel free to share.

You do know that GDDR6 is based on DDR4, right, just like GDDR5 was based on DDR3? You don't suppose they're lying about regular desktop memory power numbers as well?

Where did I say anybody was lying about anything? The fact of the matter is that increasing frequency is going to consume more power.
 
Where did I say anybody was lying about anything? The fact of the matter is that increasing frequency is going to consume more power.
The actual frequency isn't increased that much; remember GDDR5X, this is the same. The effective rate has increased much more than the actual frequency, since GDDR memory runs at QDR. Unless you are going to say DDR4 consumes more power than DDR3, or LPDDR4 compared with LPDDR4X?
 
Joined
Jul 5, 2013
Messages
25,559 (6.49/day)
Incorrect, it uses more power. GTX 1650 GDDR6 models have to clock their GPUs lower than the GDDR5 models to fit within the same power budget.
That is a misconception. At similar clocks, GDDR6 uses much less energy and produces less heat than GDDR5.
 
Joined
Jul 29, 2019
Messages
77 (0.04/day)
Yes, it's conjecture. But if you have any other explanation for why NVIDIA would clock their GPUs slower on GDDR6 models, please feel free to share.
50% higher bandwidth would screw up performance and pricing in their convoluted midrange space.
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
50% higher bandwidth would screw up performance and pricing in their convoluted midrange space.

Surely, you jest. What performance? :roll:
 
That is a misconception. At similar clocks, GDDR6 uses much less energy and produces less heat than GDDR5.
I'm not sure it's a "popular" misconception, if at all. Any new memory tech is generally more efficient & less power hungry than the previous gen, exceptions being something like LPDDR4x vs say regular DDR5. The higher efficiency is one of the most crucial reasons why manufacturers switch to newer, better mem besides the increased bandwidth.
 
Joined
Mar 28, 2020
Messages
1,643 (1.11/day)
And how did you measure GDDR6 power consumption from that? While early models may have been power hungry, it's also true that they deliver a lot more bits/W than comparable GDDR5 solutions!

I feel that when pushed to spec, GDDR6 draws more power than GDDR5, i.e. at 14 Gbps vs 8 Gbps. This is very clear when you compare the GTX 1660 vs the 1660 Super, where the only change is the memory. I don't deny that the memory upgrade improves performance significantly, but it will increase the power envelope of the GPU for sure.

In this case, one of the reasons for the very low core clock speed is likely the GDDR6 used. The second reason is that, if the rumor is true that this uses the TU117 chip, we have seen that it requires quite a lot of power in the jump from the GTX 1650 to the 1650 Super. Shrinking 100+ W down to 25 W means a significant reduction in clock speed.
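That last point can be ballparked with a crude cube-law model (purely an illustrative sketch: it assumes voltage tracks frequency so power goes roughly as f³, and ignores binning, disabled units and memory power):

```python
# Rough estimate of the clock cut needed to shrink a ~100 W part to 25 W,
# assuming dynamic power scales roughly as f^3 (voltage dropping with frequency).
# Illustrative only; real mobile parts also bin dies, disable units, and
# lower memory clocks, so the actual reduction differs.
target_ratio = 25 / 100
clock_scale = target_ratio ** (1 / 3)
print(f"Clocks would need to drop to ~{clock_scale:.0%} of the original")  # ~63%
```

The leaked 540 MHz figure is a far deeper cut than even that, which fits the point that a 25 W envelope forces drastic clock reductions.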

In fact, I think the MX350 should be sufficient to fend off the competition for now. The MX450 should be built on a newer node, because 12 nm is clearly struggling to meet the power requirement given such a hefty improvement in specs.
 
Joined
Feb 20, 2019
Messages
7,285 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I feel that when pushed to spec, GDDR6 draws more power than GDDR5, i.e. at 14 Gbps vs 8 Gbps. This is very clear when you compare the GTX 1660 vs the 1660 Super, where the only change is the memory. I don't deny that the memory upgrade improves performance significantly, but it will increase the power envelope of the GPU for sure.

The assumption I made from that was that the 1660 is being held back by its 8 Gbps VRAM, and that switching to GDDR6 removed the bottleneck, allowing the GPU to run unhindered, with an obvious increase in power consumption.

In any modern GPU, VRAM is a relatively small percentage of the power usage. Even if you doubled the power consumption of GDDR5, it would make such a small difference to total board power that you'd be hard pressed to separate it from the margin of error in your measurements (and all evidence points towards GDDR6 actually consuming less power clock-for-clock - don't forget the 14Gbps GDDR6 in the 1660 Super is actually a lower clock than the 8Gbps GDDR5 in the vanilla 1660).

Meanwhile, the power cost of extra GPU performance climbs steeply, because dynamic power scales roughly with voltage squared times frequency (P ∝ C·V²·f). If you want 10% higher clocks, that'll likely require around 15% more voltage, and 1.15² × 1.10 ≈ 1.45, a power increase of roughly 45%. That's why overclocking guzzles so much energy: ~45% more juice for 10% extra clockspeed.
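Plugging those illustrative figures (10% clock, 15% voltage - rough guesses, not measurements) into the textbook dynamic-power relation P ∝ C·V²·f shows how quickly it adds up:

```python
# Dynamic-power scaling sketch: P ~ C * V^2 * f.
# The 10% clock / 15% voltage inputs are illustrative estimates from the
# discussion above, not measured values for any real card.
def relative_power(volt_scale, freq_scale):
    """Power multiplier under the V^2 * f dynamic-power approximation."""
    return volt_scale ** 2 * freq_scale

increase = relative_power(1.15, 1.10) - 1
print(f"~{increase:.0%} more power for 10% higher clocks at 15% more voltage")
```

The exact multiplier depends on how much extra voltage a given chip actually needs, but the quadratic voltage term dominates either way.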

So yeah, the 1660 Super uses 15-20% more power than the 1660, and provides about 12-15% more performance. I 100% guarantee you that the consumption increase is due to raised GPU utilisation resulting in higher core clocks. I'm making an educated guess here, but it correlates with everything else, unlike the insane assumption that lower-clocked, more efficient memory is somehow driving total board power up by 15-20% - which, if it were caused by the memory change alone, would imply that GDDR6 uses around 900% more power than GDDR5, not the 20% less claimed by Micron/Samsung.
 
In any modern GPU, VRAM is a relatively small percentage of the power usage.

GN did an excellent breakdown of why AMD went with HBM for Fury and Vega, and there are various important tidbits therein; to quote, "We also know that an RX 480 uses 40-50W for its 8GB". The RX 480's TBP is north of 150 W, so you are claiming that 27-33% of total power consumption is "relatively small"? Laughable.
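The 27-33% range follows directly from the quoted numbers:

```python
# Share of RX 480 board power attributed to its 8 GB of GDDR5,
# using the 40-50 W estimate quoted from GN against a 150 W TBP.
tbp_w = 150
for mem_w in (40, 50):
    print(f"{mem_w} W of {tbp_w} W = {mem_w / tbp_w:.0%}")
```

With a TBP "north of 150 W" the shares shrink slightly, but stay in the same ballpark.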

Granted, GPU memory controllers have almost certainly become more efficient since RX 480, but I would still be surprised if GDDR consumes under 20% of the TBP in even the latest cards.

don't forget the 14Gbps GDDR6 in the 1660 Super is actually a lower clock than the 8Gbps GDDR5 in the vanilla 1660

Like everyone else, your assumption that clock speed is the only factor in power usage is manifestly incorrect. GDDR5 is quadruple-pumped, GDDR6 octuple-pumped; you really think that pushing twice the amount of data through at the same time is free? The effective clock speed is quoted for a reason; it's not just a marketing term.

I 100% guarantee you that the consumption increase is due to raised GPU utilisation resulting in higher core clocks.

I 100% guarantee you're wrong, again. GN compared the GDDR5 and GDDR6 models of the 1650 and the GDDR6 model draws more power at its stock clocks. With GPU clocks normalised to the GDDR5 model's, it draws yet more power.
 
The point of my post seems to have gone over your head, despite the fact that you clearly understand the concepts behind it. The GN video I just watched is beautiful because it's the same GPU and they clock-match the GDDR5 and GDDR6 versions.

The point *I* was making is that the GDDR5 is bottlenecking the 1650 and that the GDDR6's extra bandwidth allows the core to do more work, which obviously requires more power. I know you understand that fact despite the matched clock speeds because of this thing you just said:
Like everyone else, your assumption that clock speed is the only factor in power usage is manifestly incorrect. GDDR5 is quadruple-pumped, GDDR6 octuple-pumped, you really think that pushing twice the amount of data through at the same time is free?

And then you immediately say something that puts you back in the "like everyone else, your assumption that clock speed is the only factor in power usage is manifestly incorrect" category, by outright stating that the normalised clocks mean the power consumption difference must be the different VRAM, and only the VRAM:
I 100% guarantee you're wrong, again. GN compared the GDDR5 and GDDR6 models of the 1650 and the GDDR6 model draws more power at its stock clocks. With GPU clocks normalised to the GDDR5 model's, it draws yet more power.

Which side are you taking? You clearly understand that clock speed is not the single factor determining power use, but then you immediately use normalised clock speeds to defend your argument that the power difference must be the VRAM and only the VRAM. You can't have it both ways!

I admit that the line "We also know that an RX 480 uses 40-50W for its 8GB" was a bit of an eye-opener for me. I'm not disputing GN, but as a counter-argument, it's clear that not all GDDR5 consumes that much. The 1070 Max-Q has a total TDP of just 80 W; I really don't believe that the 8 GB of GDDR5 in a 1070 Max-Q uses 40-50 W. Let's face it, if it were using 50 W, that would mean the GP104 is somehow providing decent performance on just 30 W. That's pretty absurd. At best, I think we can assume that AMD struggles to make an efficient memory controller where NVIDIA has that part nailed. At worst, it's possible that GN were wrong? I doubt it. Steve Burke is pretty passionate about GPUs and knows his stuff.
 