
NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
18,543 (2.48/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
NVIDIA has just announced three new mobile GPUs, although the question is how new any of them really are, as the model names suggest they're anything but. First up is the GeForce RTX 2050, which should be based on the Turing architecture. The other two GPUs are the GeForce MX550 and MX570, both presumably based on the Ampere architecture, although NVIDIA hasn't confirmed the specifics.

The GeForce RTX 2050 features 2048 CUDA cores, which is more than the mobile RTX 2060, but it has lower clock speeds and a vastly lower power draw of 30-45 Watts, depending on the notebook design choices and cooling. It's also limited to 4 GB of GDDR6 memory on a 64-bit bus, which puts it in GeForce MX territory when it comes to memory bandwidth, as NVIDIA quotes a peak memory bandwidth of a mere 112 GB/s.
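That figure is consistent with 14 Gbps GDDR6 on a 64-bit bus. NVIDIA hasn't broken out the memory speed here, so treat the numbers below as a back-of-the-envelope check rather than official specs:

```python
# Back-of-the-envelope check of the quoted 112 GB/s figure.
# Assumption (not confirmed here by NVIDIA): 14 Gbps effective GDDR6 data rate.
bus_width_bits = 64
data_rate_gbps = 14  # Gbit/s per pin

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps  # bytes per second, in GB/s
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 112 GB/s
```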




The two GeForce MX parts also support GDDR6 memory, but beyond that, NVIDIA hasn't released anything tangible in terms of specs. NVIDIA mentions that the GeForce MX550 will replace the MX450, and states that both GPUs are intended to boost performance for video and photo editing on the move, with the MX570 also being suitable for gaming. All three GPUs are said to ship in laptops sometime this coming spring.

Update: According to a tweet by ComputerBase, which has confirmed the information with NVIDIA, the RTX 2050 and the MX570 are based on Ampere and the GA107 GPU, while the MX550 is based on Turing and the TU117 GPU. The MX570 is also said to support DLSS and "limited" RTX features, whatever that means.

 
More like col(crying out loud)... "regress" :(
Even though I paid a lot more than I should have, I'm glad I started having parts ordered for my new rig build back in March-April. The damn fans didn't get here until two weeks ago... so I should be going to pick it up within the next week and have it before Christmas holiday. :rockout:
 

i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
 
i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:

Your screen is no longer sipping power budget from the laptop brick perhaps?
 
i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
that's because the rendered frames still go through the Intel iGPU and then to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.
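If the frames really are copied through the iGPU on the way to the internal panel, the extra traffic is easy to ballpark. A rough sketch, assuming uncompressed 32-bit frames at the resolution and frame rate mentioned above:

```python
# Rough estimate of the extra copy traffic when the dGPU's frames are routed
# through the iGPU's framebuffer (muxless / Optimus-style designs).
# Assumptions: 1080p, uncompressed 32-bit RGBA frames, 120 fps.
width, height = 1920, 1080
bytes_per_pixel = 4
fps = 120

bytes_per_frame = width * height * bytes_per_pixel
copy_rate_gb_s = bytes_per_frame * fps / 1e9
print(f"~{copy_rate_gb_s:.2f} GB/s of frame-copy traffic")  # ~1.00 GB/s
```

Roughly a gigabyte per second of copying, which is small next to PCIe bandwidth, so whether that path alone explains a 20-30 fps gap is debatable, but it shows what the muxless route adds.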
 
that's because the rendered frames still go through the Intel iGPU and then to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.

that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.

Your screen is no longer sipping power budget from the laptop brick perhaps?

this might be it. who knows
 
that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.

your laptop still using hard mux? interesting, what model & make is it?
 
that shouldn't be the case when you have option for "Discrete" in mobo bios, which i do.



this might be it. who knows

That function might not be a proper mux switch. (Data is still going through the iGPU instead of bypassing it completely.) What's the make and model of your lappy?
 
lol... "progress"
Will be interesting to compare battery life of the Radeon 6500 XT in laptops versus the RTX 2050 (super OLD).
 
I am 99% certain the 2050 is actually an Ampere GPU. The GFLOPS/Watt for the 45W spec is 134.44, while the 30W spec is 157.70. That is in line with Ampere Mobile GFLOPS/Watt, not Turing Mobile GFLOPS/Watt. Turing Mobile was in the range of 52-83, while Ampere Mobile's current range is 89-191. Source
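Those GFLOPS/Watt figures line up with the article's 2048 CUDA cores if you assume peak FP32 = 2 ops × cores × boost clock. A quick sketch that simply backs the implied clocks out of the quoted numbers (the clocks are derived here, not official):

```python
# Back out the boost clocks implied by the quoted GFLOPS/Watt figures,
# assuming peak FP32 throughput = 2 ops * CUDA cores * boost clock.
cuda_cores = 2048  # from the article

for tdp_w, gflops_per_watt in [(45, 134.44), (30, 157.70)]:
    gflops = gflops_per_watt * tdp_w
    boost_clock_mhz = gflops * 1e9 / (2 * cuda_cores) / 1e6
    print(f"{tdp_w} W: {gflops:.0f} GFLOPS -> ~{boost_clock_mhz:.0f} MHz implied boost clock")
# 45 W: 6050 GFLOPS -> ~1477 MHz implied boost clock
# 30 W: 4731 GFLOPS -> ~1155 MHz implied boost clock
```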

i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
If you have the iGPU enabled, disable it in device manager and see if your FPS goes up on the laptop display.
 
@TheLostSwede :

RTX 2050 = GA107
MX570 = GA107
MX550 = TU117

MX550 also doesn't support RT.

Source: Computerbase
Updated the article accordingly. There weren't enough details available at the time I wrote it, and other sites hinted at what I originally wrote.

 
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.
 
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.

AMD to the moon!
 
Honestly, I'm done with chunky gaming laptops (have been personally for a decade already, getting to that point with recommendations and work purchases for 3D/VR presentation also).

All that matters is what nVidia can achieve in under 50W. If you want more than 50W of dGPU gaming in a portable form factor you pretty much NEED to sit down at a desk and find a power outlet, at which point you don't really need a laptop. A decade ago, travelling to a location for LAN play was a reasonable thing, but the world has changed dramatically since then, COVID being just one of the factors for this shift in trends.

As far as I can tell, Nvidia don't really have any compelling Ampere chips in the sub-50W range yet. Turing made do with the 1650 Max-Q variants at 35W, which were barely fast enough to justify themselves over some of the far more efficient APUs like the 4700U and 5700U, which at 25W all-in allowed for far slimmer, cooler, quieter, lighter, longer-lasting laptops that can still game.

Unfortunately these are still based on the GA107, and we've already seen how disappointing and underwhelming that is at lower TDPs - the 3050 isn't exactly an easy performance recommendation even at the default 80W TDP. A cut-down version, running far slower than that? It may struggle to differentiate itself from the ancient 1650, and any wins it gains over the old 1650 will likely come at the cost of performance/Watt, which kills the appeal.

I think the real problem with Ampere is that Samsung 8nm simply isn't very power-efficient. Yes, it's dense, but performance/Watt isn't where it shines. Sadly, there aren't any compelling TSMC 7nm GPUs making their way into laptops right now so we're all stuck with 2019-esque performance/Watt solutions.
 
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
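For a sense of that imbalance, here's some rough arithmetic using the ~6 TFLOPS implied a few posts up and the 112 GB/s from the article (illustrative only):

```python
# Rough FLOP-per-byte ratio for the RTX 2050, using numbers already in this thread:
# ~6050 GFLOPS peak FP32 (the 45 W figure implied above) and 112 GB/s of bandwidth.
peak_fp32_gflops = 6050
bandwidth_gb_s = 112

flops_per_byte = peak_fp32_gflops / bandwidth_gb_s
print(f"~{flops_per_byte:.0f} FLOPs per byte of memory traffic")  # ~54
```

In other words, the wide shader array will spend a lot of time waiting on that narrow 64-bit bus, which is presumably why the article puts its memory bandwidth in GeForce MX territory.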
 
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
Won't that improve its compute power?
 
I'm really looking forward to Nvidia releasing the Riva TNT again.
 
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This is due to lower-quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.

I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between the two vendors' image quality when outputting the same signal in the same pixel format. What may cause this impression is incorrect settings, reported either via EDID or entered manually by the user, as NVIDIA supports chroma subsampling on older hardware generations (including YCbCr 4:2:0 for 4K 60 Hz on Kepler GPUs with a DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays every monitor and every GPU should be capable of displaying a full-range RGB signal at 8 bpc at a minimum.
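As an aside on that Kepler-era 4:2:0 trick: subsampling halves the bits per pixel, which is what squeezes 4K 60 Hz under an HDMI 1.4-class link's limit. A rough sketch using standard timing figures (not from this thread), ignoring protocol overhead:

```python
# Why YCbCr 4:2:0 lets 4K 60 Hz fit on an HDMI 1.4-class link:
# HDMI 1.4 tops out around a 340 MHz TMDS clock, and 4:2:0 (12 bpp average
# vs 24 bpp for RGB/4:4:4) is carried at effectively half the pixel clock.
pixel_clock_mhz = 594          # standard CTA-861 timing for 3840x2160 @ 60 Hz
hdmi_1_4_tmds_limit_mhz = 340

formats = {
    "RGB / 4:4:4 (24 bpp)": pixel_clock_mhz,        # 594 MHz -> over the limit
    "YCbCr 4:2:0 (12 bpp)": pixel_clock_mhz / 2,    # 297 MHz -> fits
}
for name, clock in formats.items():
    verdict = "fits" if clock <= hdmi_1_4_tmds_limit_mhz else "exceeds the limit"
    print(f"{name}: {clock:.0f} MHz effective TMDS clock -> {verdict}")
```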
 
I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between the two vendors' image quality when outputting the same signal in the same pixel format. What may cause this impression is incorrect settings, reported either via EDID or entered manually by the user, as NVIDIA supports chroma subsampling on older hardware generations (including YCbCr 4:2:0 for 4K 60 Hz on Kepler GPUs with a DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays every monitor and every GPU should be capable of displaying a full-range RGB signal at 8 bpc at a minimum.

Well, I think the reason this topic "died off" is that users themselves use such different screens, monitors, displays, panels and personal settings that an objective comparison is very difficult to perform.

But look at the green colour of the grass between the RX 6900 XT on the right and the RTX 3090 on the left.



RX 6900 XT vs RTX 3090 - Test in 8 Games l 4K l - YouTube
 

Well, I think the reason this topic "died off" is that users themselves use such different screens, monitors, displays, panels and personal settings that an objective comparison is very difficult to perform.

But look at the green colour of the grass between the RX 6900 XT on the right and the RTX 3090 on the left.

Maybe I am blind, maybe I have an RTX 3090... I can't tell the color of the grass apart. Nor could I ever tell a difference between this card and the Radeon VII I had before it. And you especially wouldn't be able to see it after video compression on YouTube to begin with, nor are the scenes and lighting in that comparison exactly identical - you would need two reference gradation images displayed on reference monitors to even begin to make any comparison valid.

Sorry bro, that stuff is hearsay... has always been. The color reproduction on both companies' GPUs is fully accurate when set correctly.
 
I'm fairly certain that the last time AMD vs Nvidia image quality came up people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.

In the bad old days there were ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that we had Nvidia very slow on the uptake with gamma-corrected AA for MSAA; that was the last real rendering difference between the two companies, around the time that AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia offering a "Balanced" preset for driver settings where the game didn't request a specific setting, so many people unaware of this option, buried deep in the ancient Nvidia driver's advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering.



  • A more serious issue, still present today (tested on two HDMI displays with both a 2060 and a 3060 Ti), is that the driver detects the display device as limited RGB (16-235), which, if you don't realise it and you game in a well-lit room, you could easily mistake for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see YouTubers/streamers capturing output where black is not black - clearly it's 16-235 in all three channels. If you watch a side-by-side image-comparison video on YT and the Radeon card looks to have better contrast, it's because of this messed-up, unfixed bug in the drivers, which is subtle enough for many people to miss.

 
I'm fairly certain that the last time AMD vs Nvidia image quality came up people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.

In the bad old days there were ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that we had Nvidia very slow on the uptake with gamma-corrected AA for MSAA; that was the last real rendering difference between the two companies, around the time that AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia offering a "Balanced" preset for driver settings where the game didn't request a specific setting, so many people unaware of this option, buried deep in the ancient Nvidia driver's advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering.


  • A more serious issue, still present today (tested on two HDMI displays with both a 2060 and a 3060 Ti), is that the driver detects the display device as limited RGB (16-235), which, if you don't realise it and you game in a well-lit room, you could easily mistake for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see YouTubers/streamers capturing output where black is not black - clearly it's 16-235 in all three channels. If you watch a side-by-side image-comparison video on YT and the Radeon card looks to have better contrast, it's because of this messed-up, unfixed bug in the drivers, which is subtle enough for many people to miss.

There has also been an easy fix for these: never assume that a driver update preserved your settings. ;)
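For anyone curious what the limited-range issue described above actually does to the picture, here's a minimal sketch of the full-range to limited-range mapping (illustrative only):

```python
# Full-range (0-255) values squeezed into limited range (16-235):
# black is lifted to 16 and white drops to 235, which is why captures of the
# affected output never reach true black and look slightly washed out.
def full_to_limited(value: int) -> int:
    return 16 + round(value * 219 / 255)

for v in (0, 128, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v)}")
# full   0 -> limited 16
# full 128 -> limited 126
# full 255 -> limited 235
```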
 