Saturday, December 18th 2021

NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

NVIDIA has just announced three new mobile GPUs, although the question is how new any of them really are, as the model names suggest they're anything but. First up is the GeForce RTX 2050, which should be based on the Turing architecture. The other two GPUs are the GeForce MX550 and MX570, both presumably based on the Ampere architecture, although NVIDIA hasn't confirmed the specifics.

The GeForce RTX 2050 features 2048 CUDA cores, which is more than the mobile RTX 2060, but it has lower clock speeds and a vastly lower power draw of 30-45 Watts, depending on the notebook design and cooling. It's also limited to 4 GB of 64-bit GDDR6 memory, which puts it in GeForce MX territory when it comes to memory bandwidth, as NVIDIA quotes a maximum of a mere 112 GB/s.
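For context on how that 112 GB/s figure falls out of the bus width, here's a minimal back-of-the-envelope sketch; the 14 Gbps effective GDDR6 speed is an assumption that makes NVIDIA's quoted number work out for a 64-bit bus, not something NVIDIA has broken out in the announcement.

```python
# Sketch: peak GDDR6 bandwidth from bus width and effective data rate.
# The 14 Gbps data rate is assumed; NVIDIA only quotes the resulting 112 GB/s.

def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * effective data rate in Gbps."""
    return (bus_width_bits / 8) * data_rate_gbps

print(memory_bandwidth_gb_s(64, 14.0))   # RTX 2050: 112.0 GB/s
print(memory_bandwidth_gb_s(192, 14.0))  # mobile RTX 2060, for comparison: 336.0 GB/s
```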
The two GeForce MX parts also support GDDR6 memory, but beyond that, NVIDIA hasn't released anything tangible in terms of specs. NVIDIA mentions that the GeForce MX550 will replace the MX450, while stating that both GPUs are intended to boost performance for video and photo editing on the move, with the MX570 also being suitable for gaming. All three GPUs are said to ship in laptops sometime this coming spring.

Update: According to a tweet by ComputerBase, who has confirmed the information with Nvidia, the RTX 2050 and the MX570 are based on Ampere and the GA107 GPU, with the MX550 being based on Turing and the TU117 GPU. The MX570 is also said to support DLSS and "limited" RTX features, whatever that means.
Sources: Nvidia, Nvidia 2, @ComputerBase

33 Comments on NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

#2
MentalAcetylide
More like col(crying out loud)... "regress" :(
Even though I paid a lot more than I should have, I'm glad I started having parts ordered for my new rig build back in March-April. The damn fans didn't get here until two weeks ago... so I should be going to pick it up within the next week and have it before Christmas holiday. :rockout:
#3
Space Lynx
Astronaut
MentalAcetylide: More like col(crying out loud)... "regress" :(
Even though I paid a lot more than I should have, I'm glad I started having parts ordered for my new rig build back in March-April. The damn fans didn't get here until two weeks ago... so I should be going to pick it up within the next week and have it before Christmas holiday. :rockout:
i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
#4
Vayra86
lynx29: i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
Your screen is no longer sipping power budget from the laptop brick perhaps?
#5
bencrutz
lynx29: i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
that's because the rendered frames still go through the intel iGPU and then to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.
#6
Richards
This is good for consumers.. DLSS will help make these micro GPUs more powerful
#7
Space Lynx
Astronaut
bencrutz: that's because the rendered frames still go through the intel iGPU and then to the internal display, while with an external display connected, the dGPU drives it directly without having to go through the iGPU first.
that shouldn't be the case when you have the option for "Discrete" in mobo bios, which i do.
Vayra86: Your screen is no longer sipping power budget from the laptop brick perhaps?
this might be it. who knows
#8
bencrutz
lynx29: that shouldn't be the case when you have the option for "Discrete" in mobo bios, which i do.
your laptop still using hard mux? interesting, what model & make is it?
#9
Cheeseball
Not a Potato
lynx29: that shouldn't be the case when you have the option for "Discrete" in mobo bios, which i do.
lynx29: this might be it. who knows
That function might not be a proper mux switch. (Data is still going through the iGPU instead of bypassing it completely.) What's the make and model of your lappy?
#10
Garrus
lynx29: lol... "progress"
Will be interesting to compare battery life of the Radeon 6500 XT in laptops versus the RTX 2050 (super OLD).
#12
Berfs1
I am 99% certain the 2050 is actually an Ampere GPU. The GFLOPS/Watt for the 45W spec is 134.44, while the 30W spec is 157.70. That is in line with Ampere Mobile GFLOPS/Watt, not Turing Mobile GFLOPS/Watt. Turing Mobile was in the range of 52-83, while Ampere Mobile's current range is 89-191. Source
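For anyone who wants to reproduce that GFLOPS/Watt math, here's a minimal sketch. It assumes the usual FP32 throughput formula (CUDA cores × 2 FMA ops × boost clock) and the roughly 1155-1477 MHz boost clock range NVIDIA lists for the RTX 2050's 30-45 W configurations; those clocks are an assumption pulled from the spec sheet, not from the post above.

```python
# Sketch of the GFLOPS/Watt arithmetic: FP32 GFLOPS = CUDA cores * 2 * boost clock (GHz).
# Boost clocks of ~1.155 GHz (30 W) and ~1.477 GHz (45 W) are assumed from NVIDIA's spec page.

def gflops_per_watt(cuda_cores: int, boost_ghz: float, tgp_watts: float) -> float:
    return (cuda_cores * 2 * boost_ghz) / tgp_watts

print(round(gflops_per_watt(2048, 1.477, 45), 2))  # ~134.44, the 45 W figure above
print(round(gflops_per_watt(2048, 1.155, 30), 2))  # ~157.7, the 30 W figure above
```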
lynx29: i'm still happy with my gtx 1070 setup overall. for some reason I gain like 20-30 fps when i hook it up to a monitor, and I have no idea why because I already have it set to "discrete" in the motherboard bios. so i should have same performance regardless. either way i am a happy camper. it runs FFXIV Endwalker on high settings at 120 fps at 1080p. been lots of fun, speaking of which i am going to go play some more now :rockout:
If you have the iGPU enabled, disable it in device manager and see if your FPS goes up on the laptop display.
#14
ARF
Nvidia has a history of lower image quality in 3D games and on the 2D desktop. This is due to lower quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products, I don't buy anything Nvidia..
#15
Space Lynx
Astronaut
ARF: Nvidia has a history of lower image quality in 3D games and on the 2D desktop. This is due to lower quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products, I don't buy anything Nvidia..
AMD to the moon!
#16
Chrispy_
Honestly, I'm done with chunky gaming laptops (have been personally for a decade already, getting to that point with recommendations and work purchases for 3D/VR presentation also).

All that matters is what nVidia can achieve in under 50W. If you want more than 50W of dGPU gaming in a portable form factor you pretty much NEED to sit down at a desk and find a power outlet, at which point you don't really need a laptop. A decade ago, travelling to a location for LAN play was a reasonable thing, but the world has changed dramatically since then, COVID being just one of the factors for this shift in trends.

As far as I can tell, Nvidia don't really have any compelling Ampere chips in the sub-50W range yet. Turing made do with the 1650 Max-Q variants at 35W which were barely fast enough to justify themselves over some of the far more efficient APUs like the 4700U and 5700U which at 25W all-in allowed for far slimmer, cooler, quieter, lighter, longer-lasting laptops that can still game.

Unfortunately these are still based on the GA107 and we've already seen how disappointing and underwhelming that is at lower TDPs - the 3050 isn't exactly an easy performance recommendation even at the default 80W TDP. A cut-down version, running far slower than that? It may struggle to differentiate itself from the ancient 1650, and any wins it gains over the old 1650 will likely come at the cost of performance/Watt, which kills the appeal.

I think the real problem with Ampere is that Samsung 8nm simply isn't very power-efficient. Yes, it's dense, but performance/Watt isn't where it shines. Sadly, there aren't any compelling TSMC 7nm GPUs making their way into laptops right now so we're all stuck with 2019-esque performance/Watt solutions.
#17
AusWolf
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
#18
Caring1
AusWolf: What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:
Won't that improve its compute power?
#19
r9
I'm really looking forward to nvidia releasing Riva tnt again.
#20
Dr. Dro
ARF: Nvidia has a history of lower image quality in 3D games and on the 2D desktop. This is due to lower quality fonts, and damaged, fake, dull colours and lower texture resolution in 3D.

I would stay away from their products, I don't buy anything Nvidia..
I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between both vendors' image quality when outputting the same signal and the same pixel format. What may cause this is incorrect settings reported by either EDID or manual user input, as NVIDIA supports chroma compression in older hardware generations (including YCbCr 4:2:0 and 4K 60Hz on Kepler GPU with DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays every monitor and every GPU should be capable of displaying a full-range RGB at 8 bpc signal at a minimum.
#21
ARF
Dr. Dro: I honestly thought the AMD vs. NVIDIA color accuracy conspiracy died off a decade ago. :kookoo:

There are no differences between both vendors' image quality when outputting the same signal and the same pixel format. What may cause this is incorrect settings reported by either EDID or manual user input, as NVIDIA supports chroma compression in older hardware generations (including YCbCr 4:2:0 and 4K 60Hz on Kepler GPU with DVI-D DL/HDMI 1.4 port), while AMD did not do so until Polaris.

Nowadays every monitor and every GPU should be capable of displaying a full-range RGB at 8 bpc signal at a minimum.
Well, I think the reason this topic "died off" is that users themselves use such different models of screens, monitors, displays and panels, with their own private settings, that an objective comparison is very difficult to perform.

But look at the green colour of the grass between RX 6900 XT on the right and RTX 3090 on the left.

RX 6900 XT vs RTX 3090 - Test in 8 Games l 4K l - YouTube
#22
Dr. Dro
ARF: Well, I think the reason this topic "died off" is that users themselves use such different models of screens, monitors, displays and panels, with their own private settings, that an objective comparison is very difficult to perform.

But look at the green colour of the grass between RX 6900 XT on the right and RTX 3090 on the left.
Maybe I am blind, maybe I have an RTX 3090... I can't tell the color of the grass apart? Nor could I ever tell a difference between this card and the Radeon VII I had before it? And you especially wouldn't be able to see it after video compression on YouTube to begin with, nor are the scenes and lighting in that comparison exactly identical - you would need two reference gradation images displayed on reference monitors to even begin to make any comparison valid.

Sorry bro, that stuff is hearsay... has always been. The color reproduction on both companies' GPUs is fully accurate when set correctly.
#23
Chrispy_
I'm fairly certain that the last time AMD vs Nvidia image quality came up people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.
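For the curious, that sort of check is easy to reproduce; here's a rough sketch using NumPy and Pillow with hypothetical filenames, where an all-zero XOR result means the two captures are pixel-identical.

```python
# Sketch: XOR two screencaps of the same frame (hypothetical filenames, same resolution).
# An all-zero result means identical frames; non-zero pixels mark rendering differences.
import numpy as np
from PIL import Image

a = np.array(Image.open("frame_amd.png").convert("RGB"), dtype=np.uint8)
b = np.array(Image.open("frame_nvidia.png").convert("RGB"), dtype=np.uint8)

diff = np.bitwise_xor(a, b)
print("differing pixels:", np.count_nonzero(diff.any(axis=-1)))
Image.fromarray(diff).save("xor_diff.png")  # pure black output = identical frames
```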

In the bad old days there was ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that we had Nvidia very slow on the uptake with gamma-corrected AA for MSAA, that was the last real rendering difference between the two companies, around the time that AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia offering "Balanced" presets for driver settings where the game didn't request a specific setting, so many people unaware of this option, buried deep in the ancient Nvidia advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering.

  • A more serious issue even present today (tested on two HDMI displays with both a 2060 and 3060Ti) is that the driver detects the display device as limited RGB (16-235); if you don't realise that and game in a well-lit room, you could easily mistake it for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see youtubers/streamers capturing output where black is not black - clearly it's 16-235 in all three channels. If you look at an image-comparison video on YT of various side by side videos and the Radeon card looks to have better contrast, it's because of this messed up, unfixed bug in the drivers that is subtle enough for many people to miss (a rough sketch of the limited-to-full conversion follows below).
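To make the 16-235 point concrete, here's a rough sketch of the limited-to-full range expansion a correctly configured pipeline applies per channel; it's an illustration of the maths, not anything taken from the driver.

```python
# Sketch: expanding limited-range RGB (16-235) to full range (0-255) per channel.
# If the driver outputs one range and the display expects the other, this mapping is
# effectively skipped or applied twice - hence grey blacks and dull highlights.
def limited_to_full(value: int) -> int:
    scaled = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(scaled)))

print(limited_to_full(16))   # 0   - limited "black" maps to true black
print(limited_to_full(235))  # 255 - limited "white" maps to full white
print(limited_to_full(128))  # 130 - mid-grey shifts slightly
```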

#24
AusWolf
Chrispy_: I'm fairly certain that the last time AMD vs Nvidia image quality came up people put the matter to rest with binary XOR filters on screencaps of identical frames from repeatable benchmarks.

In the bad old days there was ATi's Quack3.exe and Nvidia's FX series cheating on aniso filtering. Both companies fixed those cheats once called out by the media. After that we had Nvidia very slow on the uptake with gamma-corrected AA for MSAA, that was the last real rendering difference between the two companies, around the time that AMD bought ATi.

There are however some exceptionally dumb issues with the Nvidia driver from this decade, one of which is still an issue even to this day:
  • A minor one that existed until quite recently (10-series cards) was that the driver defaults would drop filtering quality down a notch. That no longer seems to happen, but it was effectively Nvidia offering "Balanced" presets for driver settings where the game didn't request a specific setting, so many people unaware of this option, buried deep in the ancient Nvidia advanced 3D settings menu, were only getting "Quality" filtering, not "High quality" filtering.

  • A more serious issue even present today (tested on two HDMI displays with both a 2060 and 3060Ti) is that the driver detects the display device as limited RGB (16-235); if you don't realise that and game in a well-lit room, you could easily mistake it for muted colours and dull highlights. I know for sure that it doesn't happen with all HDMI displays on Nvidia cards, but it's a big enough problem that I often see youtubers/streamers capturing output where black is not black - clearly it's 16-235 in all three channels. If you look at an image-comparison video on YT of various side by side videos and the Radeon card looks to have better contrast, it's because of this messed up, unfixed bug in the drivers that is subtle enough for many people to miss.

There has always been an easy fix for these: never assume that a driver update preserved your settings. ;)
#25
Chrispy_
AusWolf: There has always been an easy fix for these: never assume that a driver update preserved your settings. ;)
Oh, forget "preserving your settings", this affects anyone installing an Nvidia GPU on an HDMI display; if you are unlucky and have a "winning" combo that auto-detects as limited dynamic range, it will be wrong automatically from the beginning and it'll go wrong again with every driver update without fail. You have to be aware of it, know to change it, and know that it'll always need resetting after each and every update.

I guess this is why so many videos still exist in shitty limited RGB with grey blacks and poor contrast. It's even spawned several AMD vs Nvidia comparison videos where the AMD video clearly has higher contrast than the Nvidia video and the ignorant vlogger is just presenting the two as "differences" when they simply need to set their drivers to output at full dynamic range.

Yes, they're ignorant, but no this is not their fault. It's Nvidia's fault and their drivers have been a trainwreck for years. I kind of wish I was running an AMD GPU right now because the Nvidia control panel makes me want to rant about negligence and bad UI every time I have to actually use it (which is mercifully almost never).