
NVIDIA Announces Three New Mobile GPUs With Spring 2022 Availability

There has also been an easy fix for these: never assume that a driver update preserved your settings. ;)
Oh, forget "preserving your settings", this affects anyone installing an Nvidia GPU on an HDMI display; if you are unlucky and have a "winning" combo that auto-detects as limited dynamic range it will be wrong automatically from the beginning and it'll re-wrong every driver update without fail. You have to be aware of it, know to change it, and know that it'll always need resetting after each and every update.

I guess this is why so many videos still exist in shitty limited RGB with grey blacks and poor contrast. It's even spawned several AMD vs Nvidia comparison videos where the AMD video clearly has higher contrast than the Nvidia video and the ignorant vlogger is just presenting the two as "differences" when they simply need to set their drivers to output at full dynamic range.

Yes, they're ignorant, but no this is not their fault. It's Nvidia's fault and their drivers have been a trainwreck for years. I kind of wish I was running an AMD GPU right now because the Nvidia control panel makes me want to rant about negligence and bad UI every time I have to actually use it (which is mercifully almost never).
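
For anyone wondering what "limited" actually does to the picture, here's a minimal sketch (plain Python, nothing Nvidia-specific; the function name is just illustrative) of the standard 16-235 video range versus the 0-255 full range, which is why blacks turn grey when the driver picks the wrong one:

# Illustrative only: squeeze an 8-bit full-range value (0-255) into the
# limited/video range (16-235), which is effectively what happens when the
# driver outputs "Limited" to a display expecting full range.
def full_to_limited(v):
    return round(16 + (v / 255) * (235 - 16))

for v in (0, 128, 255):
    print(v, "->", full_to_limited(v))
# 0 -> 16, 128 -> 126, 255 -> 235
# Black (0) is lifted to 16 and white (255) is capped at 235, hence the
# grey blacks and flat contrast described above.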
 
Nvidia has a history of lower image quality, both in 3D games and on the 2D desktop. This shows up as lower-quality fonts, dull and inaccurate colours, and lower texture resolution in 3D.

I would stay away from their products; I don't buy anything Nvidia.
Happy to learn about your personal AMD preferences under an Nvidia article.
 
What's the point of packing 2048 CUDA cores with an extremely low TDP and 64-bit memory bus? Would it not make more sense to use a smaller or (more) defective chip? :kookoo:

What's the point of a very weak discrete GPU on a laptop when iGPUs are getting almost-ish as fast anyway?

Updated the article accordingly. There weren't enough details available at the time I wrote it, and other sites hinted at what I originally wrote.

Also:

There were recent headlines of the upcoming AMD Rembrandt APUs doing 2700 on Time Spy, so what's the point of having a slightly better dGPU as well?
 
What's the point of a very weak discrete GPU on a laptop when iGPUs are getting almost-ish as fast anyway?



There were recent headlines of the upcoming AMD Rembrandt APUs doing 2700 on Time Spy, so what's the point of having a slightly better dGPU as well?
For Intel-based laptops? :confused:
 
Oh, forget "preserving your settings", this affects anyone installing an Nvidia GPU on an HDMI display; if you are unlucky and have a "winning" combo that auto-detects as limited dynamic range it will be wrong automatically from the beginning and it'll re-wrong every driver update without fail. You have to be aware of it, know to change it, and know that it'll always need resetting after each and every update.

I guess this is why so many videos still exist in shitty limited RGB with grey blacks and poor contrast. It's even spawned several AMD vs Nvidia comparison videos where the AMD video clearly has higher contrast than the Nvidia video and the ignorant vlogger is just presenting the two as "differences" when they simply need to set their drivers to output at full dynamic range.

Yes, they're ignorant, but no this is not their fault. It's Nvidia's fault and their drivers have been a trainwreck for years. I kind of wish I was running an AMD GPU right now because the Nvidia control panel makes me want to rant about negligence and bad UI every time I have to actually use it (which is mercifully almost never).
Like I said: Don't be lazy, check your settings. ;) Don't assume that any driver or program gets installed with your preferred settings by default - nvidia, AMD or otherwise.

As for the UI, I have no issues with it. It has been the same for the last 15 years, which to me, is just convenient. I hate re-learning to navigate menus just because some random UI engineer decided that I should. AMD's new drivers look nice, but their menu structure is overcomplicated, imo. It's like nvidia's Control Panel and GeForce Experience under one app. At least nvidia gives you the option to install and use them separately.
 
As for the UI, I have no issues with it. It has been the same for the last 15 years, which to me, is just convenient. I hate re-learning to navigate menus just because some random UI engineer decided that I should. AMD's new drivers look nice, but their menu structure is overcomplicated, imo. It's like nvidia's Control Panel and GeForce Experience under one app. At least nvidia gives you the option to install and use them separately.
Hadn't even realized that; time simply goes way too fast. I remember thinking "man, that new Nvidia panel sucks" when it came out back then.
 
Hadn't even realized that; time simply goes way too fast. I remember thinking "man, that new Nvidia panel sucks" when it came out back then.
I remember the Control Panel looking exactly the same with my 7800 GS on Windows XP as it does nowadays. Some people might consider that a negative, but I quite like its simplicity and function-oriented design. :)
 
I remember the Control Panel looking exactly the same with my 7800 GS on Windows XP as it does nowadays. Some people might consider that a negative, but I quite like its simplicity and function-oriented design. :)
"Don't fix it if it ain't broken" :)
 
For Intel-based laptops? :confused:

Tiger Lake is already doing between 1500 and 1800 on Time Spy; Alder Lake with DDR5/LPDDR5 and a more mature architecture should give it a nice boost (let's ballpark it at between 2000 and 2300). An extra chip to power and cool isn't worth it for roughly 25% to 40% more performance, imo.
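
Quick back-of-the-envelope check on those numbers (a sketch only; the iGPU scores are the ballpark figures above and the dGPU score is a hypothetical stand-in, not a measurement):

# Ballpark Time Spy figures from this thread; hypothetical_dgpu is a made-up
# placeholder for a low-end mobile dGPU score, purely for the arithmetic.
def uplift(base, target):
    return (target / base - 1) * 100  # percent faster than the baseline

hypothetical_dgpu = 2800
for igpu in (2000, 2300):  # projected Alder Lake iGPU range from above
    print(f"{hypothetical_dgpu} vs {igpu}: {uplift(igpu, hypothetical_dgpu):.0f}% uplift")
# ~40% over 2000 and ~22% over 2300 -- in the same ballpark as the
# "25% to 40%" quoted above.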
 