
NVIDIA Sneaks Less Powerful GeForce MX150 Variant Into Ultrabooks

But in this case it is the same chip; an overclocker's dream, you might say.

Very few laptops have that capability. You'd also have to assume that the motherboard/PSU can deliver that extra power. Laptops especially are rated to a specific spec for a reason. Unless it's a gaming-oriented system, I wouldn't risk it.
 
I ordered this and I am quite happy with it.
https://www.newegg.com/Product/Product.aspx?Item=N82E16834316267

It has its issues, but it's the only affordable and powerful 2-in-1 I can find, and traveling internationally makes having a 2-in-1 a godsend. I have my gripes, like Acer's locked BIOS. It took forever to get the CPU to run at a 25 W TDP through software, and there are some other things, like the IPS screen having bad backlight bleed, but nothing like some Lenovos' backlight bleed, so that's good.

The build quality is great too, but it has soldered RAM, so only 8 GB; a 16 GB version isn't in the wild.

I have many small beefs, like I always do with laptops, but this is shockingly powerful and nice for $900-1000 given the specs and quality.

I really don't know why they can't put better specs like this into ultrabooks. The MX150 is trash compared to my 2-in-1.

A -0.1 V undervolt and a 25 W TDP on that Intel CPU, with a 35 W short-boost TDP, is amazing. At 25 W it runs at about 85°C while playing War Thunder/NS2 in South Korea, and the GPU runs just fine. I haven't overclocked the GPU because I don't know the specs, but it was plenty fast enough to enjoy 1080p at 60 Hz!
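The 25 W sustained / 35 W short-boost behavior described here matches Intel's two-level power limit (PL1/PL2) scheme: the short boost can be held only until a moving average of package power climbs to the sustained limit. A toy Python sketch of that idea, using the wattages from the post; the time constant, starting average, and EWMA model are illustrative assumptions, not Intel's actual firmware:

```python
# Toy model of Intel's two-level power limit (PL1 = sustained, PL2 = short
# boost), using the 25 W / 35 W figures from the post. PL2 can be held only
# while an exponentially weighted moving average (EWMA) of package power
# stays below PL1. Time constant and starting average are illustrative.

def sustain_seconds(pl1=25.0, pl2=35.0, tau=28.0, dt=0.1):
    """Seconds the chip can draw PL2 before the power average reaches PL1."""
    avg = pl1 * 0.5  # assume the package was lightly loaded beforehand
    t = 0.0
    while avg < pl1:
        avg += (pl2 - avg) * (dt / tau)  # EWMA pulled toward the PL2 draw
        t += dt
    return round(t, 1)

print(sustain_seconds())  # roughly 20-25 s of 35 W boost before dropping back
```

Note that the closer PL2 sits to PL1, the longer the boost window stretches, which is why a short spike to 35 W doesn't break a 25 W thermal design.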
 
Typically, you won't see me say anything nice about nVidia, but I don't think they are in the wrong here. If the only thing different is the clocks, then nothing was 'snuck', although perhaps it should have a low-power moniker or something. Now, if they started jacking with core counts, shaders, or memory specs, then we have a problem.

The only thing I think they did wrong was not actually list official specs for the MX150. I see no reason not to list the minimum specs for the MX150.

You'd also have to assume that the motherboard/PSU can deliver that extra power. Unless it's a gaming-oriented system, I wouldn't risk it.

It's not really the extra power that I'd be worried about, but the extra heat.
 
And this only aids nVIDIA, I guess?
Bad yields? Making cash by naming a lower-spec GPU the same as one already out with more grunt?
Who knows... Does it affect me? No.
 
Bad yields? Making cash by naming a lower-spec GPU the same as one already out with more grunt?

I see this as Nvidia helping OEMs boost marketing and profits. I don't see how this benefits Nvidia... this would hurt Nvidia, not the OEMs.
 
The only thing I think they did wrong was not actually list official specs for the MX150.

I agree for the most part, but it is possible that another company will have an ultrabook with shittier thermals that runs even lower than what nVidia has posted. If anything, they should just post core count, shaders, CUDA cores, bus width, etc.
 
This ain't anything new, but they should stop doing it; it misleads consumers...

Same as an i7 in ultrabooks that can be dual-core...
 
I agree for the most part but it is possible that another company will have an ultrabook with shittier thermals and run even lower than what nVidia has posted.

From what I gather, nVidia is still setting the minimum specs for the lower power version. It is just up to the manufacturers to pick which version they want to use. If they can't design a laptop around the lower power version, and the dedicated graphics ends up thermal throttling below the base clock, then that isn't nVidia's or anyone else's fault but the laptop manufacturer's.

Also, even with the low power version's specs set in stone, we can still have laptops using the same specced GPUs getting different performance. We see that in the benchmarks in the first post. The reason being that some likely were getting too hot and dropping from the boost clock, while others were in better designed machines with better cooling and didn't drop as far away from the boost clock. But as long as they were staying at or above the base clock, that is acceptable.

And I'd be willing to bet that if you took the higher-performance MX150 and put it in any of those laptops that have the weaker version, the performance wouldn't really go up at all, because I'm guessing they all would thermal throttle and the performance would be the same. I mean, it is pretty obvious that even the ones with the weaker MX150 were still being limited by thermals.
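The base/boost argument above can be sketched as a tiny model: the sustained clock is whatever the chassis cooling can support, clamped to the range between base and boost. The clocks and watts-per-MHz figure below are made up for illustration, not MX150 specs; the point is that anything at or above the base clock counts as in-spec:

```python
# Sketch of the base/boost behavior described above: the sustained clock is
# whatever the chassis cooling can support, clamped to the [base, boost]
# range. Clocks and the watts-per-MHz figure are illustrative, not MX150
# specs; the point is that anything at or above base clock is "in spec".

def settle_clock(base_mhz, boost_mhz, cooling_w, w_per_mhz=0.015):
    """Highest clock the cooling can sustain, never reported below base."""
    sustainable_mhz = cooling_w / w_per_mhz
    return max(base_mhz, min(boost_mhz, sustainable_mhz))

# A well-cooled chassis holds full boost; a cramped 13" one settles lower,
# but the chip still advertises nothing below its base clock.
```

This is why two laptops with identically-specced GPUs can bench differently while both technically staying within spec.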
 
Also, even with the low power version's specs set in stone, we can still have laptops using the same specced GPUs getting different performance.

There are also BIOS settings designed to keep total system power under a given TDP. Some laptops have universal power limits that will downclock the CPU or GPU if total power draw exceeds X watts, so it might not even be thermal throttling. This was an issue with some laptops having hidden BIOS settings that throttled the whole system; I forget which laptops from years back had that.
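A minimal sketch of the hidden whole-system cap being described: if combined CPU + GPU draw exceeds the platform limit, firmware trims one budget until the total fits. The trim-GPU-first policy and the wattage floor here are hypothetical, just to make the mechanism concrete:

```python
# Minimal sketch of a whole-system power cap: if combined CPU + GPU draw
# exceeds the platform limit, trim the GPU budget first (a hypothetical
# policy), then the CPU, until the total fits under the cap.

def apply_system_cap(cpu_w, gpu_w, cap_w, gpu_floor_w=5.0):
    """Return (cpu_w, gpu_w) after enforcing the total-power cap."""
    if cpu_w + gpu_w <= cap_w:
        return cpu_w, gpu_w
    # trim the GPU down to its floor first, then trim the CPU
    gpu_w = max(gpu_floor_w, cap_w - cpu_w)
    cpu_w = min(cpu_w, cap_w - gpu_w)
    return cpu_w, gpu_w
```

So a GPU can get downclocked simply because the CPU is busy, with neither chip anywhere near its thermal limit.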
 
While that is said and done, where's the MX110?
The MX130 finally shows up, and it's in a build with an i7, at a higher price than an affordable ($600) laptop with an MX150 to boot.
I seriously wanted a cheap dedicated graphics chip so I don't have to rely on dual-channel memory, since a laptop at that price (under $400) would most likely come with single-channel.
 
So what about this one? :confused:
[GPU-Z screenshot]
 
The MX150 is the entry-level GPU geared towards OEMs, and it never had clock speed and TDP specs in place. It's rather interesting that Nvidia provides no ranges whatsoever on their page, though.
From varying sources, the TDP range is 10-36 W, with the standard implementation being 25 W?
 
Or do NVIDIA-slandering news titles work as clickbait?

It's the clickbait Nvidia-"bashing" news that annoys you? You know what annoys me? Your flame-war-inducing comments that I always see:
 
If I'm not mistaken, having the same GPU [short] model sold under different configurations is not new to the mobile scene. Notebook OEMs have been selling GPUs with less dedicated memory for a long time now, yet look at Nvidia's specs for those GPUs and you'll only see the highest capacity listed. Back then, it was the notebook makers' job to say how much memory the GPU had. I don't see this as any different, and Nvidia's materials on the matter clearly point to that fact...

I'd shake my pitchfork at the notebook brands first, personally.
 
Perhaps I misunderstood something: were the ultrabooks being sold with the better GPU before? If yes, then the whole "they did it to fit the power envelope" argument doesn't fly at all.

Assuming "yes" to the question above, the way I see it is not that nVidia changed the GPU to a less powerful version, but that they purposely left the GPU model's specs vague in the "original version" and can therefore change it to a substantially lower-performing model while keeping "the same specs". As such, you could theoretically buy 2 of these "exact same models" and have one perform 20+% better than the other. Dunno what you dudes call that, but I call it "legal fraud".
 
That's not misleading consumers... i7 is generally the top CPU in its designated category (now they have i9s). Are you trolling with that comment?


Absolutely not.
I've seen some replace a 6700K + GTX 1070 desktop with an i7 laptop with a 1070 and think it was just as good.
Yes, it happens A LOT!

Or think their i7 is on par with the desktop part.
 
Some laptops have universal power limits that will downclock the CPU or GPU if total power draw exceeds X watts.

Yeah, but after reading the reviews for the laptops in question here, it is obviously a heat issue. These 13" ultra-thins just don't have the space for adequate cooling; in a lot of them, the processors aren't able to hold their max turbos either.

Were the ultrabooks being sold with the better GPU before?

The answer is no. The higher power version seems to always be used in larger laptops, while the lower power version is in the 13" ultrabooks.
 
The answer is no. The higher power version seems to always be used in larger laptops, while the lower power version is in the 13" ultrabooks.

I see.

I take it, then, that those ultrabooks didn't exist at all with the "original" MX150 GPU and were only introduced when the "recent" MX150 came about. If so, then there's absolutely no problem, other than nVidia "trying" to confuse the potential buyer by not providing "proper" specs: had they provided the specs, this issue would not arise at all. Labeling 2 GPUs with substantially different performance the same (MX150) certainly does not help.
 
I take it then that those ultrabooks didn't exist at all with the "original" MX150 GPU and were only introduced when the "recent" MX150 came about.

Correct, the same model laptop never used different versions of the MX150.
 
It kind of is misleading when you have Dual-Core i7s and then sell Quad-Core i5s, both in laptops, both in the same product line.

I mean, if you present a normal person with the choice between the i5-7440HQ or the i7-7500U, they will think the i7 is the better processor if they just go by the name.
People being ignorant and stupid does not make a well-documented and well-formulated categorization system deceptive. Holy crap... I really expected better from you, newtekie.

Your last sentence is 100% bullcrap and patently false.

The name states i5: mid-tier features for its processor category. The 7 stands for the generation. The 440 has a clear meaning too, though a bit more complex. HQ/U also have meanings. So, going by the name, it is clear one is a ULV part while the other is a high-performance quad-core.

Can you be more dishonest?

Hell, here is a better explanation, straight from Intel. How in the world are you blaming Intel for people being blatantly stupid... this is why we can't have nice things.

https://www.intel.com/content/www/us/en/processors/processor-numbers.html
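The decoding rules from that Intel page can be applied mechanically. A sketch covering only the pre-8th-gen four-digit format (e.g. i5-7440HQ, i7-7500U); the suffix table below is abbreviated and its wording is my own paraphrase, and five-digit model numbers are not handled:

```python
import re

# Decode a pre-8th-gen Intel processor number into its parts: brand
# modifier (i3/i5/i7), generation digit, SKU digits, and line suffix.
# The suffix descriptions are abbreviated paraphrases, not official text.

SUFFIX_MEANING = {
    "U": "ultra-low power",
    "HQ": "high-performance quad-core",
    "K": "unlocked",
    "T": "power-optimized",
}

def decode(model):
    m = re.fullmatch(r"i([357])-(\d)(\d{3})([A-Z]{1,2})", model)
    if not m:
        raise ValueError(f"unrecognized model: {model}")
    tier, gen, sku, suffix = m.groups()
    return {
        "tier": f"i{tier}",                      # brand modifier
        "generation": int(gen),                  # leading digit
        "sku": sku,                              # remaining SKU digits
        "line": SUFFIX_MEANING.get(suffix, suffix),
    }

# decode("i7-7500U") -> a ULV chip; decode("i5-7440HQ") -> a quad-core
# part: exactly the distinction being argued about above.
```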
 
That's just the higher-clocked 1D10 variant overclocked even more, baby!
Yeah, I don't think Nvidia can be hanged for this; the OEM wanted something to fit in a 10 W TDP, so you get an underclocked MX150. Its clocks can easily be doubled, unless there's a hard TDP limit and/or these are the lower MX150 bins. Kinda like the RX 560D, IIRC :wtf:
 
Alright, let me show the problem with this MX150 when a company also has an MX130, and why I feel Nvidia should take a hit on this. The performance ranges are wide enough to edge comfortably into each other's territory. Take special note of that Cloud Gate 720p result: the extreme width of the red bar for the MX150, with min/max performance almost doubled in points. Also, Fire Strike 1080p: the MX130's average scores higher than the MX150's minimum. So much for a 'generational leap forward'... Keep in mind the MX130 is Maxwell, thus less efficient.

https://www.notebookcheck.net/GeForce-MX130-vs-GeForce-MX150_8132_8000.247598.0.html

 
Also, Fire Strike 1080p: the MX130's average scores higher than the MX150's minimum.

Yes, in a few synthetics it looks like there isn't much performance difference. But did you look at the actual game tests? In any of the tests that actually have a MX130 score, the MX150's minimums beat the MX130.

FF XV (Low, 720p): MX130 = 22.5 FPS, MX150 minimum = 25.2 FPS
Assassin's Creed Origins (Low, 720p): MX130 = 29 FPS, MX150 minimum = 42 FPS
Middle-earth: SoW (Low, 720p): MX130 = 43 FPS, MX150 minimum = 47 FPS
Rocket League (Low, 720p): MX130 = 94 FPS, MX150 minimum = 127 FPS

So, obviously, even the weaker version of the MX150 is still outperforming the MX130 in real world tests.
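For reference, the same per-game figures with the MX150-minimum advantage expressed as a percentage; the numbers are exactly those quoted above, nothing new is introduced:

```python
# The per-game figures quoted above, with the MX150-minimum advantage over
# the MX130 computed as a percentage. Numbers are exactly as listed in the
# post.

results = {  # game: (MX130 FPS, MX150 minimum FPS)
    "FF XV (Low, 720p)": (22.5, 25.2),
    "Assassin's Creed Origins (Low, 720p)": (29.0, 42.0),
    "Middle-earth: SoW (Low, 720p)": (43.0, 47.0),
    "Rocket League (Low, 720p)": (94.0, 127.0),
}

def advantage_pct(mx130_fps, mx150_min_fps):
    return round((mx150_min_fps / mx130_fps - 1) * 100, 1)

for game, (mx130, mx150_min) in results.items():
    print(f"{game}: MX150 minimum +{advantage_pct(mx130, mx150_min)}%")
```

Even the worst case (FF XV) gives the weaker MX150's floor a 12% lead over the MX130's average.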

But, again, the biggest part of this is likely going to come down to the laptop's thermals, which is why the more efficient, cooler-running MX150 performs better than the MX130 in the real-world tests. Even notebookcheck's own reviews of laptops with the weaker MX150 note that it will boost to over 1600 MHz when the laptop is cool. The problem is these 13" ultra-thins don't keep things cool for very long; the same reviews note that even the processor begins to lower its boost under load due to heat.

And the MX130's TDP is 30 W for the configuration that comes close to matching the weaker MX150, while the weaker MX150 has a TDP of only 10 W. That is why the weaker MX150 exists. If they put a top MX130 in one of these 13" laptops, it would throttle so hard it wouldn't even come close to scoring as well as it did in those synthetics, forget about the real-world tests. I doubt it would even be able to maintain its base clock.
 