
Mobile NVIDIA GeForce RTX GPUs Will Vary Wildly in Performance, Clocks Lowered Substantially

Joined
Mar 6, 2017
Messages
3,204 (1.24/day)
Location
North East Ohio, USA
System Name My Ryzen 7 7700X Super Computer
Processor AMD Ryzen 7 7700X
Motherboard Gigabyte B650 Aorus Elite AX
Cooling DeepCool AK620 with Arctic Silver 5
Memory 2x16GB G.Skill Trident Z5 NEO DDR5 EXPO (CL30)
Video Card(s) XFX AMD Radeon RX 7900 GRE
Storage Samsung 980 EVO 1 TB NVMe SSD (System Drive), Samsung 970 EVO 500 GB NVMe SSD (Game Drive)
Display(s) Acer Nitro XV272U (DisplayPort) and Acer Nitro XV270U (DisplayPort)
Case Lian Li LANCOOL II MESH C
Audio Device(s) On-Board Sound / Sony WH-XB910N Bluetooth Headphones
Power Supply MSI A850GF
Mouse Logitech M705
Keyboard Steelseries
Software Windows 11 Pro 64-bit
Benchmark Scores https://valid.x86.fr/liwjs3
So I ask (anyone) again... how should they advertise it? For those who are up in arms, what would be the best way to advertise this? How do you pin down performance metrics and clocks when they can vary by 500 MHz and several percent in performance from minimum to maximum? A 2080m is different from a 2080 Max-Q. It's a different name. I expect differences between a Honda Accord LE and the Special Edition as well. Do I know what those are? No, but I can look up the differences at the dealership or online (much like with this card: look up the specs and performance analysis).
You got me on that one, I don't know. You can only dumb things down so much.
 
Joined
Jul 3, 2018
Messages
847 (0.40/day)
Location
Haswell, USA
System Name Bruh
Processor 10700K 5.3Ghz 1.35v| i7 7920HQ 3.6Ghz -180Mv |
Motherboard Z490 TUF Wifi | Apple QMS180 |
Cooling EVGA 360MM | Laptop HS |
Memory DDR4 32GB 3600Mhz CL16 | LPDDR3 16GB 2133Mhz CL20 |
Video Card(s) Asus ROG Strix 3080 (2100Mhz/18Ghz)|Radeon Pro 560 (1150Mhz/1655Mhz)|
Storage Many SSDs, ~24TB HDD/8TB SSD
Display(s) S2719DGF, HP Z27i, Z24n| 1800P 15.4" + ZR30W + iPad Pro 10.5 2017
Case NR600 | MBP 2017 15" Silver | MSI GE62VR | Elite 120 Advanced
Audio Device(s) Lol imagine caring about audio
Power Supply 850GQ | Apple 87W USB-C |
Mouse Whatever I have on hand + trackpads (Lanchead TE)
Keyboard HyperX Origins Alloy idk
Software W10 20H2|W10 1903 LTSC/MacOS 11
Benchmark Scores No.
Wow! It's almost like laptop cooling has gotten worse over time, and given the power requirements of RTX cards, they have to limit them to these insane power targets!
What's funny is that the fully fledged RTX 2080 in the P775TM1 was only around 20 FPS faster than the 1080 in the same laptop, both with 9900Ks, 32 GB RAM, and a 150 W TDP.
 

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
Wow! It's almost like laptop cooling has gotten worse over time, and given the power requirements of RTX cards, they have to limit them to these insane power targets!
What's funny is that the fully fledged RTX 2080 in the P775TM1 was only around 20 FPS faster than the 1080 in the same laptop, both with 9900Ks, 32 GB RAM, and a 150 W TDP.

The problem is that the chips have become smaller but the TDP hasn't decreased as much (in proportion), meaning the heat is much more concentrated. That can be circumvented in a desktop environment with beefier coolers, but not in a laptop, where space is quite limited.
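The heat-concentration point above is really a power-density argument, and a two-line calculation makes it concrete. The die areas and TDPs below are illustrative placeholders, not actual GPU specs:

```python
def power_density(tdp_watts: float, die_area_mm2: float) -> float:
    """Heat flux (W/mm^2) the cooler must remove from the die."""
    return tdp_watts / die_area_mm2

# Illustrative numbers only: a shrink that cuts die area by 25%
# while trimming TDP by only ~11%.
old_chip = power_density(tdp_watts=180, die_area_mm2=400)  # 0.45 W/mm^2
new_chip = power_density(tdp_watts=160, die_area_mm2=300)  # ~0.53 W/mm^2
```

Even though the newer chip draws less total power, the cooler has to pull roughly 18% more heat out of every square millimeter, which is exactly the problem in a space-constrained laptop.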
 
Joined
Jan 8, 2017
Messages
8,860 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Larger processors at lower clocks tend to be much more power efficient. The fact that they have to pare back the clocks so much suggests these were hardly made with mobile in mind.
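The wide-and-slow efficiency claim follows from the usual first-order CMOS dynamic power model, P ≈ C·f·V²: frequency scales power linearly, but the voltage needed to hold a higher frequency scales it quadratically. A minimal sketch with made-up capacitance and voltage figures:

```python
def dynamic_power(c_eff: float, freq_hz: float, volts: float) -> float:
    """First-order CMOS dynamic power: P = C_eff * f * V^2 (watts)."""
    return c_eff * freq_hz * volts ** 2

C_EFF = 0.5e-9  # effective switched capacitance in farads, made up

# The same silicon run wide-and-slow vs. pushed to high clocks.
p_slow = dynamic_power(C_EFF, freq_hz=1.2e9, volts=0.80)  # 0.384 W
p_fast = dynamic_power(C_EFF, freq_hz=1.8e9, volts=1.05)  # ~0.99 W
```

Here a 50% clock increase costs roughly 2.6x the power, which is why a larger chip at lower clocks wins on performance per watt.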
 
Joined
Jun 22, 2014
Messages
428 (0.12/day)
System Name Desktop / "Console"
Processor Ryzen 5950X / Ryzen 5800X
Motherboard Asus X570 Hero / Asus X570-i
Cooling EK AIO Elite 280 / Cryorig C1
Memory 32GB Gskill Trident DDR4-3600 CL16 / 16GB Crucial Ballistix DDR4-3600 CL16
Video Card(s) RTX 4090 FE / RTX 2080ti FE
Storage 1TB Samsung 980 Pro, 1TB Sabrent Rocket 4 Plus NVME / 1TB Sabrent Rocket 4 NVME, 1TB Intel 660P
Display(s) Alienware AW3423DW / LG 65CX Oled
Case Lian Li O11 Mini / Sliger CL530 Conswole
Audio Device(s) Sony AVR, SVS speakers & subs / Marantz AVR, SVS speakers & subs
Power Supply ROG Loki 1000 / Silverstone SX800
VR HMD Quest 3
Ahhh, I did. I get you now. :)
...however, I do not agree with the toxicity levels and with people replying instead of reporting posts (not you, shur)... it's the same people creating the same toxic environment here. I need to find my ignore button and stop worrying about the sinking ship, since nobody else appears to.

Believe me, you're not the only one... And the ignore button has been broken for quite some time, unfortunately.

"more wood for the fires, loud neighbors....
......The toxicity of our city"
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
It works just fine. The problem is clicking on the 'show content' button... haha!

This thread looks much better without all the noise, bait, insults, and BS. Self moderating the BS FTW! :)
 
Joined
Sep 17, 2014
Messages
20,775 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Clock speed varies wildly not by SKU but by environmental conditions, right? That is nothing new either, right?

Ugh, so annoying when I try to read the first post to get this important information and it's nowhere to be found. Sooo tired of copy-paste news...

This isn't really different from desktops. Better cooling solutions allow for higher TDPs, which in turn allow for higher frequencies. The only difference is that manufacturers get more leeway to play with the TDP.

This summarizes it perfectly. And at the same time, manufacturers, along with Nvidia as the happy provider of model names and product stacks, love to play along. Why is a significantly worse performing card 'called' the same as its desktop 'equivalent' SKU? That is deceptive. It is the same deception as Intel calling dual cores an i7 because they have HT, while on desktop you find them as an i3. These companies know it is deceptive, and they know that it works. All the marketing they do contains these same product names. Nowhere will you see the clocks advertised alongside them, nor the performance gaps between desktop and laptop 'equivalents'. Only by reading the fine print will you discover the difference, and you don't know what you don't know.

We can debate Darwinism, and while I fully agree that people are required to do their research and due diligence, I still think it's a horrible practice.

As for 'how should they do it'... simple: take the closest-performing desktop equivalent model name and put that sticker on your laptop part. That goes for both the Max-Q and the regular variants. If I recall, Nvidia used to put an "M" at the end until Pascal. That was a less deceptive distinction than the way they have marketed things since Max-Q launched.
 
Joined
Nov 29, 2016
Messages
667 (0.25/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
"Users should do their due research when it comes to the point of choosing what mobile solution sporting one of these NVIDIA GPUs they should choose."

People should be doing that all the time.

Keep in mind we don't have many benchmarks comparing TDP with DXR on and off. Without DXR, it's possible these will offer decent performance (though definitely not desktop levels).

No, it doesn't work like that. There's no such thing as "DXR off". That core is part of the chip.
 
Joined
Mar 10, 2014
Messages
1,793 (0.49/day)
This summarizes it perfectly. And at the same time, manufacturers, along with Nvidia as the happy provider of model names and product stacks, love to play along. Why is a significantly worse performing card 'called' the same as its desktop 'equivalent' SKU? That is deceptive. It is the same deception as Intel calling dual cores an i7 because they have HT, while on desktop you find them as an i3. These companies know it is deceptive, and they know that it works.

We can debate Darwinism, and while I fully agree that people are required to do their research and due diligence, I still think it's a horrible practice.

As for 'how should they do it'... simple: take the closest-performing desktop equivalent model name and put that sticker on your laptop part. That goes for both the Max-Q and the regular variants. If I recall, Nvidia used to put an "M" at the end until Pascal. That was a less deceptive distinction than the way they have marketed things since Max-Q launched.

Uhh, I would argue that the current naming scheme is way better than it used to be. At least Nvidia is now really using the same chips in same-named SKUs. E.g., in Fermi times the GTX 580M was not even a GF110 GPU; it was a tier lower, GF114.
 
Joined
Nov 29, 2016
Messages
667 (0.25/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
People need to stop comparing Max-Q cards vs. regular cards vs. desktop cards. A 1080 Max-Q and a regular 1080 in a laptop didn't have the same performance, and both are slower than the desktop version of the 1080. That's common sense. The desktop version has better cooling and better power delivery, and it can boost longer.

This stupid article is just getting people to argue about stupid things that are obvious. The performance will vary because some laptops have bad cooling, etc.

Max-Q defines the size of the card, and it is different from the mobile counterparts, correct. This is nothing new. Max-Q designs are typically found in less expensive but more power efficient, thin models. It was expected when they launched last gen, and nothing has changed here. They are a different tier compared to their more robust mobile and especially desktop counterparts. This should not be a surprise. :)

This, from NVIDIA, explicitly explains what Max-Q is: https://www.nvidia.com/en-us/geforce/gaming-laptops/max-q/
or this: https://blogs.nvidia.com/blog/2018/12/14/what-is-max-q/

Why we are comparing last-gen desktop cards with this gen's Max-Q is beyond me...

Anyway, enjoy, gents. :)

Thank god there's someone who understands...
 

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.35/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
Joined
Jul 3, 2018
Messages
847 (0.40/day)
Location
Haswell, USA
System Name Bruh
Processor 10700K 5.3Ghz 1.35v| i7 7920HQ 3.6Ghz -180Mv |
Motherboard Z490 TUF Wifi | Apple QMS180 |
Cooling EVGA 360MM | Laptop HS |
Memory DDR4 32GB 3600Mhz CL16 | LPDDR3 16GB 2133Mhz CL20 |
Video Card(s) Asus ROG Strix 3080 (2100Mhz/18Ghz)|Radeon Pro 560 (1150Mhz/1655Mhz)|
Storage Many SSDs, ~24TB HDD/8TB SSD
Display(s) S2719DGF, HP Z27i, Z24n| 1800P 15.4" + ZR30W + iPad Pro 10.5 2017
Case NR600 | MBP 2017 15" Silver | MSI GE62VR | Elite 120 Advanced
Audio Device(s) Lol imagine caring about audio
Power Supply 850GQ | Apple 87W USB-C |
Mouse Whatever I have on hand + trackpads (Lanchead TE)
Keyboard HyperX Origins Alloy idk
Software W10 20H2|W10 1903 LTSC/MacOS 11
Benchmark Scores No.
The problem is that the chips have become smaller but the TDP hasn't decreased as much (in proportion), meaning the heat is much more concentrated. That can be circumvented in a desktop environment with beefier coolers, but not in a laptop, where space is quite limited.
The size of dies has gone up, as has the TDP, while cooling capacity has gone down.
Makes me miss the days of the Clevo X7200 and the M17x R2...
 

bug

Joined
May 22, 2015
Messages
13,161 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
This summarizes it perfectly. And at the same time, manufacturers, along with Nvidia as the happy provider of model names and product stacks, love to play along. Why is a significantly worse performing card 'called' the same as its desktop 'equivalent' SKU? That is deceptive. It is the same deception as Intel calling dual cores an i7 because they have HT, while on desktop you find them as an i3. These companies know it is deceptive, and they know that it works. All the marketing they do contains these same product names. Nowhere will you see the clocks advertised alongside them, nor the performance gaps between desktop and laptop 'equivalents'. Only by reading the fine print will you discover the difference, and you don't know what you don't know.

We can debate Darwinism, and while I fully agree that people are required to do their research and due diligence, I still think it's a horrible practice.

As for 'how should they do it'... simple: take the closest-performing desktop equivalent model name and put that sticker on your laptop part. That goes for both the Max-Q and the regular variants. If I recall, Nvidia used to put an "M" at the end until Pascal. That was a less deceptive distinction than the way they have marketed things since Max-Q launched.
Well, from an engineering and historical point of view, mobile GPUs used to be quite different from their desktop counterparts. Hell, oftentimes the mobile GPUs were from a whole previous generation. Nvidia first started using the same designs across product lines, and now, for the first time, they can actually sell you GPUs running the same configuration as their desktop siblings. To me, the constraints of a laptop are well known, and I fully expect anything that runs on a laptop to do so considerably slower than on a desktop (part of the reason I still hate laptops and my main rig is a fully fledged desktop), but if that is confusing or deceptive to you... well, there's nothing for me to add.
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
This summarizes it perfectly. And at the same time, manufacturers, along with Nvidia as the happy provider of model names and product stacks, love to play along. Why is a significantly worse performing card 'called' the same as its desktop 'equivalent' SKU? That is deceptive. It is the same deception as Intel calling dual cores an i7 because they have HT, while on desktop you find them as an i3. These companies know it is deceptive, and they know that it works. All the marketing they do contains these same product names. Nowhere will you see the clocks advertised alongside them, nor the performance gaps between desktop and laptop 'equivalents'. Only by reading the fine print will you discover the difference, and you don't know what you don't know.

We can debate Darwinism, and while I fully agree that people are required to do their research and due diligence, I still think it's a horrible practice.

As for 'how should they do it'... simple: take the closest-performing desktop equivalent model name and put that sticker on your laptop part. That goes for both the Max-Q and the regular variants. If I recall, Nvidia used to put an "M" at the end until Pascal. That was a less deceptive distinction than the way they have marketed things since Max-Q launched.

There's a distinct difference between the old M GPUs and the new Max-Q ones. In Maxwell and earlier, the chips would actually be different between, say, a 970 and a 970M; there would usually be a specific mobility chip that was wider and lower clocked. Starting with Pascal, the chips got so efficient that Nvidia just kept the same chip under the same branding. They would reduce the clocks slightly, but the performance was generally the same. Then the Max-Q chips came and dropped the clocks and voltages even further, but it is still the same chip.

I have no problem with them selling the same chip in a graphics card and a laptop under the same branding, and Max-Q is just a factory underclocked and undervolted part. I do agree that they need a better distinction between the performance levels, but there's a reason they don't put an "M" after these chips: they are the desktop chips in a mobile form factor.
 
Joined
Feb 12, 2015
Messages
1,104 (0.33/day)
Well, the approach they took with the MX150 was incredibly successful, so why wouldn't they?

On the surface, the MX150 is a cheap 25 W mobile card that should theoretically outperform its desktop counterpart (the GT 1030) by a solid 5-10%. However, the most prevalent version of the MX150 was actually limited to an 8 W TDP. I have a laptop with one of these, and in its stock configuration it is almost worse than having Intel graphics, because it starts jumping between 500 MHz and 1200 MHz after 5 minutes of gaming (causing immense stutter).

However, if you force the clocks to stay at 1150 MHz, overclock the memory, and undervolt it substantially, it can actually match the GT 1030 within an 8 W profile. Very cool, but annoying for non-overclockers. I expect the same of these RTX laptop cards.

Nvidia wants to be able to say they can fit an RTX 2060 in an ultrabook... even if it will perform like an AMD APU under prolonged load, lol. But for those up to the challenge, I bet you can get it close to 1070 performance after undervolting.
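The lock-the-clocks-and-undervolt trick described above can be sanity-checked with a first-order model (dynamic power roughly proportional to f·V²). The voltages here are hypothetical round numbers, not measured MX150 values:

```python
def rel_power(freq_mhz: float, volts: float) -> float:
    """Relative dynamic power, proportional to f * V^2 (arbitrary units)."""
    return freq_mhz * volts ** 2

stock = rel_power(1468, 1.00)  # stock boost clock at a nominal 1.00 V
tuned = rel_power(1150, 0.80)  # locked clock, undervolted

clock_loss = 1 - 1150 / 1468       # ~22% lower clock
power_saved = 1 - tuned / stock    # ~50% lower dynamic power
```

Giving up roughly a fifth of the clock buys back about half the dynamic power, which is how a tuned card can sit inside an 8 W envelope that the stock boost behavior keeps overshooting.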

Uhh, I would argue that the current naming scheme is way better than it used to be. At least Nvidia is now really using the same chips in same-named SKUs. E.g., in Fermi times the GTX 580M was not even a GF110 GPU; it was a tier lower, GF114.

Even funnier, the GTX 480M was a GTX 465 running at (no joke) only around 300-400 MHz, if I remember correctly, lol. It had incredible reliability and overheating issues (like all of Fermi) while only really being usable in laptops the size of a desktop.
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Joined
Mar 10, 2014
Messages
1,793 (0.49/day)
I have a notebook with an MXM slot; how do I buy one of these GPUs when they launch?

Contact your notebook vendor to find out if it's even possible; they usually don't want to make upgrading possible. If it is possible, there are sites like Eurocom that sell upgrade kits, which cost a lot (e.g., a GTX 1070 with all the needed heatsinks and mounting bits costs $1k). In fact, it's so expensive that you would be better off buying a new notebook anyway.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Nope, I have no intention of waiting. It's better to get a desktop GPU in an external box and run it with nvidia-smi -pl set to half the supported power.
Why would you do that when an external box can handle a full-size card and its cooling?
 
Joined
Jul 3, 2018
Messages
847 (0.40/day)
Location
Haswell, USA
System Name Bruh
Processor 10700K 5.3Ghz 1.35v| i7 7920HQ 3.6Ghz -180Mv |
Motherboard Z490 TUF Wifi | Apple QMS180 |
Cooling EVGA 360MM | Laptop HS |
Memory DDR4 32GB 3600Mhz CL16 | LPDDR3 16GB 2133Mhz CL20 |
Video Card(s) Asus ROG Strix 3080 (2100Mhz/18Ghz)|Radeon Pro 560 (1150Mhz/1655Mhz)|
Storage Many SSDs, ~24TB HDD/8TB SSD
Display(s) S2719DGF, HP Z27i, Z24n| 1800P 15.4" + ZR30W + iPad Pro 10.5 2017
Case NR600 | MBP 2017 15" Silver | MSI GE62VR | Elite 120 Advanced
Audio Device(s) Lol imagine caring about audio
Power Supply 850GQ | Apple 87W USB-C |
Mouse Whatever I have on hand + trackpads (Lanchead TE)
Keyboard HyperX Origins Alloy idk
Software W10 20H2|W10 1903 LTSC/MacOS 11
Benchmark Scores No.
I have a notebook with an MXM slot; how do I buy one of these GPUs when they launch?
They likely won't fit in whatever laptop you have unless it's one of the most recent Clevos or a GT7x/GT8x from MSI.

Contact your notebook vendor to find out if it's even possible; they usually don't want to make upgrading possible. If it is possible, there are sites like Eurocom that sell upgrade kits, which cost a lot (e.g., a GTX 1070 with all the needed heatsinks and mounting bits costs $1k). In fact, it's so expensive that you would be better off buying a new notebook anyway.
Eurocom is hot steaming garbage.
You can get a proper 1070 MXM card for around the $500 mark. Still not great, but good luck finding a laptop with a 1070 for around $500.
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
This really isn't complicated.

GTX 10-Series GPUs in laptops and mobile devices already had to be downclocked to avoid thermal and power consumption issues.

If the RTX chips are hotter and more power hungry, then the downclocking will have to be more aggressive assuming cooling stays the same, otherwise they will simply burn.

End of story. Done. Finished. That's that. The laws of physics dictate no less.

The laws of physics, well, thermodynamics, don't change... but the comparison is not apples to apples. What are we comparing?

We can compare the models by series, in which case the 2080 will likely use more power than the 1080. But an apples-to-apples comparison must be on the basis of performance delivered, in which case the 2080 should be compared with something close in performance, such as the 1080 Ti.

MSI 1080 Ti Gaming = 282 watts peak gaming per TPU testing
MSI 2080 Gaming = 244 watts peak gaming per TPU testing
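Taking the two TPU peak-gaming figures quoted above at face value, and assuming (as the post argues) that the 2080 and 1080 Ti land at roughly equal performance, the efficiency gap is easy to put in numbers:

```python
w_1080ti = 282.0  # MSI GTX 1080 Ti Gaming, peak gaming watts (TPU testing)
w_2080 = 244.0    # MSI RTX 2080 Gaming, peak gaming watts (TPU testing)

# At roughly equal performance, the 2080 gets there on ~13.5% less power.
power_saving_pct = (w_1080ti - w_2080) / w_1080ti * 100
```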

Of course, manufacturers and vendors will do their best to hide which of the 20x0-series GPUs is being used. And nVidia is supplying what their clients are asking for; people are going to choose a stylish laptop and will want 2080 performance. The ones who won't look past the web price or store placard will go home with a thin mass-market lappie with a heavily downclocked 2080. Others will spend less on a well-designed, preferably custom-built laptop with a hefty cooling system and a 2070 with similar performance.

Be like an engineer: forget about "sexy" thin laptops, purchase from a quality outfit known for high performance, or better yet go custom built, and verify that you are getting what you want... If it doesn't weigh 5.5 pounds (15") or 7.5+ pounds (17"), it's not likely going to support Max-Q AND deliver long battery life.

RTX-OPS - 37T versus 53T
Giga Rays/s - 5 versus 7
Boost Clock (MHz) - 1095 versus 1590
Base Clock (MHz) - 735 versus 1380
Thermal Design Power - 80 W versus 150+ W

Here are MSI's offerings... I had to configure more extras than I'd like to get apples to apples.

MSI GS75 Stealth 202- $3,227
17.3" FHD, IPS-Level 144Hz 3ms 100%sRGB 72%NTSC
NVIDIA GeForce RTX 2080 8GB GDDR6 w/ Qmax
8th Generation Intel® Core™ Coffee Lake i7-8750H
16GB (1x16GB) DDR4 2666MHz SO-DIMM Memory
512GB Samsung 970 PRO M.2 NVMe Solid State Drive
2TB Western Digital BLUE SATA III Solid State Disk Drive
802.11AC wireless LAN + Bluetooth
4.96 lbs with 4-cell Battery

MSI GE75 Raider 048 - $3,198
17.3" FHD, IPS-Level 144Hz 3ms 100%sRGB 72%NTSC
NVIDIA GeForce RTX 2080 8GB GDDR6
8th Generation Intel® Core™ Coffee Lake i7-8750H
16GB (1x16GB) DDR4 2666MHz SO-DIMM Memory
512GB Samsung 970 PRO M.2 NVMe Solid State Drive
2TB Western Digital BLUE SATA III Solid State Disk Drive
802.11AC wireless LAN + Bluetooth
5.75 lbs with 6-cell Battery

So the Max-Q is an extra $29, but the battery is smaller... I'd be concerned about the extra power combined with the smaller battery and perhaps less robust cooling.
 