Friday, May 1st 2020

Schenker Announces XMG Ultra Laptop Featuring up to Intel Core i9-10900K and up to NVIDIA RTX 2080 SUPER

Schenker today announced the release of their XMG Ultra laptops, which have been purpose-built as desktop alternatives. This means there are no limitations on hardware, and that portability or battery life aren't crucial factors - power is. To that end, the XMG Ultra launches with a Z490-based motherboard and support for up to a ten-core Intel Core i9-10900K (the Core i7-10700K, with eight cores, and the Core i7-10600K, with six cores, are also available). You can pair these CPUs with NVIDIA's RTX 2060 SUPER, 2070 SUPER, or 2080 SUPER for close to the best performance currently available in a laptop. You can also configure your XMG Ultra with up to 128 GB of system RAM.

Display options include a 17.3" 1080p, 240 Hz G-Sync panel or an Ultra HD G-Sync panel with the same diagonal. The XMG Ultra's claim to fame is that it is the first announced laptop with a 10-core Intel solution. And, since you'd be hard-pressed to find an AMD offering that packs a comparable CPU with these same graphics solutions (OEMs, for some reason, have maxed out AMD CPU + NVIDIA GPU combos at an RTX 2070 non-SUPER graphics card), this may be your best bet at getting a decent CPU paired with maximum mobile GPU power. Bear in mind that a fairly standard configuration will, however, set you back some €2,799.
Sources: XMG via Bestware, via Videocardz

16 Comments on Schenker Announces XMG Ultra Laptop Featuring up to Intel Core i9-10900K and up to NVIDIA RTX 2080 SUPER

#1
Tomgang
Why am I afraid we're dealing with a hot, noisy mess here...

I think I would rather have that laptop with a Ryzen 9 3950X in eco mode, if I had to take any laptop with a desktop CPU.
Posted on Reply
#2
P4-630
Raevenlord said: "and the Core i7-10600K, with six cores"
i5-10600K you mean.
Posted on Reply
#3
Tom Yum
I'm sure you could pair a 10900K and a 2080 Super, but given the 10900K is rumoured to consume 250 W at boost and the 2080 Super is 80-150 W, this thing becomes so thermally limited so quickly that you might as well option a 10600 and a 2070; they'd perform the same after 20 seconds or so.
Posted on Reply
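A rough sketch of the thermal-budget argument above, in Python. The chassis dissipation figure is an assumption for illustration; the 250 W boost figure, the 150 W GPU figure, and the ~20-second window echo the post.

# Rough illustration of the thermal-budget argument: once sustained cooling
# capacity is fixed, a bigger CPU only helps for as long as the boost lasts.
# The chassis figure is an assumption; the other figures echo the post above.

CHASSIS_SUSTAINED_W = 230   # assumed total heat a large DTR chassis can shed continuously
GPU_SUSTAINED_W = 150       # upper end of the 80-150 W range quoted for the 2080 SUPER
CPU_BOOST_W = 250           # rumoured 10900K boost power
CPU_BOOST_SECONDS = 20      # rough boost window before thermals clamp it

cpu_sustained_w = CHASSIS_SUSTAINED_W - GPU_SUSTAINED_W  # what the CPU keeps long-term

print(f"CPU power during boost: {CPU_BOOST_W} W (for roughly {CPU_BOOST_SECONDS} s)")
print(f"CPU power once the chassis saturates: {cpu_sustained_w} W")

# At ~80 W sustained, a 10-core and a 6-core part converge on a similar clock
# envelope, which is the "they'd perform the same after 20 seconds" point.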
#4
watzupken
I think this combination will cause the laptop to melt. The main problem will be keeping both components cool enough that they work at optimal performance. In most cases, I feel the CPU and GPU will be heavily throttled due to insufficient cooling. So it makes no sense to go for this option, as opposed to a lower-end spec which will likely be able to keep up with less throttling.
Posted on Reply
#5
Lokran88
Tom Yum said: "I'm sure you could pair a 10900K and a 2080 Super, but given the 10900K is rumoured to consume 250 W at boost and the 2080 Super is 80-150 W, this thing becomes so thermally limited so quickly that you might as well option a 10600 and a 2070; they'd perform the same after 20 seconds or so."
I still don't get why everyone is saying it's going to draw that much power when boosting in normal scenarios. I mean, I can even push my overclocked 8700K to draw 200 watts, in Prime95. But that is not what I am doing all day long. In triple-A titles or other heavier workloads, the average consumption according to HWiNFO stays at around 70 watts at 5 GHz all-core, with spikes up to 120 watts. Now, adding four cores and even higher boost, the 10900K will obviously draw more, and definitely too much for what would be possible on a different node, but I still personally don't think it's going to have this insane 250-watt power draw in daily tasks, not talking about small FFTs in Prime95.
Posted on Reply
#6
dont whant to set it"'
Does it come with a portable electricity-generating NPP at that price, plus a virtually endless supply of dry ice or LN2? Oh, and the cooling pots, how could I have missed those?
Posted on Reply
#7
bug
What about battery life? Are we still talking hours? Minutes? Seconds? Blink and you'll miss it?

Joking aside, all the drawbacks of a DTR that have been with us for years still apply.
Posted on Reply
#8
matar
All that power for FHD 1080p? It needs to be 4K, or at least 1440p, then OK. I have an Aorus X7 DT with an i7-6820HK, 16 GB DDR4, and a full desktop GTX 980 with 8 GB of video RAM (yes, 8 GB), and I can run any game at 1080p maxed out, super easy.
resources.aorus.com/Product/Spec/X7%20DT
Posted on Reply
#9
bug
matar said: "All that power for FHD 1080p? It needs to be 4K, or at least 1440p, then OK. I have an Aorus X7 DT with an i7-6820HK, 16 GB DDR4, and a full desktop GTX 980 with 8 GB of video RAM (yes, 8 GB), and I can run any game at 1080p maxed out, super easy."
resources.aorus.com/Product/Spec/X7 DT
You're probably expected to plug it into a real monitor.
Posted on Reply
#10
QUANTUMPHYSICS
I'm of the belief that you absolutely don't need any GPU in a laptop more powerful than the RTX 2060.

Those laptops are between $899 and $1500 on average and for gaming you really don't need much more - unless you can get a 2070 on sale.

The 2080s are really for people using laptops as portable desktop replacements, but only if they're connecting to a low-latency 4K monitor.
Posted on Reply
#11
bonehead123
WORD(s):

1) Burn baby burn...
2) LN2 Chill pad...
3) Poof.....
Posted on Reply
#12
goodeedidid
2004 called and they want their laptop back.
Posted on Reply
#13
Tom Yum
Lokran88 said: "I still don't get why everyone is saying it's going to draw that much power when boosting in normal scenarios. I mean, I can even push my overclocked 8700K to draw 200 watts, in Prime95. But that is not what I am doing all day long. In triple-A titles or other heavier workloads, the average consumption according to HWiNFO stays at around 70 watts at 5 GHz all-core, with spikes up to 120 watts. Now, adding four cores and even higher boost, the 10900K will obviously draw more, and definitely too much for what would be possible on a different node, but I still personally don't think it's going to have this insane 250-watt power draw in daily tasks, not talking about small FFTs in Prime95."
Intel themselves have said 250 W is their recommended power requirement for boost power draw; from AnandTech: "Not only that, despite the 125 W TDP listed on the box, Intel states that the turbo power recommendation is 250 W." And yes, you don't stay in boost all day, but half the point of these higher-end processors (and particularly 10th gen) is their higher boost. If you discount that, you might as well ignore the 10900K and go with the 10700K or 10600K and have the same or higher sustained performance, which was my point.
Posted on Reply
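For context on the 125 W vs. 250 W split discussed above: Intel exposes a long-term power limit (PL1, the advertised TDP), a short-term limit (PL2), and a time constant (tau) that governs how long the averaged package power may sit above PL1. The sketch below is a simplified model of that behaviour; the 56-second tau and the constant full-load draw are assumptions, only the 125 W / 250 W values come from the quote above.

# Simplified model of Intel's PL1/PL2/tau turbo scheme: the chip may draw up to
# PL2 while an exponentially weighted average of package power stays below PL1;
# once the average catches up, sustained draw is clamped back to PL1.
# PL1/PL2 match the figures quoted above; tau and the workload are assumptions.

PL1 = 125.0   # long-term power limit (the advertised TDP), watts
PL2 = 250.0   # short-term turbo power limit, watts
TAU = 56.0    # assumed averaging window, seconds

def boost_duration(seconds=120, dt=1.0):
    avg = 0.0                                # running average package power
    alpha = dt / TAU
    for step in range(int(seconds / dt)):
        draw = PL2 if avg < PL1 else PL1     # boost only while the budget allows
        avg += alpha * (draw - avg)
        if draw == PL1:
            return step * dt                 # time at which boost collapses to PL1
    return None                              # still boosting after `seconds`

t = boost_duration()
print(f"under a constant full load, turbo at {PL2:.0f} W lasts roughly {t:.0f} s, "
      f"after which the package is held near {PL1:.0f} W")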
#14
Lokran88
Tom Yum said: "Intel themselves have said 250 W is their recommended power requirement for boost power draw; from AnandTech: 'Not only that, despite the 125 W TDP listed on the box, Intel states that the turbo power recommendation is 250 W.' And yes, you don't stay in boost all day, but half the point of these higher-end processors (and particularly 10th gen) is their higher boost. If you discount that, you might as well ignore the 10900K and go with the 10700K or 10600K and have the same or higher sustained performance, which was my point."
I get your point, and of course you are right that Intel themselves recommend these 250 watts.
I guess I am just being too optimistic, thinking that it's more like a worst-case scenario and that it is not likely to reach these levels of power consumption when it's boosting to 5.3 GHz on one or two cores, doing all-core boost in games, or just running some daily tasks in Windows.
But of course heavier workloads will obviously need this much power if Intel says so.
Posted on Reply
#15
watzupken
bug said: "What about battery life? Are we still talking hours? Minutes? Seconds? Blink and you'll miss it?

Joking aside, all the drawbacks of a DTR that have been with us for years still apply."
Battery life is sometimes not a good metric. Based on my experience, the CPU and GPU tend to throttle a lot when the machine is running on battery power. So it depends on how aggressive the throttling is, which can artificially boost battery life. Sadly, most reviews show performance when the laptops are connected to the mains, and determine battery life by looping some light work/benchmark, which does not expose the loss in performance due to the heavy throttling.
Lokran88 said: "I still don't get why everyone is saying it's going to draw that much power when boosting in normal scenarios. I mean, I can even push my overclocked 8700K to draw 200 watts, in Prime95. But that is not what I am doing all day long. In triple-A titles or other heavier workloads, the average consumption according to HWiNFO stays at around 70 watts at 5 GHz all-core, with spikes up to 120 watts. Now, adding four cores and even higher boost, the 10900K will obviously draw more, and definitely too much for what would be possible on a different node, but I still personally don't think it's going to have this insane 250-watt power draw in daily tasks, not talking about small FFTs in Prime95."
Whether it's drawing 120 W or 250 W, it is physically impossible for a laptop to dissipate that much heat, especially when you have another high-end GPU adding to the heat output, which makes it doubly hard. I've owned a 17-inch DTR before, and I can tell you it is not possible to even get close to sustaining 4 GHz on all cores of a four-core processor, let alone a 10-core chip. They can slap as many heatpipes as they want in the laptop, but the bottleneck is the heatsink, which is tiny compared even to a humble Cooler Master Hyper 212, which itself fails to keep a 9900K cool.

250 W is highly possible on a desktop with some high-end cooling, since Intel is advertising a 4.7 GHz all-core turbo (I am sure it is going to need more than 250 W, since most 9900Ks at a 5 GHz all-core turbo are already drawing around 250 W).
Posted on Reply
#16
bug
Lokran88 said: "I get your point, and of course you are right that Intel themselves recommend these 250 watts.
I guess I am just being too optimistic, thinking that it's more like a worst-case scenario and that it is not likely to reach these levels of power consumption when it's boosting to 5.3 GHz on one or two cores, doing all-core boost in games, or just running some daily tasks in Windows.
But of course heavier workloads will obviously need this much power if Intel says so."
I guess what people fail to grasp is that these 250 W (and other figures, for various chips) are only for transient conditions. As long as the board supports the listed TDP, the chips will still work normally. They just won't boost as high, when given the opportunity, if the board can't supply the juice.

I know it looks like shady tactics from Intel, but the truth is, CPUs are just getting that complicated. AMD is in a better position, since their power scheme seems to draw all there is from the CPU out of the box; there's nothing left that increasing the TDP could help with.
Posted on Reply