
Intel 11th Gen Core "Tiger Lake" & Xe Graphics Launch Event: Live Blog

So why aren't there any desktop models with this 10 nm Willow Cove business? Do we seriously have to wait until next year for the socket 1200 refresh?
 
Did anybody actually look at the side notes?
Did Intel just compare an Intel system running 4267 MHz memory against an AMD system running 3200 MHz?

Ohhhh, that's a big no-no.
That's ugly, even for Intel...
 
Did anybody actually look at the side notes?
Did Intel just compare an Intel system running 4267 MHz memory against an AMD system running 3200 MHz?

Ohhhh, that's a big no-no.
That's ugly, even for Intel...

Maybe everyone already knows that LPDDR4 and DDR4 share neither the same lineage nor performance characteristics, and therefore comparing them directly on MT/s is to compare apples to oranges?

The dual-channel LPDDR4 is within spec for the Intel system and the dual-channel DDR4 also within spec for the AMD system. Are you about to tell me that Ice Lake running LPDDR4-3733 is also an unfair advantage for Intel?
 
Maybe everyone already knows that LPDDR4 and DDR4 share neither the same lineage nor performance characteristics, and therefore comparing them directly on MT/s is to compare apples to oranges?

The dual-channel LPDDR4 is within spec for the Intel system and the dual-channel DDR4 also within spec for the AMD system.
I do agree that they differ. But then, why not share the bandwidth and timings?
It seems to me they are not comparing CPUs, but actually Intel's new CPU against some Lenovo that happens to use an AMD Ryzen mobile CPU in it.
 
I do agree that they differ. But then, why not share the bandwidth and timings?
It seems to me they are not comparing CPUs, but actually Intel's new CPU against some Lenovo that happens to use an AMD Ryzen mobile CPU in it.

That "some Lenovo" is as typical and full-fledged a Renoir configuration you're going to find, and knowing how past 14nm and Ice Lake ultraportables have turned out, Intel's own TGL demonstrator is probably pretty close to what a TGL ultrabook will eventually be configurated as. What do you propose, Intel devote their time to create a special performance-enhanced development platform laptop for their competitor's product? This isn't a desktop; you can only use existing Renoir products to represent Renoir performance, and as far as I can see the laptop they chose isn't hamstrung on a hardware front.

Firmware-wise, we don't know. Lenovo does some strange things with power limits on Intel laptops by handicapping them to strictly their TDP. With Renoir, AMD has given OEMs a lot of flexibility when it comes to the performance parameters that they can choose.
 
That "some Lenovo" is as typical and full-fledged a Renoir configuration you're going to find, and knowing how past 14nm and Ice Lake ultraportables have turned out, Intel's own TGL demonstrator is probably pretty close to what a TGL ultrabook will eventually be configurated as. What do you propose, Intel devote their time to create a special performance-enhanced development platform laptop for their competitor's product? This isn't a desktop; you can only use existing Renoir products to represent Renoir performance, and as far as I can see the laptop they chose isn't hamstrung on a hardware front.
Again, I do agree with you: it is not the same thing as comparing desktop CPUs, where you can use the exact same memory and GPU, so you only look at the CPU results.
But about creating a specially enhanced system, to be clear, I'm not saying they did, but it would not be the first time. Not so long ago they presented a live demonstration of a 28-core Xeon running a very, and I mean very, special cooling unit to maintain a very high overclock, which they never mentioned during the presentation; it only came out days later, after people saw a weird photo of a pipe just chilling away from the system.
If it was apples to apples, then show us the full memory specs. They have a dedicated page for all the systems they used, so why not put the memory specs there too?
 
"
Mobile U-Series Gaming Play GTAV with up to 1.29x more FPS with a 11th Gen Intel® Core™ i7-1185G7 vs. AMD Ryzen 7 4800U Intel Core i7-1185G7 Processor Intel® Core™ i7-1185G7 processor (TGL-U) PL1=28W, 4C8T, Memory: 2x16GB LPDDR4-4267Mhz, Storage: Samsung MZVLB512HBJQ-0000, Display Resolution: 1920 x 1080, OS: Microsoft Windows* 10 Pro 10.0.19041.388*, Graphics: Intel® Xe Graphics, Graphics driver: 27.20.100.8431*, Bios version: T14C4II6.92A measured on pre-production Intel reference system *Graphics Driver: Gfx-driver-ci-master-5580 for Doom Eternal *OS: Windows 10 Pro 10.0.19041.450 for Battlefield V Processor: AMD Ryzen 7 4800U processor, 8C16T, Memory: 2x8GB DDR4-3200MHz, Storage: Samsung MZVLB512HBJQ-000L2, Display Resolution: 1920 x 1080, OS: Microsoft Windows* 10 Pro 10.0.19041.388 *, Graphics: AMD Radeon(TM) Graphics, Graphics driver: 27.20.2001.9003*, Bios version: E7CN25WW measured on OEM system based on highest available performance profile. * Graphics Driver: 27.20.2001.13001 for Battlefield V, Gears Tactics, and GRID 2019. * OS: Windows 10 Pro 10.0.19041.450 for Battlefield V and Gears Tactics. As measured by GTAV (1080p Medium) August 20th,





So: a 28 W, 4C8T part with 4267 MHz RAM vs. a 15 W, 8C16T part with 3200 MHz RAM.


Congrats on a shitty comparison, Intel; so in all reality, Xe is about 25% slower than AMD's APU hardware.
 
If it was apples to apples, then show us the full memory specs. They have a dedicated page for all the systems they used, so why not put the memory specs there too?

How exactly would that tell you anything that you don't already know now? They already have the memory specs up on that page. LPDDR4 clocks and timings aren't static like DDR4; they constantly fluctuate as the laptop is used, and again, neither of those is directly comparable between LPDDR4 and DDR4. If it's an AIDA benchmark run you want Intel to do (which by itself can be laughably bad at representing real-world bandwidth), it would neither further their presentation nor our understanding of the product in any way.

Renoir also purportedly supports LPDDR4-4266; the problem is that most Renoir chips end up in thicker laptops with SO-DIMM slots, so no OEMs bother to put in the work to design an LPDDR4-based Renoir laptop because the other 10 SKUs all use soldered or socketed DDR4. You can't test what doesn't exist.

And even had there been an LPDDR4 Renoir ultrabook, I highly doubt the Vega iGPU would see any gains compared to dual-channel DDR4-3200. Same goes for Tiger Lake; there's no reason why LPDDR4-4266 magically makes it faster. LPDDR4 is for power efficiency, not performance. Renoir already achieves fantastic battery life improvements with DDR4.

IIRC, LPDDR4 literally has half the bus width of DDR4.

The point of concern you're looking for here is the continued use of high turbo frequencies that draw power way beyond TDP and PL2. PL1 and PL2 are in the hands of laptop manufacturers, and there's often nothing that can be done if the OEM decides to set PL2 at strictly TDP, which absolutely hobbles your long-term turbo performance. That's something we'll only know from laptop reviewers once the products hit the market.
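
To put rough numbers on that, here's a toy model (a sketch only, not Intel's actual turbo/RAPL behavior; all figures are made-up assumptions) of how the PL1/PL2 values an OEM picks shape average package power over short and long workloads:

```python
# Toy model, NOT Intel's actual RAPL/turbo algorithm: it only illustrates how
# the PL1/PL2 values an OEM picks shape average package power over a workload.
# All numbers below are illustrative assumptions, not measured figures.

def average_power(pl1_w: float, pl2_w: float, tau_s: float, workload_s: float) -> float:
    """Average power for a workload that turbos at PL2 for roughly tau_s
    seconds, then settles to the PL1 sustained limit."""
    burst = min(tau_s, workload_s)
    sustained = max(workload_s - tau_s, 0.0)
    return (pl2_w * burst + pl1_w * sustained) / workload_s

# A short benchmark run mostly sees the PL2 burst...
print(average_power(pl1_w=28, pl2_w=50, tau_s=28, workload_s=30))    # ~48.5 W
# ...while a long render is dominated by PL1.
print(average_power(pl1_w=28, pl2_w=50, tau_s=28, workload_s=600))   # ~29.0 W
# An OEM that pins both limits to a strict TDP removes the burst entirely.
print(average_power(pl1_w=15, pl2_w=15, tau_s=28, workload_s=600))   # 15.0 W
```

The point being: a short benchmark mostly reflects the PL2 burst, while anything sustained collapses toward PL1, so the same chip can look very different depending on the limits an OEM ships.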
 
How exactly would that tell you anything that you don't already know now? They already have the memory specs up on that page. LPDDR4 clocks and timings aren't static like DDR4; they constantly fluctuate as the laptop is used, and again, neither of those is directly comparable between LPDDR4 and DDR4. If it's an AIDA benchmark run you want Intel to do (which by itself can be laughably bad at representing real-world bandwidth), it would neither further their presentation nor our understanding of the product in any way.

Renoir also purportedly supports LPDDR4-4266; the problem is that most Renoir chips end up in thicker laptops with SO-DIMM slots, so no OEMs bother to put in the work to design an LPDDR4-based Renoir laptop because the other 10 SKUs all use soldered or socketed DDR4. You can't test what doesn't exist.

And even had there been an LPDDR4 Renoir ultrabook, I highly doubt the Vega iGPU would see any gains compared to dual-channel DDR4-3200. Same goes for Tiger Lake; there's no reason why LPDDR4-4266 magically makes it faster. LPDDR4 is for power efficiency, not performance. Renoir already achieves fantastic battery life improvements with DDR4.

IIRC, LPDDR4 literally has half the bus width of DDR4.

The point of concern you're looking for here is the continued use of high turbo frequencies that draw power way beyond TDP and PL2. PL1 and PL2 are in the hands of laptop manufacturers, and there's often nothing that can be done if the OEM decides to set PL2 at strictly TDP, which absolutely hobbles your long-term turbo performance. That's something we'll only know from laptop reviewers once the products hit the market.
A 28 W Intel part vs. a 15 W AMD part, so by default that already makes the Intel part about 25% shittier; then add in that the Intel part has half the cores, and that their testing uses plugged-in power profiles with, as you correctly note, unknown cooling for the Intel part.
 
How exactly would that tell you anything that you don't already know now? They already have the memory specs up on that page. LPDDR4 clocks and timings aren't static like DDR4; they constantly fluctuate as the laptop is used, and again, neither of those is directly comparable between LPDDR4 and DDR4. If it's an AIDA benchmark run you want Intel to do (which by itself can be laughably bad at representing real-world bandwidth), it would neither further their presentation nor our understanding of the product in any way.

Renoir also purportedly supports LPDDR4-4266; the problem is that most Renoir chips end up in thicker laptops with SO-DIMM slots, so no OEMs bother to put in the work to design an LPDDR4-based Renoir laptop because the other 10 SKUs all use soldered or socketed DDR4. You can't test what doesn't exist.

And even had there been an LPDDR4 Renoir ultrabook, I highly doubt the Vega iGPU would see any gains compared to dual-channel DDR4-3200. Same goes for Tiger Lake; there's no reason why LPDDR4-4266 magically makes it faster. LPDDR4 is for power efficiency, not performance. Renoir already achieves fantastic battery life improvements with DDR4.

IIRC, LPDDR4 literally has half the bus width of DDR4.

The point of concern you're looking for here is the continued use of high turbo frequencies that draw power way beyond TDP and PL2. PL1 and PL2 are in the hands of laptop manufacturers, and there's often nothing that can be done if the OEM decides to set PL2 at strictly TDP, which absolutely hobbles your long-term turbo performance. That's something we'll only know from laptop reviewers once the products hit the market.

It's not like a smartphone... LPDDR4X-4266 in dual channel effectively works out to about 68 GB/s of bandwidth (vs. about 51 GB/s on DDR4-3200), and this obviously has an effect on iGPU performance. I won't even comment on the 50 W TDP...
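
For reference, those peaks come from a simple back-of-the-envelope calculation (the sketch below assumes a 128-bit total bus on both sides, which is the typical dual-channel laptop configuration; real-world throughput is lower):

```python
# Back-of-the-envelope theoretical peak memory bandwidth.
# Assumes a 128-bit (16-byte) total bus, typical for dual-channel DDR4 and
# 4x32-bit LPDDR4X laptop configurations; actual measured bandwidth is lower.

def peak_bandwidth_gb_s(transfers_mt_s: float, bus_bits: int = 128) -> float:
    """Theoretical peak in GB/s: transfers per second times bus width in bytes."""
    return transfers_mt_s * 1e6 * (bus_bits // 8) / 1e9

print(peak_bandwidth_gb_s(4266))  # LPDDR4X-4266 -> ~68.3 GB/s
print(peak_bandwidth_gb_s(3200))  # DDR4-3200    -> ~51.2 GB/s
```

Under that assumption, the roughly 17 GB/s gap comes entirely from the transfer rate, since the effective bus width is the same on both sides.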
 
If they burst to 50 W often and still guarantee x hours of battery life, those laptops will get expensive.
 
[Attached screenshot: crap50w.JPG]

50 W at boost versus a CPU that maxes out at 25 W... hmmmm, back to its old PR tricks.
 
Well, Intel claims benchmarks shouldn't matter, yet apparently these benchmarks matter to them...
 
Did anybody actually look at the side notes?
Did Intel just compare an Intel system running 4267 MHz memory against an AMD system running 3200 MHz?

Ohhhh, that's a big no-no.
That's ugly, even for Intel...

That's actually mild considering it's from the same guys that hid a 1 kW chiller under the table. :roll:
 
*yawns in R7 4800H
 
Renoir also purportedly supports LPDDR4-4266; the problem is that most Renoir chips end up in thicker laptops with SO-DIMM slots, so no OEMs bother to put in the work to design an LPDDR4-based Renoir laptop because the other 10 SKUs all use soldered or socketed DDR4. You can't test what doesn't exist.
Well, if you bother putting in a mild effort you can find it; alas, Intel's being Intel :shadedshu:
And I wouldn't be surprised in the least if OEMs were paid to avoid putting LPDDR4X on top-end Ryzen, especially given that it's relatively cheap, being the bog standard on all mobiles these days, and that ICL only supports up to 3733 MHz!
Lenovo Yoga Slim 7 Specifications
CPU: AMD Ryzen 7 4800U
Graphics: AMD Radeon Graphics
RAM: 16GB DDR4 4266 MHz
SSD: 512GB SK Hynix SSD
Display: 14-inch, 1920 x 1080, IPS
Networking: Intel Wi-Fi 6 AX201 (2x2), Bluetooth 5
Ports: USB 3.2 Type-C, USB-C PD, 2x USB 3.2 Gen 1 Type-A, microSD card reader, HDMI, 3.5 mm headphone jack
Camera: 720p, IR
Battery: 60.7 WHr
Power Adapter: 65 W
Operating System: Windows 10 Home
Dimensions (WxDxH): 12.6 x 8.2 x 0.6 inches / 320.6 x 208 x 14.9 mm
Weight: 3.8 pounds / 1.4 kg
Price (as configured): €999.00, not available as configured in the United States
 
I hate Intel's built-in CPU graphics. It just interferes with my discrete graphics and causes problems. That's why I opted to buy the 9900KF instead of the stupid 9900K.
 
I hate Intel's built-in CPU graphics. It just interferes with my discrete graphics and causes problems. That's why I opted to buy the 9900KF instead of the stupid 9900K.
So how is that working out for you in a laptop?
 
I'd rather wait for actual results before drawing a conclusion on Tiger Lake. I feel Intel hyped up TGL quite a fair bit, but the announcement doesn't bring anything exciting to the table. I certainly will not take their benchmark results as gospel considering their shady benchmark tactics. It is certainly not unexpected that they would gimp the AMD system to give themselves an advantage. The memory speed is certainly in Intel's favor when they claim a 60+% improvement. Graphics are highly sensitive to memory speed.

I still stand by my opinion that this SuperFin is not as great as they claim it to be. I feel it just allows the existing 10 nm to take in more power (on top of some efficiency gains), and thus allows a higher clock speed. Basically, they are just pushing 10 nm to run at higher clock speeds, like 14 nm. At least it is good to see a healthy IPC improvement from Intel, and not just another clock speed bump. Any increase in power requirements will show up in the battery tests.

I hate Intel's built-in CPU graphics. It just interferes with my discrete graphics and causes problems. That's why I opted to buy the 9900KF instead of the stupid 9900K.
I don't know if it is a driver problem though.
 
15 W vs 28 W?

Is it game over?
 
I stopped reading after the "world's best processor" slide... :roll:

Alright, maybe I'll read it more thoroughly... I have a bit of time to waste and a good laugh won't hurt...
 
12-28 W and different memory speeds. So in reality it's not just that it probably isn't as fast as AMD's four-year-old Vega iGPU; it's probably worse when equalized for power or memory speed.

Nice try. Intel manages to make every single chart they show worthless and deceiving, without fail. That's quite the feat; I know every company tries to do the same, but none has been as consistent as Intel. Whenever you see one of their benchmarks, you might as well look at a brick wall instead; you'll find out just as much.
 
12-28 W
Nice try. Intel manages to make every single chart they show worthless and deceiving. That's quite the feat; I know every company tries to do the same, but none has been as consistent as Intel.
That is probably the least deceiving part of it. The 4800U that they are comparing the 1185G7 to is as much a "15W" CPU as the 1185G7 is.
 
The 4800U that they are comparing the 1185G7 to is as much a "15W" CPU as the 1185G7 is.

We both know which one of the two is going to have the bigger discrepancy in terms of power.
 