
AMD's Upcoming Mobile Rembrandt APU Makes an Early Appearance

Joined
Jan 14, 2019
Messages
9,872 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
I don't think gaming laptops are going to get any huge changes, as they're mostly limited by cooling and power delivery, which depend entirely on performance/Watt - and that hasn't really moved much in the last 3 years. We're still mostly seeing the same TSMC 7 nm and Intel 10 nm (rebranded as Intel 7). Samsung 8 nm for Ampere doesn't seem to bring much to the table over the TSMC 12 nm of Turing's dGPUs; it's arguably worse at the low end when comparing the performance/Watt of the 3050 against the 1650 Ti etc. - possibly because the 3050 is bogged down by additional raytracing die area that the 1650/1660 series don't have to bother with. I've yet to see the 3050 convincingly raytrace anything at playable framerates, so IMO that's a step backwards anyway!
Thank you! :toast:

Some people consider it sacrilege to say anything bad about Ampere as it is Nvidia's most advanced architecture, and 8 nm Samsung is just sooo goood... but hey, people, let's look at the facts! The desktop 3060 eats about the same power as a 2070 and performs at the same level - not to mention that the 2070 isn't much more than a 1080 with raytracing and DLSS support. Ergo, performance/Watt hasn't changed since Pascal. All the higher-tier chips throw efficiency out of the window. Ergo, Ampere isn't as advanced as it's advertised to be. It's a slightly reworked Turing built by Samsung instead of TSMC, nothing more.

Chips with such high power and cooling requirements should have no presence in a laptop in my opinion.
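To put rough numbers on the 3060 = 2070 = 1080 claim, here's a quick back-of-the-envelope sketch in Python. The board power figures are Nvidia's official TBPs; the equal relative performance is the claim above, not benchmark data:

# Sanity check: near-identical performance at near-identical board power
# across three process nodes means flat performance/Watt.
cards = {
    "GTX 1080 (Pascal, TSMC 16nm)":   {"rel_perf": 1.00, "tbp_w": 180},
    "RTX 2070 (Turing, TSMC 12nm)":   {"rel_perf": 1.00, "tbp_w": 175},
    "RTX 3060 (Ampere, Samsung 8nm)": {"rel_perf": 1.00, "tbp_w": 170},
}

for name, c in cards.items():
    ppw = c["rel_perf"] / c["tbp_w"] * 100  # performance points per 100 W
    print(f"{name}: {ppw:.3f} perf per 100 W")

The three land within about 6% of each other - three generations, essentially one perf/Watt figure.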
 
Joined
Feb 20, 2019
Messages
7,305 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though...
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Thank you! :toast:

Some people consider it sacrilege to say anything bad about Ampere as it is Nvidia's most advanced architecture, and 8 nm Samsung is just sooo goood... but hey, people, let's look at the facts! The desktop 3060 eats about the same power as a 2070 and performs at the same level - not to mention that the 2070 isn't much more than a 1080 with raytracing and DLSS support. Ergo, performance/Watt hasn't changed since Pascal. All the higher-tier chips throw efficiency out of the window. Ergo, Ampere isn't as advanced as it's advertised to be. It's a slightly reworked Turing built by Samsung instead of TSMC, nothing more.

Chips with such high power and cooling requirements should have no presence in a laptop in my opinion.
I'm genuinely curious to see what Nvidia does with the MX500-series; raytracing performance at under 100 W is barely viable on Ampere even with DLSS as a crutch - the RT hardware adds die area and cost, and spreads the logic out on the die, lowering achievable clocks at any given operating voltage. This is NOT good for 25-50 W laptop GPUs.

I read (here, or linked from here at least) about the MX500-series being an evolution of Turing's 16-series architecture, refined for efficiency and using the slightly more efficient Samsung 8 nm process. Again, we don't have a like-for-like comparison of TSMC 12FF vs Samsung 8 nm, but I don't believe there's much efficiency to be gained from the move. Either way, any improvement is better than none, and presumably Samsung 8 nm is less constrained than anything from TSMC right now, so it should help availability a bit too.

An MX-series chip at, say, 28 W isn't going to get anyone stoked, but at the same time it may match a 1650 Max-Q at 40 W, and realistically that GPU can do everything any current game demands of it. Sure, you're going to be turning down settings just to get a stable 60 fps, but it's less about how good a 28 W GPU looks and more about what you can actually do at all. If the 1650 is the lowest viable dGPU for current gaming, then getting that at 28 W instead of 40 W is a huge win for anyone trying to run a game on battery, or trying not to cook their testicles on the sofa. 1080p60 at low-medium settings really isn't too bad, especially when you're only seeing it on a 14" or 15" screen instead of a 27" or 32" gaming monitor.
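The maths on that hypothetical is simple enough - a minimal sketch, where "same performance at 28 W" is the assumption above, not a measurement:

# What matching a 40 W 1650 Max-Q at 28 W would mean, assuming
# identical performance at both power levels (the hypothetical above).
p_1650_maxq_w = 40
p_mx_w = 28

power_saved = 1 - p_mx_w / p_1650_maxq_w      # fraction of power saved
perf_per_watt_gain = p_1650_maxq_w / p_mx_w   # relative perf/W improvement

print(f"Power saved: {power_saved:.0%}")           # 30%
print(f"Perf/W gain: {perf_per_watt_gain:.2f}x")   # ~1.43x

A 30% cut in heat for the same frames is exactly the kind of win that matters at this end of the market.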

I've said it a hundred times or more: pretty graphics are nice, but they don't change the gameplay or game design. As long as the framerate is decent and the graphics settings required to meet that framerate don't look too compromised, the experience is largely the same, just without needing to spend $2000+ on something that's hot, noisy, or both.
 
Joined
Jan 14, 2019
Messages
9,872 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Again, we don't have a like-for-like comparison of TSMC 12FF vs Samsung 8 nm, but I don't believe there's much efficiency to be gained from the move.
We actually do. ;) 3060 = 2070 = 1080 in both performance and power consumption. Sure, they're different designs, but does that actually matter? The former two add raytracing and DLSS, and the 3060 has 4 GB extra VRAM - essentially 6 memory chips instead of 8 on a narrower bus, which should theoretically need less power, but anyway...
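For anyone wondering how the chip count works out, here's a quick sketch; the ~2 W per GDDR6 device is a rough assumption for illustration, not a datasheet figure:

# Fewer, denser memory chips on a narrower bus: each GDDR6 device
# has a 32-bit interface, so chip count fixes the bus width.
PER_CHIP_W = 2.0  # assumed average power per GDDR6 device - illustrative only

configs = {
    "RTX 2070": {"chips": 8, "gb_per_chip": 1},
    "RTX 3060": {"chips": 6, "gb_per_chip": 2},
}

for name, c in configs.items():
    total_gb = c["chips"] * c["gb_per_chip"]
    bus_bits = c["chips"] * 32
    est_w = c["chips"] * PER_CHIP_W
    print(f"{name}: {total_gb} GB over a {bus_bits}-bit bus, ~{est_w:.0f} W (assumed)")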

I'm genuinely curious to see what Nvidia does with the MX500-series; raytracing performance at under 100 W is barely viable on Ampere even with DLSS as a crutch - the RT hardware adds die area and cost, and spreads the logic out on the die, lowering achievable clocks at any given operating voltage. This is NOT good for 25-50 W laptop GPUs.

...

An MX-series chip at, say, 28 W isn't going to get anyone stoked, but at the same time it may match a 1650 Max-Q at 40 W, and realistically that GPU can do everything any current game demands of it. Sure, you're going to be turning down settings just to get a stable 60 fps, but it's less about how good a 28 W GPU looks and more about what you can actually do at all. If the 1650 is the lowest viable dGPU for current gaming, then getting that at 28 W instead of 40 W is a huge win for anyone trying to run a game on battery, or trying not to cook their testicles on the sofa. 1080p60 at low-medium settings really isn't too bad, especially when you're only seeing it on a 14" or 15" screen instead of a 27" or 32" gaming monitor.

I've said it a hundred times or more: pretty graphics are nice, but they don't change the gameplay or game design. As long as the framerate is decent and the graphics settings required to meet that framerate don't look too compromised, the experience is largely the same, just without needing to spend $2000+ on something that's hot, noisy, or both.
I agree, especially since modern games don't offer much extra on "ultra" graphics settings compared to "high" or "medium". I remember when "low" was unplayably ugly, but those days are over.

I also agree that the 1650 is about the lowest viable gaming dGPU nowadays (though the 1050 and the 960 4 GB might still have a few last words), but that's about as high as I would go in a laptop in terms of power consumption. When we're talking about 5+ kg and a requirement to be constantly plugged in, we're not really talking about a laptop anymore. You might as well build a small form factor gaming PC at that level. A laptop should be all about mobility. Heavy and power-hungry components defeat its purpose.

On the other hand, big chips offer some level of configurability. When I drag the power slider down to 71% (125 W) on my 2070, I only lose about 5-7% in performance. So who knows, Nvidia might be onto something. Personally, I value APUs much more highly than dGPUs on the mobile front, but we'll see.
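Putting numbers on that power-slider observation, with the 5-7% loss taken at its midpoint and the 2070's official 175 W board power:

# Perf/W gain from power-limiting an RTX 2070, using the figures above.
full_power_w = 175      # stock board power
limited_power_w = 125   # slider at ~71%
perf_retained = 0.94    # midpoint of the quoted 5-7% performance loss

ppw_gain = perf_retained * full_power_w / limited_power_w - 1
print(f"Perf/W improvement at 125 W: ~{ppw_gain:.0%}")  # ~32%

A roughly 30% perf/Watt gain from a slider is a good hint of why laptop variants of big desktop chips can still work at all.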
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
24 isn't much actually, though I hope it doesn't swell to 48 by the end of the year :slap:


Mobile APUs :wtf:
Yeah... all those gamers left without GPUs are most certainly going to be looking at mobile APUs with more graphical grunt and support for new stuff. The laptop market is the next thing to see higher demand - it already is, really. Except laptops shall most likely remain shit and overpriced for gaming.

But also consider, if it's possible, the option of using mobile APUs in non-mobile systems / SFF / AIO, and maybe even the DIY channel.
 
Joined
Feb 20, 2019
Messages
7,305 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though...
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I read (here, or linked from here at least) about the MX500-series being an evolution of Turing's 16-series architecture, refined for efficiency and using the slightly more efficient Samsung 8 nm process.
I went and looked it up - and it's bad news: there will be no TU117 sans RT hardware on Samsung 8 nm.

The MX550 is just a tweak of the useless and inefficient MX450.
The MX570 is yet another bottom-of-the-barrel die harvest of the worst, most defective 3050 Ti cores (GA107).

Neither of those is worth talking about. Rembrandt will likely outperform the MX450 without needing a dGPU and its associated cooling/power budget, making the MX550 a write-off. The MX570 is Ampere, which we already know from the 3050 and 3050 Ti scales down pretty poorly in terms of power efficiency. If the 3050 Ti isn't worth talking about from a performance/Watt perspective, then something that performs far worse in a similar power envelope isn't going to get anyone excited :(
 