
AMD Mantle allows adding VRAM on multi-GPU configs.

Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Seems to be that with 2 GPUs and 1 monitor you would have half of the screen assigned to each GPU.
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
Not Crossfire. This memory addressing Mantle feature is only relevant where there are multiple cards installed.

shit yea, it is the PS4 that has 2 GPUs, not the Bone...
 
Joined
Jun 13, 2012
Messages
1,326 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
That's pretty bad paraphrasing by that German site.

You have to put it into context. He was referring to the release of the next-gen parts, so no, there was no DX12 when they released.

If you think MS develops DX versions without involving ALL the graphics makers from the beginning, you are crazy.

So at the time, there were no plans for DX12 that AMD was aware of.

Pretty much all the stories covering it back then said he claimed there was going to be no new DirectX, period.

example: http://www.hardwarecanucks.com/news/amd-roy-taylor-directx12/
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
I thought it had the same GPU but x2?
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
I think the biggest difference between them is that the Xbone uses DDR3 and the PS4 uses GDDR5 and also has a higher core clock, so by every right it should def have more power.
Not really sure what Microsoft was thinking with DDR3 when they could have had much faster RAM.
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
The PS4 can do 900p natively while the Xbone has to upscale to HD. The PS4 in general is just slightly more beefy. They both feature the same 8 Jaguar cores; however, the GPU in the PS4 has access to 8 GB of GDDR5. In comparison, the Xbone uses some weird type of cache RAM to keep up.
The PS4 also has more cores in its GPU and at a higher clock: 1152 cores vs. 768.

I don't own either, so I'm not backing either one, just providing specs.

If anything, DX12/Mantle are both there to solve the problem of sucky processors in both consoles and PCs. The thing holding the consoles back is the CPU. The GPU in both consoles is actually perfectly serviceable; it's amazing what you can squeeze out of those tiny GPUs when that's all you're coding for. The bottleneck in every game on all the consoles has been the CPU, because half the cores are taken up by the console OS and other nonsense.

The difference is developers know DX and none of them know Mantle, which brings me to my first post: Mantle only gets adopted if developers are given a very, very good incentive to do so.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
shit yea, it is the PS4 that has 2 GPUs, not the Bone...
Negative. Both have a single GCN GPU.

I don't think so.. both Xbone and PS4 have an 8-core Jaguar APU..
PS4 has more compute units than XB1, but neither uses Crossfire technology.

I think the biggest difference in them is that the xbone uses ddr3 and the ps4 uses ddr5 and also has a higher core clock so by every right it should def have more power.
not really sure what microsoft was thinking with ddr3 when they could of had much faster ram.
Side by side comparison here:
http://en.wikipedia.org/wiki/Jaguar_(microarchitecture)#Consoles

GDDR is substantially more expensive than DDR. Microsoft compensated for DDR3's comparatively slower speed by using 32 MiB of embedded static RAM (109 GB/s) dedicated to the GPU.
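
As a rough sanity check on that trade-off, here is the back-of-the-envelope peak-bandwidth math (a sketch only; the 256-bit buses and memory transfer rates are the commonly cited launch configurations, assumed here rather than taken from this thread):

```python
# Rough peak-bandwidth math behind the DDR3 vs. GDDR5 comparison.
# Assumed configs (not from this thread): XB1 = DDR3-2133 on a 256-bit bus,
# PS4 = GDDR5 at 5500 MT/s effective on a 256-bit bus.

def peak_bandwidth_gbs(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s: transfers per second * bytes per transfer."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

xb1_ddr3  = peak_bandwidth_gbs(2133, 256)   # ~68.3 GB/s
ps4_gddr5 = peak_bandwidth_gbs(5500, 256)   # ~176 GB/s
xb1_esram = 109.0                           # GB/s, the figure quoted above

print(f"XB1 DDR3:  {xb1_ddr3:6.1f} GB/s")
print(f"PS4 GDDR5: {ps4_gddr5:6.1f} GB/s")
print(f"XB1 eSRAM: {xb1_esram:6.1f} GB/s (but only 32 MiB of it)")
```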


I think it's pretty safe to assume that Mantle was a direct product of working with Sony. Developers likely wanted closer-to-the-metal access and Mantle was the answer.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
That is a huge difference in GCN cores.. I guess Microsoft just rides on sub-par hardware for higher profit margins. They have more game choices too, but all the best games you can get on the PS4 as well, and the controller is better.
It's not really all about the most powerful either, tho.. the Nintendo can't come close to besting either one, but the titles are unique and it shows in game sales.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Microsoft put their console costs into the motion-sensing bar instead of the GPU. Remember, XB1 was supposed to be a home theater system, not just a gaming console.

Nintendo, seeing the failure of the Wii U, is speculated to be developing a console that bests the PS4 and XB1 in hardware.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
And now they sell it without the sensor to compete with Sony's prices, haha.

Yeah, when I first saw the Wii U at a friend's house I had no idea it existed.. he was saying I was not the only one, and they didn't have the best marketing.

It seemed like a lot of fun for small kids, so I sent one to my son for Christmas.. when I talk to him on the phone he always tells me about some new game his mom got him.

edit - I recently saw Nintendo was making another deal with AMD, so hopefully it's like you say and it further progresses APU tech.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
If anything, DX12/Mantle are both there to solve the problem of sucky processors in both consoles and PCs.
Don't you find it odd that people are taking the position that Mantle will become a dominant API based almost entirely on the fact that AMD's APUs power the consoles... yet the Xbone actually uses DirectX (11.X), and the PlayStation 4 uses Sony's proprietary GNM and GNMX APIs?
The difference is developers know DX and none of them know Mantle, which brings me to my first post: Mantle only gets adopted if developers are given a very, very good incentive to do so.
I think it's a given that AMD will need to keep putting its hand in its pocket to keep the game devs coding for Mantle. There is no way in hell that game studios lock out ~80% of the PC gaming user base (and Microsoft's console business) by marginalizing DX/OGL in favor of Mantle.
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
There is no way in hell that game studios lock out ~80% of the PC gaming user base (and Microsoft's console business)

But but but Mantle is totally going to be an open-source SDK as of 2 months ago, man. They're not locking out anybody. They're just not giving them any SDK, information, or YouTube tutorials to help them adopt it.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Haha, they really should get moving on it..

We all know DX is going to remain top for PC gaming even if gamers and developers have hate for the OS.

I'm really more interested in the hardware.. if the next XB and PS can do 4K over 30 fps that will be a big win for APU tech.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
But but but Mantle is totally going to be an open-source SDK as of 2 months ago, man. They're not locking out anybody. They're just not giving them any SDK, information, or YouTube tutorials to help them adopt it.
Yeah, AMD's pretend timetable aside, can you envisage a scenario where Microsoft says "Y'know, we should just do away with our own software stack and hitch our wagon to AMD's star"?
Microsoft developed DirectX as a concerted effort to control PC gaming at the expense of OpenGL and a raft of fading vendor-specific APIs (Matrox Simple Interface, SGL, S3d, Glide, C Interface, RRedline, etc.). Somehow I don't see MS handing over application control to the same vendor APIs it fought so hard to marginalize... and that's assuming you don't dismiss the idea out of hand because Microsoft demands total control over pretty much everything it uses. I'm also wondering, if people are expecting DirectX to fold, what exactly are they expecting to happen to the rest of D3D?
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
But but but Mantle is totally going to be an open-source SDK as of 2 months ago, man. They're not locking out anybody. They're just not giving them any SDK, information, or YouTube tutorials to help them adopt it.
Pretty sure developers interested in implementing Mantle can get in contact with people at AMD directly to make it happen. Case in point: the Frostbite 3 engine has Mantle available despite the SDK still being private.

I'm really more interested in the hardware.. if the next XB and PS can do 4K over 30 fps that will be a big win for APU tech.
Fat chance. In most games, the XB1 can't even handle 1080p. I was doing that on Windows a decade ago. 4K requires an exponential growth in processing power compared to 1080p.
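
Just the pixel arithmetic, as a quick sketch (standard resolution figures, not taken from this thread):

```python
# Pixel counts per frame at common resolutions, relative to 1080p.
resolutions = {"720p": (1280, 720), "900p": (1600, 900),
               "1080p": (1920, 1080), "4K UHD": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:7s}: {pixels:>9,} px/frame  ({pixels / base:.2f}x 1080p)")
# 4K UHD works out to exactly 4x the pixels of 1080p, before any extra
# per-pixel shading or memory cost is accounted for.
```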

Microsoft developed DirectX as a concerted effort to control PC gaming at the expense of OpenGL and a raft of fading vendor-specific APIs (Matrox Simple Interface, SGL, S3d, Glide, C Interface, RRedline, etc.).
The latter is true, not the former. Making games was hell two decades ago because every developer had to design their game to work with many APIs just to enable people to play it. Microsoft stepped in between them with DirectX, giving hardware and software manufacturers a standard to code for (network, audio, controllers, and video). This is fundamentally what allowed PC gaming to flourish.
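
A toy sketch of that idea, with purely illustrative names (not any real DirectX interface): the game targets one standard interface and each vendor plugs in its own implementation underneath.

```python
# Conceptual sketch of what a common standard buys developers: the game is
# written once against one interface, and each vendor supplies its own backend.
from abc import ABC, abstractmethod

class GraphicsAPI(ABC):
    @abstractmethod
    def draw_triangles(self, vertices: list) -> None: ...

class VendorADriver(GraphicsAPI):
    def draw_triangles(self, vertices):
        print(f"Vendor A submits {len(vertices)} vertices")

class VendorBDriver(GraphicsAPI):
    def draw_triangles(self, vertices):
        print(f"Vendor B submits {len(vertices)} vertices")

def render_frame(api: GraphicsAPI):
    # Game code targets the standard, not Glide/S3d/SGL/etc. individually.
    api.draw_triangles([(0, 0), (1, 0), (0, 1)])

render_frame(VendorADriver())
render_frame(VendorBDriver())
```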

OpenGL, riding on the back of DirectX hardware requirements, has been able to flourish in all flavors of *nix (Linux and Mac included).

Somehow I don't see MS handing over application control to the same vendor APIs it fought so hard to marginalize... and that's assuming you don't dismiss the idea out of hand because Microsoft demands total control over pretty much everything it uses. I'm also wondering, if people are expecting DirectX to fold, what exactly are they expecting to happen to the rest of D3D?
Microsoft's DirectX team sits between Intel, AMD, NVIDIA, and game developers. They listen to the needs of all parties and try to work out a compromise that's feasible for hardware manufacturers and easy to use for developers. Mantle is really the culmination of the changes DirectX (especially 10) made over the last decade, moving away from specialized hardware to general floating-point processors.

If Microsoft stopped supporting DirectX, Mantle would be exclusive to AMD instead of a standard (as part of Direct3D 12) supported by AMD, NVIDIA, and Intel. You may not like it, but Microsoft, through DirectX standards, prevents monopolization of the market. The only place where Microsoft has really "screwed the pooch" is with XInput. XInput isn't bad: it did succeed in standardizing a controller, but it's the Xbox controller rather than something vendor-neutral like the rest of DirectX. I really wish Microsoft would get sued over that, forcing it to let businesses like Sony code XInput drivers for their PlayStation controllers. Or just roll XInput back into DirectInput.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Believe in the 4K gaming APU and it will come :clap:
 
Joined
Jul 18, 2007
Messages
2,693 (0.44/day)
System Name panda
Processor 6700k
Motherboard sabertooth s
Cooling raystorm block<black ice stealth 240 rad<ek dcc 18w 140 xres
Memory 32gb ripjaw v
Video Card(s) 290x gamer<ntzx g10<antec 920
Storage 950 pro 250gb boot 850 evo pr0n
Display(s) QX2710LED@110hz lg 27ud68p
Case 540 Air
Audio Device(s) nope
Power Supply 750w superflower
Mouse g502
Keyboard shine 3 with grey, black and red caps
Software win 10
Benchmark Scores http://hwbot.org/user/marsey99/
Thanks guys, I will admit that I have never really looked into the console specs much. I just knew that one had done something to get more GPU power, and with them being APUs I assumed they had just thrown another chip in the mix, not had their own APU designed, you know.

I do recall it was AMD that helped Microsoft free up the hardware on the Xbone almost right after launch with what I had assumed was Mantle, and I have read a few things stating AMD had given it to Microsoft to use since it was going to be open source.

Now the real question is why it is still not out... the conspiracist in me is screaming MS are dragging their feet to stop glNext getting any ideas before DX12 is out.

This is going way OT here, but does anyone else recall reading that Sony had headhunted the Nvidia driver team to do the software for the PS4?

Only it was a prospect I found very interesting: Sony paying Nvidia staff to do Linux (?) software for AMD hardware... there must be some conflict of interest somewhere, surely?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Seems to be that with 2 GPUs and 1 monitor you would have half of the screen assigned to each GPU.
There is split frame rendering and alternate frame rendering. I'm sure you can figure out what each one does by its name. AFR has traditionally been faster but has had the micro-stutter problem, while SFR doesn't always scale or render properly.
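
As a purely illustrative sketch (not how any real driver divides up work), the difference between the two modes looks roughly like this:

```python
# Toy illustration of AFR vs. SFR work division between two GPUs.

def afr_assignment(frame_number: int, gpu_count: int = 2) -> int:
    """Alternate Frame Rendering: whole frames alternate between GPUs."""
    return frame_number % gpu_count

def sfr_assignment(screen_height: int, gpu_count: int = 2):
    """Split Frame Rendering: each GPU gets a horizontal slice of one frame."""
    slice_height = screen_height // gpu_count
    return [(gpu, gpu * slice_height, (gpu + 1) * slice_height)
            for gpu in range(gpu_count)]

for frame in range(4):
    print(f"AFR: frame {frame} -> GPU {afr_assignment(frame)}")
print("SFR: one 2160-line frame ->",
      [f"GPU {g}: rows {a}-{b - 1}" for g, a, b in sfr_assignment(2160)])
```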
Believe in the 4K gaming APU and it will come :clap:
Dreaming doesn't make it come true unless you're one of the engineers doing it. Doing games in 4K will be a bear, and I don't think we'll see it do well for quite some time.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Only it was a prospect I found very interesting: Sony paying Nvidia staff to do Linux (?) software for AMD hardware... there must be some conflict of interest somewhere, surely?
Linus Torvalds is pissed off at NVIDIA for failing to help develop the open platform AMD is actively supporting on Linux. NVIDIA is making a lot of enemies right now while AMD is making a lot of friends (including Sony, Nintendo, and Microsoft).
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Linus Torvalds is pissed off at NVIDIA for failing to help develop the open platform AMD is actively supporting on Linux. NVIDIA is making a lot of enemies right now while AMD is making a lot of friends (including Sony, Nintendo, and Microsoft).
Let's not forget that AMD is starting to produce Opterons with ARM cores as well, which is an interesting dynamic in the computing world. I don't want to be overly optimistic, but I'm hoping that AMD, after hiding behind the curtain for so long, will come out and surprise us with some new technology. It's long overdue.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
Idk, it may seem like dreaming about something far away, but look at what a single 290 can do at 4K with Uber settings. I bet if they made an APU with the graphics power of, like, a 280X, it would have potential to do 4K for consoles.
That in itself is a leap for APUs, but what about all the people buying 4K TVs.. don't they want a console that can play on it?
Shit, maybe they should slow down on the pixel density, haha. AMD and NV are struggling to keep up as it is and here comes 8K.
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
I'm hoping that AMD, after hiding behind the curtain for so long, will come out and surprise us with some new technology

I'm hoping they surprise us with Zen and their first 22nm GPU. I don't care about 28nm anymore, nor do I care for APUs or the Piledriver architecture anymore. I want good old-fashioned brute force on new nodes.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Idk, it may seem like dreaming about something far away, but look at what a single 290 can do at 4K with Uber settings. I bet if they made an APU with the graphics power of, like, a 280X, it would have potential to do 4K for consoles.
That in itself is a leap for APUs, but what about all the people buying 4K TVs.. don't they want a console that can play on it?
Shit, maybe they should slow down on the pixel density, haha. AMD and NV are struggling to keep up as it is and here comes 8K.
Consider the power envelope of a 290 and the power envelope of a CPU to drive such a GPU. Now imagine that combined usage in a single APU. We're talking TDPs in the 350-400 watt range, which is way outside the kind of power levels they're aiming for. You have to remember that an APU needs to be able to get rid of all of that heat like a normal CPU would, so you can't have that much heat being generated by a single device. As a result, you don't see 290-like performance on an APU.
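
Roughly the arithmetic behind that estimate, as a sketch; the TDP figures are typical ratings of the era, assumed rather than quoted from anywhere in this thread:

```python
# Back-of-the-envelope TDP sum for a hypothetical "290-class" APU.
r9_290_tdp_w    = 275   # typical board power rating for an R9 290 (assumed)
quad_core_cpu_w = 95    # typical desktop CPU TDP of the era (assumed)

combined = r9_290_tdp_w + quad_core_cpu_w
print(f"Combined: ~{combined} W in one socket")   # ~370 W
print("Console APUs of this generation target roughly 100-150 W for the whole box.")
```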

I think I would rather see two APUs on one motherboard, but I think we all know that we'll never see that happen.
I'm hoping they surprise us with Zen and their first 22nm GPU. I don't care about 28nm anymore, nor do I care for APUs or the Piledriver architecture anymore. I want good old-fashioned brute force on new nodes.
Agreed. I've really been waiting for them to move to a smaller node before ditching my 6870s. I really would like it if they could fix multi-monitor idle power consumption, though. >50 watts is kind of stupid when a 970/980 can do it with less than 5 watts.
 
Joined
Jan 2, 2015
Messages
1,099 (0.32/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
I think they are moving to 20nm for GPUs, and Zen is 14nm or 16nm. If they can get within like 5 percent of the per-core IPC of Haswell with decent overclocking, they will make bank.
2 APUs :eek:
 