
Editorial AMD Bets on DirectX 12 for Not Just GPUs, but Also its CPUs

Joined
Oct 30, 2008
Messages
1,757 (0.31/day)
System Name Lailalo
Processor Ryzen 9 5900X Boosts to 4.95Ghz
Motherboard Asus TUF Gaming X570-Plus (WiFi)
Cooling Noctua
Memory 32GB DDR4 3200 Corsair Vengeance
Video Card(s) XFX 7900XT 20GB
Storage Samsung 970 Pro Plus 1TB, Crucial 1TB MX500 SSD, Seagate 3TB
Display(s) LG Ultrawide 29in @ 2560x1080
Case Coolermaster Storm Sniper
Power Supply XPG 1000W
Mouse G602
Keyboard G510s
Software Windows 10 Pro / Windows 10 Home
It helps the AMD CPU more because it allows use of all 4 cores of the CPU, but that's still a dual-core CPU versus a quad-core. As for the 9590 test, well, sure, that could change, but you'll see your power bill spike, since using all 8 cores of the CPU draws more power.

Bulldozer has been a quirky design. There have always been two camps: those who consider it a form of HTT, and those who consider them legitimate cores. Let's face it, the weakest FX chips were always the quads, but then those were just 2-module units. You didn't get to 4 modules until you got to the 8-core models. Performance was only decent with the quad-module units.

If you subscribe to the AMD-HTT camp, then this explains perfectly why a "quad" APU keeps pace with a dual-core i3.

I'm more interested in seeing the eventual FX vs i5/i7 benches. If they show results similar to what the dual-module APUs deliver, then it'll be interesting.
 
Joined
Apr 30, 2008
Messages
4,875 (0.84/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory G.Skill Trident Z5 Neo 32GB 6000MHz
Video Card(s) Galax RTX 4060 8GB (Temporary Until Next Gen)
Storage Kingston KC3000 M.2 1TB + 2TB HDD
Display(s) Asus TUF 24Inch 165Hz || AOC 24Inch 180Hz
Case Cooler Master NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
Okay, so who cares about DX12 when the majority of games out there are still going to be DX9-11, and AMD CPUs, as we know, are lackluster in that field.



So what, should we just stick with DX9-11 based games forever and never innovate to create/move to a new and better API? o_O
 
Joined
Oct 20, 2005
Messages
36 (0.01/day)
Location
Bucharest, Romania
Did AMD push Mantle to move DX12 in this direction, with Mantle always being a means to the true end?

Mantle was very much a tech-demo affair. It showed how a modern API can vastly improve performance in and of itself. Remember that we're talking about percentages higher, or significantly higher, than what mere driver updates can achieve. It's basically the same thing that happened with tessellation. ATi had tessellation hardware in their GPUs starting with the Radeon 8500 back in 2001; it was called TruForm back then. Only TruForm never got accepted into DirectX (or OpenGL, for that matter). Fast forward to DirectX 11, and here we go: tessellation is official in DirectX.

I don't know if you remember how things worked back in the day. ATi was never really able to get developers to use their proprietary tech, not in any meaningful way anyway. AMD have been far more successful in getting the tech they develop adopted by developers and especially Microsoft. Not really surprising considering that ATi/AMD GPUs are powering the last two console generations from Microsoft (Xbox 360 and Xbone). At any rate, AMD get their technologies (well, the big stuff anyway) implemented in DirectX, which means mass adoption by developers, and Microsoft get to spend less developing new tech to put into newer versions of DirectX. It's a win-win for everyone, really: AMD, Microsoft, the users... hell, even nVidia, since they get full access to everything.

I don't care about APUs like the A10-7850, but if DX12 can push all 8 cores of an FX processor to 90-100%, that will hugely increase performance across all benchmarks.

Believe it or not, DX12 will be most important for people using CPUs like that A10-7850 and the like. The gap between lower end CPUs and the high end will be a whole helluva lot smaller with DX12, which means that entry level and especially lower mainstream CPUs will be far more appealing to the gaming crowd. Think back to the times when you could overclock your cheap CPU to the point where you'd get similar performance to high end models. It wasn't that long ago.
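The reason the gap shrinks is mostly draw-call submission overhead: a low-level API both lowers the per-call cost and lets that work be spread across cores. A back-of-the-envelope model, with purely illustrative numbers (assumed, not measured):

```python
# Toy model (not real API code): CPU time spent submitting one frame's
# draw calls. The per-draw overheads and core counts are illustrative
# assumptions, not benchmark figures.

def frame_cpu_time_ms(draws, overhead_us_per_draw, cores_used):
    """Total submission cost in ms, spread evenly over the cores used."""
    total_us = draws * overhead_us_per_draw
    return total_us / cores_used / 1000.0

draws = 10_000
dx11_style = frame_cpu_time_ms(draws, overhead_us_per_draw=40, cores_used=1)
dx12_style = frame_cpu_time_ms(draws, overhead_us_per_draw=10, cores_used=4)

print(f"DX11-style: {dx11_style:.0f} ms, DX12-style: {dx12_style:.0f} ms")
# -> DX11-style: 400 ms, DX12-style: 25 ms
```

With both effects combined, a weaker CPU that was drowning in submission overhead suddenly has headroom, which is exactly why entry-level chips stand to gain the most.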

The title is wrong. Multi-core CPU performance was always the goal of a low-level API.
Mantle was meant to push Microsoft to move faster on DX12. Mantle and DX12 were, from the beginning, going to minimize the distance between Intel and AMD CPUs in games that were poorly written. GPU performance with Mantle was always a secondary bonus, as long as Nvidia was sticking with DX11. Now that Nvidia is benefiting from DX12, it's an open question whether AMD will gain more from GCN than Maxwell gains with DX12. Anandtech's DX12 benchmarks show that GCN 1.1 is not as good under DX12 as Maxwell in the 900 series.

Mantle is AMD's insurance policy: they don't have to wait for a new DirectX to have an API that allows developers to use whatever technology they want to push. Furthermore, they'll have an extra advantage over nVidia next time Nintendo and Sony will need a new GPU/APU for their consoles: not having to rely on their competitor's own API or OpenGL is a pretty big selling point. As for how good AMD's current architecture is in DX12 versus nVidia's Maxwell, that doesn't really matter. AMD is going to launch a new generation of GPUs and it's very likely that said GPUs will use an improved version of the current GCN architecture. That's what matters to them. Comparing current GCN used for Radeon HD 7000 series cards with nVidia's (mostly) brand new Maxwell isn't entirely relevant.

Okay, so who cares about DX12 when the majority of games out there are still going to be DX9-11, and AMD CPUs, as we know, are lackluster in that field.

What DX9-11 game are you playing, that requires more than an 8-core FX or a Phenom II X6 for smooth gameplay? I'm asking because I don't play every game that's out there and maybe I'm missing something. Me, I'm using a Phenom II X6 1055T and I really can't find a single game I can't max out. I had a Core i7 2600K before. I sold it and bought the Phenom II X6 with about 40% of the money I got for the Core i7 and I really can't see the difference in any of the games that I play.

Yeah, this looks weird for Microsoft... this could initiate a fat lawsuit. Anyway, do you know if Star Swarm is just multiple instancing and cloning of the same objects doing the same things, or is it pure real-time interaction and simulation? Because the former always looks quite bad when you look at it.

and

I feel like I've been fooled by Microsoft and Intel with DX11 and below, so WTH was really going on all this time?

So was there already something fishy between Microsoft and Intel before Mantle came out? And did Microsoft push out DX12 to avoid DX being shamed by Mantle?

Umm, what? No offense, but do you even know what you're talking about? What does Intel have to do with anything? And why sue Microsoft? You're really missing the big picture here. Low-level APIs are the "secret weapon" the consoles have. Basically, having low-level APIs and fixed hardware specs allows developers to create games with graphics that would normally require a lot more processing power to pull off on the PC.

Windows operating systems have always been bloated, one way or another. The same is true for DirectX. Lately, though, Microsoft has been trying to sort things out in order to create a more efficient software ecosystem. That's right, not just the operating system(s), but everything Microsoft, including DirectX. They need to be able to offer software that is viable for mobile devices (as in, phones and tablets), not just PCs, or they won't stand a chance against Android. Furthermore, they want to offer users devices that can communicate and work with each other across the board. That means they need a unified software ecosystem, which in turn means making an operating system that can work on all these devices, even if there will be device-specific versions that are optimized to some degree (ranging from UIs customised for phones/consoles/PCs to power consumption algorithms and so on).

In order to create such a Jack-of-all-trades operating system, they need it to have the best features in one package: low-level APIs from consoles, draconian power consumption profiles from phones, very light processing power requirements from both phones and consoles, excellent backwards compatibility through virtualization from PCs, the excellent diversity of software you can only get from a PC, and so on. Creating such an operating system takes a lot of money and effort, not to mention time. They've already screwed up on the UI front with Windows 8 (yeah, apparently it wasn't a no-brainer for them that desktops and phones can never use the same UI). Hopefully, they've learned their lesson and Windows 10 will turn out fine.

DX12 is a major step forward in the right direction, not some conspiracy between Microsoft and Intel to keep AMD CPUs down in the gutter. Nobody would have given all this conspiracy theory a second thought if AMD had had the money they needed to stay on track with CPU development. But reality is like a kick in the teeth, and AMD CPUs are sadly not nearly as powerful as they'd need to be in order to give Intel a run for their money. Intel stands to lose more ground because they offer the highest performing CPUs, but I think we can all agree on the fact that they didn't get here (having the best performance) by conspiring with Microsoft. Besides, gamers are by far not the only ones buying Intel CPUs. Actually, DX12 will be detrimental to AMD as well: fewer people will feel the need to upgrade their CPUs to prepare for upcoming games.
 

GLD

Joined
May 13, 2006
Messages
1,631 (0.25/day)
Location
City 17, California, U.S.A.
Processor AMD Ryzen 7 5700X, AMD Wraith Prism.
Motherboard ASUS TUF X570-Plus (Wi-Fi).
Cooling Antec 120mm RGB case fans.
Memory 4x8gb, G.SKILL F4-3600C16D-16GVKC.
Video Card(s) Sapphire Pulse RX 6700.
Storage PNY XLR8 CS3040 2TB 4.0x4 NVMe ssd with Vantec ICEBERQ heat sink.
Display(s) ASUS VP278QG 27", 1080p, 75hz, FreeSync.
Case Antec GX202.
Audio Device(s) Onboard sound, Logitech Z625 THX 2.1's, Logitech G430 headphones.
Power Supply Seasonic Prime GX-750.
Mouse Logitech G203 Prodigy.
Keyboard Logitech G213.
Software Windows 11 Pro, @ Day 1.
Windows 10 with DX12, I will need to get it. That, and an FX 83xx and an R9 3xx(X) card! Yay for a Newegg card with a zero balance. :eek:
 
Joined
Feb 14, 2012
Messages
2,304 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
I'm sure DX12 will improve on DX11, but I never trust those graphs that suggest it's going to be twice as fast. There's just no way to validate that the game devs put the same honest effort into the DX11 and DX12 paths.
 
Joined
Oct 3, 2012
Messages
229 (0.05/day)
Location
Brazil
System Name Tiffany
Processor Intel i7 4770K @ 4.5Ghz
Motherboard Gigabyte Z87X-UD4H
Cooling Corsair H100i Push-Pull
Memory Corsair Vengeance 1600MHz 16GB
Video Card(s) XFX R9 290 Crossfire @ 925Mhz
Storage WD 500GB + Seagate 1 TB
Display(s) BenQ XL2420TX 120hz
Case Cooler Master CM690II Black & White
Audio Device(s) Onboard
Power Supply Cooler Master Silent Pro 1000W
Software Windows 7
I would guess everyone that follows the CPU market is skeptical about these graphs.

But I'd be thrilled if they're confirmed once consumers get their hands on W10 and DX12. Especially if AMD processors are kept at their current price ranges. I can't even imagine the new landscape that would emerge in the gaming community. "Value-oriented gaming" would change completely.
 

costeakai

New Member
Joined
Mar 19, 2015
Messages
22 (0.01/day)
Did AMD push Mantle to move DX12 in this direction, with Mantle always being a means to the true end?
AMD have pushed Mantle in the right direction, and DX12 is living proof of it; they just showed the light, and MS rushed in ... The eagerly awaited revival of an old CPU, the FX, is unprecedented in gaming hardware archives ...
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
What does Intel have to do with anything?

Well, the theory (probably true) suggests that Intel, with its x86, dramatically holds back industry progress. Simply because x86 as a standard is shit.

Also, there's supposedly an agreement between Microsoft and Intel that negatively influences progress.

AMD probably couldn't do anything because they were happy to have that x86 license at all.

Still a mystery why x86 still exists and why AMD doesn't jump on anything else. Maybe because in the beginning they simply copied Intel's processors.
 
Joined
Sep 6, 2013
Messages
2,973 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Mantle is AMD's insurance policy: they don't have to wait for a new DirectX to have an API that allows developers to use whatever technology they want to push. Furthermore, they'll have an extra advantage over nVidia next time Nintendo and Sony will need a new GPU/APU for their consoles: not having to rely on their competitor's own API or OpenGL is a pretty big selling point.
Good point.
As for how good AMD's current architecture is in DX12 versus nVidia's Maxwell, that doesn't really matter. AMD is going to launch a new generation of GPUs and it's very likely that said GPUs will use an improved version of the current GCN architecture. That's what matters to them. Comparing current GCN used for Radeon HD 7000 series cards with nVidia's (mostly) brand new Maxwell isn't entirely relevant.
It does, especially if part of the 300 series is rebrands. And in Anandtech's tests they used Hawaii, so we are talking about GCN 1.1 and a core that is used in $300 cards. At the time Anandtech was testing DX12 performance, there wasn't any DX12 driver available for the 7000 series (GCN 1.0). So, if Fiji is the only new GPU, and if Tonga performs better under DX12 than Hawaii, you have only those two GPUs probably performing well, and all the other GPUs, GCN 1.1 and GCN 1.0 based, losing to the Maxwell competition. Of course, in 2015 that wouldn't really matter much, considering there will be no, or not many, DX12 titles, and in games the difference will probably be much less noticeable than in benchmarks like Star Swarm.
 

Ebo

Joined
May 9, 2013
Messages
778 (0.20/day)
Location
Nykoebing Mors, Denmark
System Name the little fart
Processor AMD Ryzen 2600X
Motherboard MSI x470 gaming plus
Cooling Noctua NH-C14S
Memory 16 GB G.Skill Ripjaw 2400Mhz DDR 4
Video Card(s) Sapphire RX Vega 56 Pulse
Storage 1 Crucial MX100 512GB SSD,1 Crucial MX500 2TB SSD, 1 1,5TB WD Black Caviar, 1 4TB WD RED HD
Display(s) IIyama XUB2792QSU IPS 2560x1440
Case White Lian-Li PC-011 Dynamic
Audio Device(s) Asus Xonar SE pci-e card
Power Supply Thermaltake DPS G 1050 watt Digital PSU
Mouse Steelseries Sensei
Keyboard Corsair K70
Software windows 10 64 pro bit
I think we can all agree that we hope DX12 will be a fantastic tool to move computer gaming forward.
For me, it sounds like the real benefit will be in low-end and mainstream gaming rigs.

Let's just hope the use of DX12 in games ramps up fast, so we can finally wave goodbye to DX9-11.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
It does, especially if part of the 300 series is rebrands. And in Anandtech's tests they used Hawaii, so we are talking about GCN 1.1 and a core that is used in $300 cards. At the time Anandtech was testing DX12 performance, there wasn't any DX12 driver available for the 7000 series (GCN 1.0). So, if Fiji is the only new GPU, and if Tonga performs better under DX12 than Hawaii, you have only those two GPUs probably performing well, and all the other GPUs, GCN 1.1 and GCN 1.0 based, losing to the Maxwell competition. Of course, in 2015 that wouldn't really matter much, considering there will be no, or not many, DX12 titles, and in games the difference will probably be much less noticeable than in benchmarks like Star Swarm.

I think the bigger issue would be if a good part of the 300 series is a "re-badge", which is the proper term for it. The issue is that those cards will more than likely lack full DX12 support. They will support the speed part but not the graphics part, which is a bit of a letdown for a new series of cards. It would mean only the 390 cards will be DX12 cards; with any lower card you'd have to look at nVidia's side, or hopefully wait for AMD lower-end cards that have it.

Let's just hope the use of DX12 in games ramps up fast, so we can finally wave goodbye to DX9-11.
The graphics part of DX12 will probably take a while, but I expect the speed side, which DX11 cards will support, to catch on quickly, at least from devs that care. Rockstar, for example, would be a key example of a dev that should add it to GTA5 ASAP.

Mantle is AMD's insurance policy: they don't have to wait for a new DirectX to have an API that allows developers to use whatever technology they want to push. Furthermore, they'll have an extra advantage over nVidia next time Nintendo and Sony will need a new GPU/APU for their consoles: not having to rely on their competitor's own API or OpenGL is a pretty big selling point.
Good point.

Yeah well, the issue for AMD isn't that they've got the advantage, but making money. They aren't doing so well, and when Sony/Nintendo/MS make their next consoles, will AMD still be in business?
 
Last edited:
Joined
Jan 12, 2012
Messages
125 (0.03/day)
Hold your horses, ladies. It'll take at least a year before we see some good games optimized properly on DX12 and about two years for new engines.
 
Joined
Sep 16, 2013
Messages
1,357 (0.35/day)
Location
Canada
System Name HTPC
Processor Intel Core 2 Duo E8400 - 3.00/6M/1333
Motherboard AsRock G31M-GS R2.0
Cooling CoolerMaster Vortex 752 - Black
Memory 4 Go (2x2) Kingston ValueRam DDR2-800
Video Card(s) Asus EN8600GT/HTDP/512M
Storage WD 3200AAKS
Display(s) 32" Irico E320GV-FHD
Case Aerocool Qx-2000
Audio Device(s) Onboard
Power Supply Enermax NoiseTaker 2 - 465w
Mouse Logitech Wave MK550 combo (M510)
Keyboard Logitech Wave MK550 combo (K350)
Software Win_7_Pro-French
Benchmark Scores Windows index : 6.5 / 6.5 / 5.6 / 6.3 / 5.9
I have an older dual-core 1.86GHz CPU setup, but I was thinking of going to an APU since I only want to play older games, surf the net, and watch Blu-rays. Is it worth it compared to a regular Intel setup? Should I go with a Phenom, or is an APU fine?
 
Joined
Jan 12, 2012
Messages
125 (0.03/day)
I have an older dual-core 1.86GHz CPU setup, but I was thinking of going to an APU since I only want to play older games, surf the net, and watch Blu-rays. Is it worth it compared to a regular Intel setup? Should I go with a Phenom, or is an APU fine?

My sister has an APU [A8-7100] powered laptop and it does Blu-ray, casual gaming and Office just fine.
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
What would happen then on my 24-core, 48-thread machine?

Would DX12 be able to handle this many cores/threads?

It would be nice if it did; I haven't spent 3500 euros on CPUs for nothing, like most of my friends, or followers of my build log, would like to think.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
Would DX12 be able to handle this many cores/threads?

According to the presentation slides, DX12 will load every single core of the CPUs, so the answer is yes.

Theoretically, there would be no problem handling as many threads as you like.
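To picture what "loading every core" means: DX12's model lets each thread record its own command list, which the main thread then submits in order. Here's a toy Python sketch of that pattern, not real D3D12 code; `record_commands` and the other names are made up for illustration:

```python
import threading

# Toy analogy of DX12 multithreaded command recording: each worker fills
# its own "command list" in parallel; the main thread submits them in a
# deterministic order afterwards.

def record_commands(draws):
    # Stand-in for recording draw commands on one thread.
    return [f"draw {d}" for d in draws]

def build_frame(all_draws, num_workers):
    chunks = [all_draws[i::num_workers] for i in range(num_workers)]
    results = [None] * num_workers

    def worker(i):
        results[i] = record_commands(chunks[i])

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Submission order is still controlled by the main thread.
    return [cmd for per_worker in results for cmd in per_worker]

frame = build_frame(list(range(1000)), num_workers=8)
print(len(frame))  # 1000 commands, recorded across 8 workers
```

Nothing in the pattern caps the worker count, which is why "as many threads as you like" holds in theory; in practice the GPU becomes the bottleneck long before the thread count does.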
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
According to the presentation slides, DX12 will load every single core of the CPUs, so the answer is yes.

Theoretically, there would be no problem handling as many threads as you like.


Would DX12 then usher in an era where a 4-way setup of 15-core/30-thread E7-4890 v2 CPUs (7000 EUR each; 60 cores/120 threads in total for 28000 EUR) would be a beast of a gaming CPU compared to a "mortal" 8-core/16-thread CPU?

Compared to this, my 2x12-core/48-thread CPUs seem outdated.

Perhaps we will one day see an ASUS ROG board made for these kinds of CPUs, who knows!
 
Joined
Oct 20, 2005
Messages
36 (0.01/day)
Location
Bucharest, Romania
I'm sure DX12 will improve on DX11, but I never trust those graphs that suggest it's going to be twice as fast. There's just no way to validate that the game devs put the same honest effort into the DX11 and DX12 paths.

Understandable. However, you can be quite sure that said devs will use all the DX12 features they can on the Xbone, so that means they'll also use them for the PC port. That means DX12 is likely to show up pretty soon in games, and on a large enough scale: it will probably be adopted faster than DX11.

Well, the theory (probably true) suggests that Intel, with its x86, dramatically holds back industry progress. Simply because x86 as a standard is shit.

Also, there's supposedly an agreement between Microsoft and Intel that negatively influences progress.

AMD probably couldn't do anything because they were happy to have that x86 license at all.

Still a mystery why x86 still exists and why AMD doesn't jump on anything else. Maybe because in the beginning they simply copied Intel's processors.

I agree that x86 is, as a standard, shit. But what agreement between Microsoft and Intel to slow down progress? I haven't seen anything that would suggest that. Sure, Microsoft isn't exactly fast when it comes to improvements, but that's how things work when there's no viable competition on the operating system front.

Sure, AMD started out by copying Intel's CPUs, but that was a very long time ago and it was part of the agreement between AMD, Intel and IBM. AMD hasn't copied an Intel design for well over a decade now. As for why AMD doesn't "jump on anything else", well, that's not exactly true. They've been tinkering with ARM designs for a while now, and the results are going to show up pretty soon. Developing an entirely new standard to compete with x86 would take incredible amounts of money and time, and AMD has neither of those in ample supply. Not to mention that PC software was built from the ground up for x86 machines and that you'd have better luck drawing up an entirely new GPU on a napkin during lunch break than convincing software developers to remake all their software to work with the new standard.

What would happen then on my 24-core, 48-thread machine?

Well, supposedly, you'd be able to use all those cores in games. Not that you'd find graphics cards that would be powerful enough to register significantly better performance (higher than 5-10%) than ye average i7 4790K or flagship FX CPU. You also wouldn't need to upgrade the CPU for a decade or so, but then again, your e-peen would shrink big time. Ouch.
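That "no more than 5-10% better" point is essentially Amdahl's law: once most of a frame's time is GPU-bound, extra CPU cores stop paying off. A quick sketch, using an illustrative (assumed, not measured) 30% parallelizable fraction:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# If only ~30% of frame time is parallelizable CPU work (an assumed
# figure for illustration), piling on cores quickly stops helping:
for cores in (4, 8, 24):
    print(f"{cores} cores -> {amdahl_speedup(0.30, cores):.2f}x")
```

Going from 8 to 24 cores buys only a few percent here, which matches the intuition that a 24-core rig won't pull meaningfully ahead of a fast 4- or 8-core chip in games.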

Good point.

It does, especially if part of the 300 series is rebrands. And in Anandtech's tests they used Hawaii, so we are talking about GCN 1.1 and a core that is used in $300 cards. At the time Anandtech was testing DX12 performance, there wasn't any DX12 driver available for the 7000 series (GCN 1.0). So, if Fiji is the only new GPU, and if Tonga performs better under DX12 than Hawaii, you have only those two GPUs probably performing well, and all the other GPUs, GCN 1.1 and GCN 1.0 based, losing to the Maxwell competition. Of course, in 2015 that wouldn't really matter much, considering there will be no, or not many, DX12 titles, and in games the difference will probably be much less noticeable than in benchmarks like Star Swarm.

It is indeed likely that the R9 390X/Fiji is the only new core, just like Hawaii was when it was launched. However, judging by the "50-60% better performance than the 290X" rumor, as well as the specs (4096 shaders, for starters), I don't think those are ye regular GCN shaders. If Fiji indeed ends up 50-60% faster than Hawaii while having only about 45% more shaders, it will be a rather amazing feat. Just look at the Titan X: it has at least 50% more of everything compared to a GTX 980, but it only manages 30% better real-life performance.

And that's actually a bit above average as far as scaling goes. Due to the complexity of GPUs, increasing shader count and/or frequency will never yield the same percentage in real-life performance gains. And yet, Fiji is coming to the table with a performance gain that EXCEEDS the percentage by which the number of shaders has increased. No matter how good HBM is, it cannot explain such performance gains by itself. If you agree that Hawaii had sufficient memory bandwidth, then there are only two things that can account for the huge gain: a 2GHz shader frequency (which is extremely unlikely) or modified/new shaders. My money's on the shaders.

I think the bigger issue would be if a good part of the 300 series is a "re-badge", which is the proper term for it. The issue is that those cards will more than likely lack full DX12 support. They will support the speed part but not the graphics part, which is a bit of a letdown for a new series of cards. It would mean only the 390 cards will be DX12 cards; with any lower card you'd have to look at nVidia's side, or hopefully wait for AMD lower-end cards that have it.

Yeah well, the issue for AMD isn't that they've got the advantage, but making money. They aren't doing so well, and when Sony/Nintendo/MS make their next consoles, will AMD still be in business?

No, the fact that most of the new 300 series are rebrands wouldn't be an issue. AMD focused all their efforts on Fiji and lo and behold, they got to use HBM one generation before nVidia. They now have the time and expertise required to design new mainstream/performance cards that use HBM and put them into production once HBM is cheap enough (real mass production). Besides, where's the rush? There are no DX12 titles out yet, Windows 10 isn't even out yet. By the time you'll have a DX12 game out there that you'd want to play, you'll be able to purchase 400 series cards.

As for AMD's financial troubles, people have been saying that they're gonna go bankrupt for a good long while now. And guess what, they're still alive and kicking. I'm thankful that AMD price their cards so that they're affordable for most people, not just the rich and silly that are more than willing to pay a thousand bucks or more for a card just because it has nVidia's logo and Titan writing on it. And it also makes sense for them: a lot more people are likely to buy their cards since a lot more people can afford to do so. I'm sorry, but you'll have to live with the fact that your shiny new graphics card is overpriced beyond reason. Don't try to convince the others that they should pay more for their cards just so you can feel good with yourself.

Hold your horses, ladies. It'll take at least a year before we see some good games optimized properly on DX12 and about two years for new engines.

Bingo. We have a winner here. I wholeheartedly agree.
 
Joined
Sep 16, 2013
Messages
1,357 (0.35/day)
Location
Canada
System Name HTPC
Processor Intel Core 2 Duo E8400 - 3.00/6M/1333
Motherboard AsRock G31M-GS R2.0
Cooling CoolerMaster Vortex 752 - Black
Memory 4 Go (2x2) Kingston ValueRam DDR2-800
Video Card(s) Asus EN8600GT/HTDP/512M
Storage WD 3200AAKS
Display(s) 32" Irico E320GV-FHD
Case Aerocool Qx-2000
Audio Device(s) Onboard
Power Supply Enermax NoiseTaker 2 - 465w
Mouse Logitech Wave MK550 combo (M510)
Keyboard Logitech Wave MK550 combo (K350)
Software Win_7_Pro-French
Benchmark Scores Windows index : 6.5 / 6.5 / 5.6 / 6.3 / 5.9
For people like me, there are too many cores in play there. People with everyday needs can still rock a dual-core and do the job plus casual gaming. What makes computer companies fill the bank to reinvest in R&D is freaks who need the latest technology like it was their last dose of a drug.
 
Joined
Jan 5, 2013
Messages
895 (0.22/day)
Clearly DirectX 12 will help properly feed multi-core CPUs, which means AMD's CPU performance will increase drastically, as half or more of the cores are currently often under-utilized due to bad code. That's good for gamers, if the games/software are properly written to use DirectX 12. That may be the only reason anyone would consider Win 10. The fact that Microsucks is planning to give Win 10 to existing Win 8/7 users shows how desperate they are to keep their installed base. For many people, Win 10 may still not be worth the hassles.
 
Joined
Oct 15, 2010
Messages
208 (0.04/day)
Well, supposedly, you'd be able to use all those cores in games. Not that you'd find graphics cards that would be powerful enough to register significantly better performance (higher than 5-10%) than ye average i7 4790K or flagship FX CPU. You also wouldn't need to upgrade the CPU for a decade or so, but then again, your e-peen would shrink big time. Ouch.

Well, that was what I was planning in the first place when I got these CPUs: to not need to change the CPU setup for a long time. Sure, the GPUs will need to be refreshed now and then, but the CPUs could stay in place for a long time, I reckoned.

It seems my logic was good, as my logic almost always is.
Now to choose the proper GPU setup.

Probably, if the apps are coded right, one would, or at least could, get a big performance boost even from a CrossFire 295X2 setup, with 4 GPUs and a unified 16GB of RAM.

Too bad there isn't a dual GM200 board out there; that would have been my choice as a GPU.
As it is right now, I will probably go with Titan X SLI, as I already own a G-Sync monitor; however, it's going to take, I think, two months before I actually acquire them. Until then, I hope to learn more about the 390X and see if it is worth it.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Clearly DirectX 12 will help properly feed multi-core CPUs, which means AMD's CPU performance will increase drastically, as half or more of the cores are currently often under-utilized due to bad code. That's good for gamers, if the games/software are properly written to use DirectX 12. That may be the only reason anyone would consider Win 10. The fact that Microsucks is planning to give Win 10 to existing Win 8/7 users shows how desperate they are to keep their installed base. For many people, Win 10 may still not be worth the hassles.

I'm not sure why you think MS is desperate to keep their installed base. Where else would people turn for an OS? Linux? Only a very small percentage of people run Linux compared to Windows. Most people wouldn't even know where to start with Linux. MS has had a monopoly on the PC OS for a long time.
 