
AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"

If these guys are right then AMD really needs something to boost sales. If they do prove to be superior in DX12 then maybe that will do it.

http://www.dsogaming.com/news/amdnv...ering-4-out-of-5-pc-gamers-own-an-nvidia-gpu/


4 out of 5 gamers buying Nvidia GPUs. It doesn't look good right now @Sony Xperia S
AMD's credibility is at an all-time low; what looks like superior performance in one game that is still in alpha testing isn't really going to change that.

They should; the R9 290X and 290 have such good performance per dollar. The 290X is only 290 USD.
Hard to sell people a two-year-old GPU as a new product. Most people who own a 200 series card have no reason to get a 300 series, since they're pretty much the same cards.
 
Hard to sell people a two-year-old GPU as a new product. Most people who own a 200 series card have no reason to get a 300 series, since they're pretty much the same cards.
Even though it is an older model, it is still at the top of the game like you said, and at that price it is a steal.
 
By the way, did Nvidia also show numbers for professional cards, or were the numbers there not as pretty? AMD was gaining in the pro market thanks to Macs.
Neither Nvidia nor AMD lists professional graphics shipments separately these days. Professional, discrete, discrete custom (FirePro D series), and discrete mobile (MXM) shipments are all rolled into both Mercury's and JPR's figures.

AMD's professional board figures aren't a straightforward extrapolation of market share the way the consumer market largely is; Nvidia boards sell at vastly higher ASPs. To use the "Mac" (Mac Pro) example you cited:
The top Mac Pro comes standard with dual FirePro D500s and can be upgraded to dual FirePro D700s for $600.
The FirePro D500 is a custom part analogous to the HD 7870 XT (so it sits between the Pitcairn-based FirePro W7000 and the Tahiti PRO-based W8000). The W7000 is priced at around $600 each, the W8000 at ~$1,000 each.
The FirePro D700 is a FirePro W9000 (also in a custom form factor, like the other FirePro D boards). The W9000 retails for around $3,000 each... and Apple offers an upgrade to two of them, from boards priced at less than $1K, for $600.

Given that Apple has to factor in its own profit and amortization of warranty replacements, how much do you think AMD's contract unit price is for these custom SKUs?
AMD basically purchased market share (and the marketing cachet of being allied with the Apple Mac Pro). If AMD relies on this business model, it will build market share all the way to the poorhouse.
 
Neither Nvidia nor AMD lists professional graphics shipments separately these days. Professional, discrete, discrete custom (FirePro D series), and discrete mobile (MXM) shipments are all rolled into both Mercury's and JPR's figures.

AMD's professional board figures aren't a straightforward extrapolation of market share the way the consumer market largely is; Nvidia boards sell at vastly higher ASPs. To use the "Mac" (Mac Pro) example you cited:
The top Mac Pro comes standard with dual FirePro D500s and can be upgraded to dual FirePro D700s for $600.
The FirePro D500 is a custom part analogous to the HD 7870 XT (so it sits between the Pitcairn-based FirePro W7000 and the Tahiti PRO-based W8000). The W7000 is priced at around $600 each, the W8000 at ~$1,000 each.
The FirePro D700 is a FirePro W9000 (also in a custom form factor, like the other FirePro D boards). The W9000 retails for around $3,000 each... and Apple offers an upgrade to two of them, from boards priced at less than $1K, for $600.

Given that Apple has to factor in its own profit and amortization of warranty replacements, how much do you think AMD's contract unit price is for these custom SKUs?
AMD basically purchased market share (and the marketing cachet of being allied with the Apple Mac Pro). If AMD relies on this business model, it will build market share all the way to the poorhouse.

Nvidia wins market share: absolutely logical, nothing to analyze.
AMD wins market share: "Let me explain to you why there is nothing to see here, or even worse, why it is bad for AMD."

 
How many times has AMD failed, refused to admit fault, and then turned around and blamed Nvidia for it?


Wouldn't be surprised if they didn't bother much with the game, since it is in alpha and any work done now might not even matter when it's finally released.


Considering the 390X is a two-year-old GPU, yes, it should be cheaper than a much newer one. Even with the price difference, DX12 games will only trickle out here and there over the next 6-12 months.

From http://www.dsogaming.com/news/the-w...t-amd-users-enjoy-physx-hair-and-fur-effects/

With NVIDIA's GameWorks, CDPR lost control over their Witcher 3 PC source code.

Our dear friends over at PCGamesHardware had an interesting interview with CD Projekt RED's lead engine developer, Balázs Török, in which Balázs claimed that AMD users will be able to enjoy the new fur and hair GPU-accelerated effects, provided Nvidia lets them.

When asked about the new hair and fur effects, Balázs said that – at this moment – these effects run on AMD Radeon GPUs.

"At the moment it works, but whether it can run in the end is Nvidia's decision. What matters is the direction in which they continue to develop the technology, and whether, and what, barriers they install. But I think it will also run on Radeon graphics cards."
 
AMD's credibility is at an all-time low; what looks like superior performance in one game that is still in alpha testing isn't really going to change that.
That one game will make some people more skeptical about their next upgrade. It's nothing serious yet, but it could become more serious in the future. Of course, Nvidia doesn't have to do much: just $persuade$ the game developers not to take advantage of DX12 just yet, and wait a little longer, until Pascal comes. They've done it before, anyway.

Hard to sell people a two-year-old GPU as a new product. Most people who own a 200 series card have no reason to get a 300 series, since they're pretty much the same cards.
There are plenty of rebrands in the market. It's just that AMD's financial position forced them to rebrand more expensive cards than Nvidia does (the GT 730 is one example). At least they do not sell cards with the same name but totally different performance and features. I mean, you talk about credibility. Well, if the press treated Nvidia the way it treats AMD, Nvidia's credibility would be in no better position. The typical example is the GTX 970, but let's also add the GT 730 I mentioned.
So we have three GT 730s:
One is 96 Fermi cores with 128-bit DDR3. <<< This one isn't even DX12-capable yet.
One is 384 Kepler cores with 64-bit GDDR5. <<< This one is the good one.
And the last is 384 Kepler cores with 64-bit DDR3. <<< This one you throw out the window. 12.8 GB/s? Even Minesweeper will have performance problems.
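For what it's worth, the 12.8 GB/s figure falls straight out of the memory configuration, and the same arithmetic shows how far apart the three variants really are. A quick sketch (only the 12.8 GB/s number comes from the post; the DDR3-1600 and GDDR5-5000 effective rates are my assumed typical clocks for these boards):

```python
def mem_bandwidth_gbs(bus_width_bits, effective_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_mts / 1000  # MT/s -> GB/s

# The three GT 730 variants (memory clocks assumed, see above):
print(mem_bandwidth_gbs(128, 1600))  # 96 Fermi cores, 128-bit DDR3-1600   -> 25.6
print(mem_bandwidth_gbs(64, 5000))   # 384 Kepler cores, 64-bit GDDR5-5000 -> 40.0
print(mem_bandwidth_gbs(64, 1600))   # 384 Kepler cores, 64-bit DDR3-1600  -> 12.8
```

Note that the 128-bit DDR3 card actually has twice the raw bandwidth of the DDR3 Kepler one; it's the old Fermi core (and its missing DX12 support) that sinks it.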
 

Aquinus

Resident Wat-man
Hard to sell people a two-year-old GPU as a new product. Most people who own a 200 series card have no reason to get a 300 series, since they're pretty much the same cards.
I don't know; I found the 390 to be a worthy successor to my 6870s in CFX. In all seriousness, despite being rebrands, they've been tuned. That doesn't mean much for power users, since overclocking is a thing we like to do, but it's not like the 300 series is bad. A lot of what people said was bad about AMD (like multi-monitor idle usage and frame latency) is a little overstated. So despite not being new technology, it's worth the price you pay.

As for DX12, I don't think we can judge everything by one game. It's too early to say much about DX12 other than that it can potentially offer some significant improvements. AMD's drivers are known to have more overhead than nVidia's, and DX12 might make that less of a problem than it is now.

Honestly, I don't think anyone should get angry or lose sleep over this.
 
Nvidia wins market share. Absolutely logical. Nothing to analyze.
Really? And why would that be? ...and why bring up Nvidia? I certainly didn't. You were the one telling the world + dog how great AMD's pro graphics were doing. Market share with a plummeting bottom line is hardly cause for cheerleading... or any real critique at all, really, given that this thread is about DX12 - which I'm pretty sure pro graphics and math co-processors aren't leveraging.
AMD wins market share. "Let me explain to you why there is nothing to see here, or even worse, why it is bad for AMD".
It was you who brought up the "Mac".
All I did was point out the pricing of the AMD parts used in the "Mac".
If you want to have a good cry about it, be my guest - but between whines, maybe you can explain how AMD's FirePro market share is growing (in a fashion), largely, as you've already said, because of Apple, yet AMD still bleeds red ink.

Nice AMD-supplied PPS. Why not use the latest figures, which show that AMD's pro graphics have slipped back to 20%?
Meanwhile, Nvidia remained the dominant force in professional GPUs, responsible for 79.4% of units, while AMD picked up the remaining 20.6%, including a fair number of units sold to Apple to outfit Mac Pros.
So a large chunk of AMD's pro graphics market is predicated on a single customer getting boards at knock-down pricing, around 1/10th of retail MSRP. As I said, AMD (or anyone, for that matter) can grow market share by offering deals like that - and let's face it, AMD has been in fire-sale mode for FirePro for some time. I'm also pretty sure that if AMD offered Radeons at $5 each, they'd quickly gain massive market share - but it doesn't mean **** all if it isn't sustainable. Not every entity can look forward to an EU bailout.
 
From http://www.dsogaming.com/news/the-w...t-amd-users-enjoy-physx-hair-and-fur-effects/

With NVIDIA's GameWorks, CDPR lost control over their Witcher 3 PC source code.

Our dear friends over at PCGamesHardware had an interesting interview with CD Projekt RED's lead engine developer, Balázs Török, in which Balázs claimed that AMD users will be able to enjoy the new fur and hair GPU-accelerated effects, provided Nvidia lets them.

When asked about the new hair and fur effects, Balázs said that – at this moment – these effects run on AMD Radeon GPUs.

"At the moment it works, but whether it can run in the end is Nvidia's decision. What matters is the direction in which they continue to develop the technology, and whether, and what, barriers they install. But I think it will also run on Radeon graphics cards."
Well, fur - which is HairWorks - relies on tessellation, which is part of the DX11 standard. You can't really blame Nvidia for AMD cards being very slow at it. Yes, Nvidia could have used it on purpose, knowing AMD cards are slow at it, just like AMD used OpenCL for TressFX knowing their cards were a lot faster at OpenCL than Nvidia's. Both sides have done it at one point. You should at least be happy that HairWorks uses a standard that everyone, including AMD, was crying out for.

I don't know; I found the 390 to be a worthy successor to my 6870s in CFX. In all seriousness, despite being rebrands, they've been tuned. That doesn't mean much for power users, since overclocking is a thing we like to do, but it's not like the 300 series is bad.

They ain't bad, no, but AMD has tried to claim the 390(X) wasn't a rebrand when it is. They even tried to give reasons it wasn't, but that's kind of hard to believe when a GPU-Z screenshot someone posted shows a GPU release date of 2013.
http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/

There are plenty of rebrands in the market. It's just that AMD's financial position forced them to rebrand more expensive cards than Nvidia does (the GT 730 is one example).

Really, how many people care about whether those cards are rebrands or not? No one cares if the R5 240 or whatever is a rebrand of the 230. They are low-end cards with very low power draw. No one who cares about performance buys them.
 
Erm... what happened to the 390X and the GTX 980 trading blows? Now the 290X trades blows with the 980 Ti?

This gets more hilarious by the day; I'm literally getting upset that AMD makes no money!
 
Erm... what happened to the 390X and the GTX 980 trading blows? Now the 290X trades blows with the 980 Ti?

This gets more hilarious by the day; I'm literally getting upset that AMD makes no money!

Here are the comparable numbers, if graphs and numbers are too difficult for you (different systems: PCPerspective & Arc).

1080p:
290X 48 | 390X 53 | 980 50 | 980 Ti 50

1080p Heavy:
290X 40 | 390X 46 | 980 44 | 980 Ti 43



 
Here are the comparable numbers, if graphs and numbers are too difficult for you (different systems: PCPerspective & Arc).

1080p:
290X 48 | 390X 53 | 980 50 | 980 Ti 50

1080p Heavy:
290X 40 | 390X 46 | 980 44 | 980 Ti 43



Erm... what happened to the 390X and the GTX 980 trading blows? Now the 290X trades blows with the 980 Ti?

This gets more hilarious by the day; I'm literally getting upset that AMD makes no money!
Wow, so AMD won something in a game that is in alpha and won't be out for about a year; oh yeah, it's also an AMD-sponsored game. Those numbers are to be taken with a grain of salt, as it's just one game in an alpha build. That game still has a long way to go code-wise before anyone can take those numbers too seriously. Best not to hype up something that likely won't stay where it is, like AMD did with the Fury X.

Point is, don't start the hype train rolling just yet. It might turn into a train wreck of disappointment, like it has many times before. I guess if people haven't learned by now, they never will.
 
Well, fur - which is HairWorks - relies on tessellation, which is part of the DX11 standard. You can't really blame Nvidia for AMD cards being very slow at it. Yes, Nvidia could have used it on purpose, knowing AMD cards are slow at it, just like AMD used OpenCL for TressFX knowing their cards were a lot faster at OpenCL than Nvidia's. Both sides have done it at one point. You should at least be happy that HairWorks uses a standard that everyone, including AMD, was crying out for.
Wrong - TressFX uses Microsoft's DirectCompute.

From http://www.techpowerup.com/180675/amd-tressfx-technology-detailed.html

"Technically, TressFX is a toolset co-developed by AMD and Crystal Dynamics, which taps into DirectCompute"

You should at least be happy that TressFX uses a standard that everyone, including NVIDIA, was crying out for.

I'm aware of the technical aspect. The problem was excessive tessellation that doesn't substantially improve the visuals. The workaround was to re-use the same AMD driver-side tessellation override feature that was introduced for NVIDIA-patched Crysis 2.


The difference between Ashes of the Singularity and Witcher 3 is the source code availability for AMD (Red Team), NVIDIA (Green Team), and Intel (Blue Team). All interested IHVs can contribute to the same source code without being blocked by an exclusivity contract.

The Witcher 3 XBO/PS4 builds use TressFX instead of NVIDIA's HairWorks.

An unreleased Witcher 3 PC build has TressFX enabled, but it is blocked by an NVIDIA exclusivity contract. A Witcher 3 PC build with TressFX would have benefited lesser NVIDIA GPUs.


PS: I own an MSI 980 Ti OC alongside my MSI 290X OC.

Wow, so AMD won something in a game that is in alpha and won't be out for about a year; oh yeah, it's also an AMD-sponsored game. Those numbers are to be taken with a grain of salt, as it's just one game in an alpha build. That game still has a long way to go code-wise before anyone can take those numbers too seriously. Best not to hype up something that likely won't stay where it is, like AMD did with the Fury X.

Point is, don't start the hype train rolling just yet. It might turn into a train wreck of disappointment, like it has many times before. I guess if people haven't learned by now, they never will.
The pattern is similar to 3DMark's API overhead test results.




It could also indicate that AMD's DX11 driver is sub-par relative to its TFLOPS potential.
 

OneMoar

There is Always Moar
In other news: CPU-bound game benefits from a reduction in CPU load... more at 11.
Please don't feed the hype train.
 

Tatty_Two

Gone Fishing
On that 290X comparison in DX12: my understanding is that the 290X is only "DX12 ready" and therefore will not support the full DX12 feature set. It certainly indicated that on AMD's site a couple of weeks ago when I looked (because I have a 290X and was interested to know its DX12 capability), whereas the 980 will have the works, so I can only guess that the 290X was only doing half the work... dunno, maybe I am wrong there.
 

Mussels

Freshwater Moderator
Staff member
On that 290X comparison in DX12: my understanding is that the 290X is only "DX12 ready" and therefore will not support the full DX12 feature set. It certainly indicated that on AMD's site a couple of weeks ago when I looked (because I have a 290X and was interested to know its DX12 capability), whereas the 980 will have the works, so I can only guess that the 290X was only doing half the work... dunno, maybe I am wrong there.

With these early DX12 titles it's not likely they'll even use those features, so the graphical quality/load would be the same.
 

FordGT90Concept

"I go fast!1!11!1!"
Yeah, it's not DirectX 12 itself that needs to be looked at; it's the Direct3D feature level the game implements. Do we even know which feature level they're using?

https://en.wikipedia.org/wiki/Graphics_Core_Next

GCN 1.0 is 11_1:
Oland
Cape Verde
Pitcairn
Tahiti

GCN 1.1 is 12_0:
Bonaire
Hawaii
Temash
Kabini
Liverpool
Durango
Kaveri
Godavari
Mullins
Beema

GCN 1.2 is 12_0:
Tonga (Volcanic Islands family)
Fiji (Pirate Islands family)
Carrizo

Only NVIDIA's GM2xx chips are 12_1 compliant (which the GTX 980 Ti is). Even Skylake's GPU is 12_0.

So if the game supports feature level 12_1 and is using it on the GTX 980 but using 12_0 on the 290X, it's not an apples-to-apples comparison. We'd have to know that both cards are running feature level 12_0.
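The list above can be folded into a small lookup for sanity-checking a benchmark pairing. This is just a sketch restating the post's table (chip list abridged, helper names mine), and it leans on the fact that the level strings happen to sort in the right order:

```python
# Maximum Direct3D 12 feature level per chip, per the list above (abridged).
FEATURE_LEVEL = {}
for chip in ("Oland", "Cape Verde", "Pitcairn", "Tahiti"):   # GCN 1.0
    FEATURE_LEVEL[chip] = "11_1"
for chip in ("Bonaire", "Hawaii", "Kaveri",                  # GCN 1.1
             "Tonga", "Fiji", "Carrizo"):                    # GCN 1.2
    FEATURE_LEVEL[chip] = "12_0"
for chip in ("GM200", "GM204", "GM206"):                     # Maxwell GM2xx
    FEATURE_LEVEL[chip] = "12_1"

def common_level(*chips):
    """Highest feature level all chips share ('11_1' < '12_0' < '12_1' as strings)."""
    return min(FEATURE_LEVEL[c] for c in chips)

# Hawaii (290X) vs GM204 (GTX 980): a like-for-like run has to target 12_0.
print(common_level("Hawaii", "GM204"))  # -> 12_0
```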
 
So if the game supports feature level 12_1 and is using it on the GTX 980 but using 12_0 on the 290X, it's not an apples-to-apples comparison. We'd have to know that both cards are running feature level 12_0.

DX12 is more application-dependent than DX11 was; DX12 moves some management that the driver was doing into the application.

AMD GCN cards can also handle more resources, so it's not always going to be an apples-to-apples comparison.
 
Yeah, it's not DirectX 12 itself that needs to be looked at; it's the Direct3D feature level the game implements. Do we even know which feature level they're using?

https://en.wikipedia.org/wiki/Graphics_Core_Next

GCN 1.0 is 11_1:
Oland
Cape Verde
Pitcairn
Tahiti

GCN 1.1 is 12_0:
Bonaire
Hawaii
Temash
Kabini
Liverpool
Durango
Kaveri
Godavari
Mullins
Beema

GCN 1.2 is 12_0:
Tonga (Volcanic Islands family)
Fiji (Pirate Islands family)
Carrizo

Only NVIDIA's GM2xx chips are 12_1 compliant (which the GTX 980 Ti is). Even Skylake's GPU is 12_0.

So if the game supports feature level 12_1 and is using it on the GTX 980 but using 12_0 on the 290X, it's not an apples-to-apples comparison. We'd have to know that both cards are running feature level 12_0.
Resource Binding: Maxwell Tier 2, GCN Tier 3.

Feature level: Maxwell 12_1, GCN 12_0.



CR (Conservative Rasterization) feature

Read https://community.amd.com/message/1308478#1308478

Question:

I need, for my application, every draw to produce at least one pixel of output (even if it is an empty triangle, i.e. 3 identical points). NVIDIA has an extension (GL_NV_conservative_raster) to enable such a mode (on Maxwell+ cards). Is there a similar extension on AMD cards?

Answer (from AMD):

Some of our hardware can support functionality similar to that in the NVIDIA extension you mention, but we are currently not shipping an extension of our own. We will likely hold off until we can come to a consensus with other hardware vendors on a common extension before exposing the feature, but it will come in time.


For ROV feature

AMD already supports Intel's "GL_INTEL_fragment_shader_ordering" extension in OpenGL.

From https://twitter.com/g_truc/status/581224843556843521

It seems that shader invocation ordering is proportionally a lot more expensive on GM204 than S.I. or HSW.
 
Last edited:

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
The way I see it is: why optimize for DX11 anymore?

Win 10 has been downloaded more than 50 million times already, including DX12, which makes DX11 a thing of the past. Move on and forget; just keep DX11 where it already is, that's good enough, and start developing DX12 right away.

You ARE aware, I assume, that 50 million is a drop in the bucket? And that's downloads, not installs. I personally know several people staying on 7 and 8.1. Even here, an enthusiast community, I've seen probably 10% of those that upgraded to W10 go back. So no, DX11 isn't going anywhere.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I downloaded twice (Home and Pro) and installed 7 times (Pro once, Home six times). XD

The change from DX11 to DX12 should be pretty swift for developers because it is nothing monumental. I believe DX12 automatically falls back to 11 and 10 depending on hardware capabilities.
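(For what it's worth, that fallback is typically explicit in application code: the usual pattern is to probe feature levels from highest to lowest and keep the first one the hardware accepts. A minimal model of that loop, with a plain capability set standing in for the real D3D12CreateDevice call:)

```python
# Probe Direct3D feature levels from highest to lowest and settle on
# the first one the hardware accepts -- the usual device-creation
# pattern, modeled here without the Windows-only D3D12 API.
FALLBACK_ORDER = ["12_1", "12_0", "11_1", "11_0"]

def pick_feature_level(supported: set) -> str:
    for level in FALLBACK_ORDER:
        if level in supported:  # stands in for a successful device create
            return level
    raise RuntimeError("no supported Direct3D feature level")
```

A 290X-class card (max 12_0) would land on `"12_0"` here, while a GTX 980 Ti would land on `"12_1"`.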
 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K IPS FreeSync/GSync DP, LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
I downloaded twice (Home and Pro) and installed 7 times (Pro once, Home six times). XD

The change from DX11 to DX12 should be pretty swift for developers because it is nothing monumental. I believe DX12 automatically falls back to 11 and 10 depending on hardware capabilities.
There are other APIs besides DX11, e.g. the PS4's lower-level APIs.

AMD Mantle and the PS4's lower-level APIs have laid the groundwork for DX12.
 
Joined
Mar 24, 2011
Messages
2,356 (0.49/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
And Microsoft just comes in and starts building off it, the same way Sony used BSD to make the OS for the PS4. Damn BSD licensing.

Microsoft has been working on DX12 since before DX11.1 actually went live. They started work on it with Intel, AMD, and NVIDIA before Mantle was even announced; AMD just released Mantle before DX12 as a bit of a PR stunt. As far as I know, OpenGL did have extensions that supported some of the features new to DX12. It's more accurate to say Khronos built on Mantle, but Microsoft built DX12 alongside Mantle.

Better examples of low-level APIs would have been Glide and Metal, but most people block those out of their memory because it was a frustrating time in the PC world.
 