
NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1

Joined
Oct 9, 2009
Messages
706 (0.19/day)
Location
Finland
System Name :P~
Processor Intel Core i7-5930K (ES)
Motherboard Asus Rampage V Extreme/3.1
Cooling Phanteks PH-TC14PE
Memory 32GB Corsair Vengeance LPX 2400 MHz
Video Card(s) Asus GTX 1080 Strix
Storage 400GB Intel 750 PCI-E SSD, 512GB Crucial MX100 SSD, 3TB WD RED HDD
Display(s) QNIX QX2710LED OC @ 96 Hz 27"
Case Corsair Obsidian 750D
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Corsair HX1000i + Cablemod sleeved cables kit
Mouse Logitech G500s
Keyboard Logitech Ultra X Flat Premium
Software Windows 10 64-bit
I suspect these feature levels and x.1 supports mean as much as they have in the past: absolutely nothing. They've never had any benefit whatsoever, but AMD's and NVIDIA's PR sure fire off a ton of slides about them every time.
 
Joined
Feb 20, 2007
Messages
366 (0.08/day)
Location
Where the beer is good
System Name Karl Arsch v. u. z. Abgewischt
Processor i5 3770K @5GHz delided
Motherboard ASRock Z77 Professional
Cooling Arctic Liquid Freezer 240
Memory 4x 4GB 1866 MHz DDR3
Video Card(s) GTX 970
Storage Samsung 830 - 512GB; 2x 2TB WD Blue
Display(s) Samsung T240 1920x1200
Case Bitfenix Shinobie XL
Audio Device(s) onboard
Power Supply Cougar G600
Mouse Logitech G500
Keyboard CMStorm Ultimate QuickFire (CherryMX Brown)
Software Win7 Pro 64bit
The source for that computerbase.de "article" is an AMD guy (Robert Hallock) with whom computerbase.de had a chat.
 
Joined
Dec 22, 2011
Messages
180 (0.06/day)
GCN 1.0 was confirmed DX12-compliant.
And the point I was bringing up is that two of the features specifically mentioned in this article that GCN supposedly doesn't support, it does support at a hardware level, unlike anything (except the latest Maxwell) NV has: specifically tiled resources and conservative rasterization. Regardless of tier, these are DX12_1 features, and they are supported on-metal by GCN, which directly contradicts this "news" article.
AMD has said it themselves too: current GCN implementations only support FL12_0 (or possibly lower in the case of 1.0). They lack some of the hardware features required for DX12's ROVs or conservative rasterization, or both; I can't remember off the top of my head. Also, volume tiled resources are different from tiled resources. GCN can do the latter for sure; there's still no clear answer on the former.
The only relevant remaining question is whether GCN 1.0 is 11_1 or 12_0.

(Also, a side note: not all chips considered GCN 1.0 are necessarily 1.0 anymore. There are indications that Cape Verde and Pitcairn (Curacao) were upgraded to 1.1 after the HD7 series: AMD's OpenCL 2.0 requires GCN 1.1, yet their list of OpenCL 2.0-supporting products includes several Cape Verde and Pitcairn based products, but not their HD7 versions.)
(GCN 1.1 in this case referring to the 3D/compute-related capabilities of the chips, not including the display controller or audio controller features, which obviously aren't there.)

edit:
Regarding the source, they seem to be somewhat confused, too. Bonaire is the same generation as Hawaii, but they seem to think it's older.
 
Joined
Aug 16, 2004
Messages
3,095 (0.55/day)
Location
Visalia, CA
Processor Intel Core i7 8700K @ 5GHz
Motherboard Asus Prime Z370-A
Cooling Swiftech H220-X
Memory 2x16GBs G.Skill Trident DDR4 3000MHz
Video Card(s) Asus ROG Strix 2080 Ti Black Ops Ed
Storage OS: 256GBs Samsung 850 Pro SSD/Games: 3TBs WD Black
Display(s) Sony 75" 4K HDR TV - Samsung Odyssey VR
Case Corsair Graphite Black 760T
Audio Device(s) Sony STR-DN1080 - 7.2 Klipsch Dolby ATMOS Speaker Setup
Power Supply EVGA Supernova G2 1300W
Keyboard Logitech K830 wireless keyboard/trackpad
Software Windows 10 Pro 64bit
I don't read German so I can't read the source article, but it's not Nvidia saying it, is it? Is the German article alluding to a fact not of Nvidia's fabrication, or is it pertaining to Nvidia's heavy PR on the feature set Maxwell focuses on?
Can someone clarify without any brand bias? Obviously I can see the charts posted, but has AMD stated in any press deck that they don't support tier 1?
In other words, to simplify: who the f@ck said GCN doesn't support it?
Lol, yes you're right. So this news item, is it more of an editorial then? Nowhere does the quoted article say Nvidia will use this to their advantage :shadedshu:
 
Joined
Mar 4, 2006
Messages
439 (0.09/day)
Just like we have the (PR) tags in front of articles that are press releases from companies, can we get a (FUD) tag for articles that have no basis in reality please?
 
Joined
Jun 13, 2012
Messages
1,110 (0.41/day)
System Name desktop
Processor i7-4770k
Motherboard Asus z87-plus
Cooling Corsair h80
Memory 32gb G.Skill Ares @ 2400mhz
Video Card(s) EVGA GeForce GTX 1080 SC (ACX 3.0)
Storage 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB271HU 27inch IPS G-Sync 165hz
Audio Device(s) Sound Blaster x-FI Platium, Turtle beach Elite pro 2 + superamp.
Power Supply OCZ Z Series 850W
Mouse Razer Deathadder Elite
Keyboard Logitch G710+
The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity to the competition (AMD) not supporting certain API features, which are open to them. Granted, AMD GPUs, and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing them.
HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation.

nVidia must be terrified of Fury if they are running this scared and attempting to deceive consumers about DX12. 6/16 is looking even more interesting.
pfft, most reports show Fury isn't as good as people expect; a few say it will be SLOWER than a 980 Ti, which given history could be possible, but more likely it's only as fast and costs more.


Kinda funny the first page of comments is all AMD fans attacking. Pretty sad.
 
Joined
Dec 22, 2011
Messages
2,986 (1.03/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
Still, it's good to know Maxwell v2 is DX12_1 compliant at least. :p
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
My god, yet another topic that seems to be less understood the more it's talked about.

DirectX 12 has both hardware level support and feature level support - they are NOT the same entities.
Resource binding tiers are a subset of DirectX 11, not 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:


Within those tiers, the feature level support associated with - but not the same as - the hardware support (since some of the features can be emulated in software) is broken down thus:


The hardware level and feature level, and whether any natively supported feature level will be emulated, will be down to the game developers.
As I pointed out in another posting a few days ago, feature level support will likely depend on whose IHV game dev program backing (if any) the game studio leverages.
This is a good article; they indicate Feature Levels have different Tiers within them, for example:
So in short, it's perfectly reasonable to assume AMD supports DX12_0 up to Tier 3; Tier 3 doesn't automatically mean they support DX12_1
Exactly.
If the publicity is guilty of anything, it is making the explanation too simplistic. AMD is also to blame here, since they widely circulated some not very concise slides and made little if any distinction between native support and emulation, which seems to cause some confusion regarding GCN 1.0 support of DX11.2 Tier 2 (which of course is carried over to DX12).

Oh this is btarunr who posted this. Not buying into his Nvidia bullshit anymore.
It's actually btarunr quoting a German site of some repute, which interviewed AMD's head graphics marketing exec, who is telling anyone who will listen that Nvidia could include conservative rasterization features in their game dev program... which could well eventuate, although since Maxwell v2 makes up such a small part of the graphics market they would almost certainly be additional features rather than obligated ones. So, Mr/Mrs/Ms Octopuss, what this is all about is AMD alerting the world to the possible (and in some ways probable) expansion of Nvidia's GameWorks program. Maybe you should have taken the time to read the original source material.
 
Last edited:
Joined
Apr 30, 2008
Messages
4,405 (1.04/day)
Location
Multidimensional
System Name Jill-Sandwich
Processor AMD Ryzen 7 3700X 8 Core 16 thread @ Stock / PBO On
Motherboard Gigabyte X570 I Aorus Pro Wifi
Cooling Bequiet Shadow Rock LP 120mm / 2x 120mm Slim Silverstone fans
Memory 32GB Corsair Vengeance LP DDR4 3200Mhz
Video Card(s) Sapphire RX 5700 XT 8GB Nitro+ @ 2Ghz
Storage 512GB Adata XPG SX8200 Pro M.2 NVMe / 2TB Samsung 2.5in HDD
Display(s) Hisense 1080p Smart LED HDTV 40inch
Case Silverstone Raven RVZ03 RGB Case
Audio Device(s) Realtek HD Audio / HDMI Audio Via GPU
Power Supply Corsair SFX 600W Gold Rated PSU
Mouse CoolerMaster Masterkeys Lite L RGB Mem-Chanical Combo
Keyboard CoolerMaster Masterkeys Lite L RGB Mem-Chanical Combo
Software Windows 10 Home 64bit
Benchmark Scores Don't do em anymore. :(
HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation.


pfft, most reports show Fury isn't as good as people expect; a few say it will be SLOWER than a 980 Ti, which given history could be possible, but more likely it's only as fast and costs more.


Kinda funny the first page of comments is all AMD fans attacking. Pretty sad.
Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
 
Joined
Apr 29, 2014
Messages
3,837 (1.87/day)
Location
Texas
System Name SnowFire / The Reinforcer / Portable?
Processor i7 8700K 5.0ghz (24/7) / 2x Intel Xeon E52650v2 / Intel i7 7700HQ
Motherboard Asus Z370 Prime-A / Dell Dual Socket (R720) / HP Omen Stock
Cooling RX 360mm + 140mm Custom Loop in Push Pull Config. / Dell Stock / HP Stock
Memory Corsair Vengeance RGB 16gb DDR4 3000 CL 16 / 1600mhz DDR3 128gb 16 x 8gb / 12gb DDR4 3 x 4gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector) / RX 580
Storage Samsung 512gb SSD RAID 0 and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5 / NVME 240gb SSD and 1tb HDD
Display(s) Acer XG270HU 1440p 144hz Freesync, Acer B286HK 4K UHD Monitor / 1080p 60hz Freesync display
Case Corsair Obsidian 800D / Dell Poweredge R720 Rack Mount Case / HP Omen 17
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 10 Pro / Windows Server 2008 R2 / Windows 10 Home
Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
Shots fired! If anyone here is biased, though, he's one of the major ones.

As for the article, it's an article based on what another site wrote. I don't really count any of this as proof bt is biased. He was happy to post multiple articles about the GTX 970 memory and the blocked overclocking, so I really would not call him biased.

Whatever the case, the truth will be revealed in time. You can lie all you want (not claiming anything specific is a lie), but in the end the truth will be shown one way or another.
 
Joined
Jun 13, 2012
Messages
1,110 (0.41/day)
System Name desktop
Processor i7-4770k
Motherboard Asus z87-plus
Cooling Corsair h80
Memory 32gb G.Skill Ares @ 2400mhz
Video Card(s) EVGA GeForce GTX 1080 SC (ACX 3.0)
Storage 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB271HU 27inch IPS G-Sync 165hz
Audio Device(s) Sound Blaster x-FI Platium, Turtle beach Elite pro 2 + superamp.
Power Supply OCZ Z Series 850W
Mouse Razer Deathadder Elite
Keyboard Logitch G710+
Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand.

As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4000 GCN cores that... as proven over the last many years, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.
 
Last edited:
Joined
Aug 20, 2007
Messages
12,096 (2.69/day)
System Name Pioneer
Processor Intel i9 9900k
Motherboard ASRock Z390 Taichi
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory G.SKILL TridentZ Series 32GB (4 x 8GB) DDR4-3200 @ 13-13-13-33-2T
Video Card(s) EVGA GTX 1080 FTW2
Storage Mushkin Pilot-E 2TB NVMe SSD w/ EKWB M.2 Heatsink
Display(s) LG 32GK850G-B 1440p 32" AMVA Panel G-Sync 144hz Display
Case Thermaltake Core X31
Audio Device(s) Onboard TOSLINK to Schiit Modi MB to Schiit Asgard 2 Amp to AKG K7XX Ruby Red Massdrop Headphones
Power Supply EVGA SuperNova T2 850W 80Plus Titanium
Mouse ROCCAT Kone EMP
Keyboard WASD CODE 104-Key w/ Cherry MX Green Keyswitches, Doubleshot Vortex PBT White Transluscent Keycaps
Software Windows 10 x64 Enterprise... yes, it's legit.
My god, yet another topic that seems to be less understood the more it's talked about.

DirectX 12 has both hardware level support and feature level support - they are NOT the same entities.
Resource binding tiers are a subset of DirectX 11, not 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:


Within those tiers, the feature level support associated with - but not the same as - the hardware support (since some of the features can be emulated in software) is broken down thus:


The hardware level and feature level, and whether any natively supported feature level will be emulated, will be down to the game developers.
As I pointed out in another posting a few days ago, feature level support will likely depend on whose IHV game dev program backing (if any) the game studio leverages.

Exactly.
If the publicity is guilty of anything, it is making the explanation too simplistic. AMD is also to blame here, since they widely circulated some not very concise slides and made little if any distinction between native support and emulation, which seems to cause some confusion regarding GCN 1.0 support of DX11.2 Tier 2 (which of course is carried over to DX12).


It's actually btarunr quoting a German site of some repute, which interviewed AMD's head graphics marketing exec, who is telling anyone who will listen that Nvidia could include conservative rasterization features in their game dev program... which could well eventuate, although since Maxwell v2 makes up such a small part of the graphics market they would almost certainly be additional features rather than obligated ones. So, Mr/Mrs/Ms Octopuss, what this is all about is AMD alerting the world to the possible (and in some ways probable) expansion of Nvidia's GameWorks program. Maybe you should have taken the time to read the original source material.

Thanks for taking the time to explain that... I'm sure I'm not the only one here willing to admit it makes my head spin a bit.

Bottom line: The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.
 
Joined
Apr 30, 2008
Messages
4,405 (1.04/day)
Location
Multidimensional
System Name Jill-Sandwich
Processor AMD Ryzen 7 3700X 8 Core 16 thread @ Stock / PBO On
Motherboard Gigabyte X570 I Aorus Pro Wifi
Cooling Bequiet Shadow Rock LP 120mm / 2x 120mm Slim Silverstone fans
Memory 32GB Corsair Vengeance LP DDR4 3200Mhz
Video Card(s) Sapphire RX 5700 XT 8GB Nitro+ @ 2Ghz
Storage 512GB Adata XPG SX8200 Pro M.2 NVMe / 2TB Samsung 2.5in HDD
Display(s) Hisense 1080p Smart LED HDTV 40inch
Case Silverstone Raven RVZ03 RGB Case
Audio Device(s) Realtek HD Audio / HDMI Audio Via GPU
Power Supply Corsair SFX 600W Gold Rated PSU
Mouse CoolerMaster Masterkeys Lite L RGB Mem-Chanical Combo
Keyboard CoolerMaster Masterkeys Lite L RGB Mem-Chanical Combo
Software Windows 10 Home 64bit
Benchmark Scores Don't do em anymore. :(
Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand.

As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4000 GCN cores that... as proven over the last many years, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.
Yeah maybe I guess, will just have to wait & see :toast:
 
Joined
Mar 24, 2012
Messages
342 (0.12/day)
This is so confusing. The other day I read on this forum that GCN supports DX12_3 all the way to ver 1.1, and now this seems to contradict that. All these DX12 feature levels are making my head spin. I mean, can someone please clarify what's going on with that? :(
Except there is no such thing as 12_3; there are only 12_0 and 12_1. People are confusing tier level support with feature level support.
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
It also contradicts what Nvidia has said about pre-980/970 and CR. Which falls in line with the other table.
Really?
And what did Nvidia say about Fermi and Maxwell v1 and conservative rasterization?

Just throwing shit at the wall and hoping nobody notices, or are you privy to facts you aren't willing to share? Or are you confusing full hardware support with software emulation? Because that is available to most architectures- it just isn't very resource friendly.


except there is no such thing as 12_3. there are only 12_0 and 12_1. people are confusing tier level support with feature level support.
+1....tier
Bottom line: The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.
Yes, I doubt that it makes much of a difference in the long run (although The Witcher 3 utilizes some of the same features, which is why Maxwell v2 cards and AMD's GCN do better than Kepler, since the latter doesn't have DX 11.2 support). But I'm sure it will churn up endless debate over what probably amounts to a 3-4% performance variance, while the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles - receive a bare fraction of the same forum bandwidth. Yay for common sense.
 
Last edited:
Joined
Apr 7, 2008
Messages
632 (0.15/day)
Location
Australia
System Name _Speedforce_ (Successor to Strike-X, 4LI3NBR33D-H, Core-iH7 & Nemesis-H)
Processor Intel Core i9 7980XE (Lapped) @ 5.2Ghz With XSPC Raystorm (Lapped)
Motherboard Asus Rampage VI Extreme (XSPC Watercooled) - Custom Heatsinks (Lapped)
Cooling XSPC Custom Water Cooling + Custom Air Cooling (From Delta 220's TFB1212GHE to Spal 30101504&5)
Memory 8x 8Gb G.Skill Trident Z RGB 4266MHz @ 4667Mhz (2x F4-4266C17Q-32GTZR)
Video Card(s) 3x Asus GTX1080 Ti (Lapped) With Customised EK Waterblock (Lapped) + Custom heatsinks (Lapped)
Storage 1x Samsung 970 EVO 2TB - 2280 (Hyper M.2 x16 Card), 7x Samsung 860 Pro 4Tb
Display(s) 6x Asus ROG Swift PG348Q
Case Aerocool Strike X (Modified)
Audio Device(s) Creative Sound BlasterX AE-5 & Aurvana XFi Headphones
Power Supply 2x Corsair AX1500i With Custom Sheilding, Custom Switching Unit. Braided Cables.
Mouse Razer Copperhead + R.A.T 9
Keyboard Ideazon Zboard + Optimus Maximus. Logitech G13.
Software w10 Pro x64.
Benchmark Scores pppft, gotta see it to believe it. . .
Well yeah, but AMD has plenty of both, especially with the new HBM cards, so I see no issues with this.
This should help Nvidia with their 4GB - sorry, I mean 3.5GB - GPU lineup :)
 
Joined
Aug 20, 2007
Messages
12,096 (2.69/day)
System Name Pioneer
Processor Intel i9 9900k
Motherboard ASRock Z390 Taichi
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory G.SKILL TridentZ Series 32GB (4 x 8GB) DDR4-3200 @ 13-13-13-33-2T
Video Card(s) EVGA GTX 1080 FTW2
Storage Mushkin Pilot-E 2TB NVMe SSD w/ EKWB M.2 Heatsink
Display(s) LG 32GK850G-B 1440p 32" AMVA Panel G-Sync 144hz Display
Case Thermaltake Core X31
Audio Device(s) Onboard TOSLINK to Schiit Modi MB to Schiit Asgard 2 Amp to AKG K7XX Ruby Red Massdrop Headphones
Power Supply EVGA SuperNova T2 850W 80Plus Titanium
Mouse ROCCAT Kone EMP
Keyboard WASD CODE 104-Key w/ Cherry MX Green Keyswitches, Doubleshot Vortex PBT White Transluscent Keycaps
Software Windows 10 x64 Enterprise... yes, it's legit.
while the content of the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles receive a bare fraction of the same forum bandwidth. Yay for common sense.
Glad to see I'm not the only one who finds present games rather uninspired.
 
Joined
Apr 27, 2014
Messages
20 (0.01/day)
System Name Enthoo Phantom
Processor 4820K
Motherboard Gigabyte X79-UP4
Cooling Custom-Loop
Memory Patriot DDR3 Intel limited 1866
Video Card(s) gigabyte geforce gtx 970 g1 gaming in SLI
Storage 1TB SSD + 250GB SSD and 2TB HDD
Display(s) MSI Optix MAG-C 2560x1440 144hz 1ms
Case Enthoo Luxe
Audio Device(s) Sound Blaster Z
Power Supply 850 W Corsair Bronze : (
Mouse Logitech G502
Keyboard Corsair K70
Software M$ WIndows 10 Pro
Benchmark Scores Crap
I'm just mad that my 970s won't run the full 12_1 crap. I mean wtf, I bought 'em thinking I'd be set for a while. Now I find out there is a higher version not supported by my card? WTF M$
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
37,790 (8.50/day)
Location
Hyderabad, India
Processor AMD Ryzen 7 2700X
Motherboard ASUS ROG Strix B450-E Gaming
Cooling AMD Wraith Prism
Memory 2x 16GB Corsair Vengeance LPX DDR4-3000
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) Samsung U28D590 28-inch 4K UHD
Case Corsair Carbide 100R
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Cooler Master MWE Gold 650W
Mouse Razer Abyssus
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro
For those of you who didn't know, Google Translate exists.

In summary: no 12_1 for GCN. Not only did a person of authority from AMD confirm it, but AMD also sour-graped about it, much like NVIDIA did about 11_1 for Kepler.





"For 'Fiji' AMD is not expressed" means that AMD guy didn't want to talk about Fiji. Any DACH guys can confirm that translation.
 
Joined
Apr 2, 2011
Messages
2,457 (0.77/day)
This is a fun discussion. It reads first as red/green team member "proving" their team is better without ever proving anything. It morphs into accusations that the article's author is biased. Finally, we wind up in the bizarre land of what exactly DX12 support is.

Addressing these issues in order:

When GCN and Kepler came out, DX11 was just starting to be the standard for gaming. The fact that there is any proposed support for either of these architectures to offer DX12 features is a small miracle. It'd be like expecting your 2000 Toyota Camry to have an iPod dock. Obviously DX12 is meant to reach back and give some cards new features, but expecting any of those cards to have 100% compatibility is a pipe dream. Neither AMD nor Nvidia is magic; deal with it.


Btarunr and I often have differences of opinion. Look back to the editorial on everyone "needing to accept Windows 8, because it's not going away" and you can see that clearly. Despite differences of opinion, you need to offer respect where it is due. This article, and Btarunr in general, are shooting straight. There aren't any lies introduced by the author, there's no reason to believe fiscal gain is the motive, and reading the original piece I think this is a fair representation. I disagree that reporting on reporters reporting is news (seriously, it's hard enough to report fairly when it's a one-on-one interview), but that's no reason to accuse a person of being a shill. Whenever you've got incontrovertible proof of wrongdoing, present it. Right now these accusations are hollow, and dismissed as easily as they are made.


Sniff....sniff.....sniff.... Does anyone else smell a turd? I think I remember that smell... Vista, is that you?

Jokes aside, isn't anyone else getting flashbacks of Vista when they hear about DX12? Some hardware supports it, other hardware is capable of supporting part of it, and people are selling hardware as compatible despite knowing that you'd need a half dozen upgrades to get a system running reasonably. DX12 is a complicated animal, oversimplified in the media and by sales, and the ensuing fallout from the false portrayal will hurt more people than it could ever help. The end goal is just to move more hardware, as people get frustrated by partial support and the promised improvements proving hollow.



Let's start some more fires. Fiji is supposed to be faster than, slower than, and as fast as the GTX 980 Ti. Zen is supposed to both compete with Intel and put AMD back into the dedicated CPU market. Intel is going to both improve FPGA production and cost by incorporating Altera into its production chain, and, because the market will become a monopoly, destroy affordable product lines. You see, FUD is fun. My last four sentences are complete speculation, yet they've started flame wars on forums. Seriously, show me the numbers before I give a crap. Until there are solid FACTS you might as well be playing with yourself for all the energy you're wasting on going nowhere.
 
Joined
Feb 21, 2014
Messages
1,131 (0.53/day)
Location
Alabama, USA
System Name Desktop || XPS 15 9560
Processor i5 4670k || i5 7300HQ
Motherboard MSI Z87-G41 || OEM
Cooling NZXT Respire T40 || OEM
Memory 16GB 1866Mhz DDR3 || 8GB DDR4
Video Card(s) EVGA GTX 1070 FTW || GTX 1050
Storage Ultra 2 480GB + WD Black 2TB || 1TB Crucial SSD
Display(s) ASUS VS228 1080p || Dell InfinityEdge 4k
Case NZXT Source 210 White || OEM
Power Supply Corsair CXm 750w || OEM
Mouse Corsair SABRE RGB || Logitech 720 Triathlon
Keyboard Steelseries APEX RGB || OEM
Honestly, as long as my 280X can handle the most basic DX12 features, or at least spoof them without too much of an issue, I couldn't give less of a crap. My GPU came out in 2012; we should just be expecting full DX12 support in this generation or the next. The fact that old cards might still get the CPU benefits (and other benefits) of DX12 is good enough for me.

Honestly, I'll probably need to upgrade my GPU before DX12 is widespread enough to be required.
 
Joined
Jan 13, 2015
Messages
51 (0.03/day)
The League of Extraordinary (Wealthy) Gentlemen

This summarizes my feelings when reading these comments.

To be honest, I'll say this from the start: I like AMD better than Intel or NVIDIA - but that has never stopped me from buying a clearly better product when it makes sense. Secondly, my machine - on which I do game (though mostly games that I really like and spend many hours, often hundreds, with) - is getting old, almost 3 years now. Even when bought, it was of average 'strength' - though I do tend to build a lot of systems for friends who need it and have less experience than I do (I've owned various PCs since 1987, after all :) )

But an average comment poster seems to have something like this:
- at least an i7 of the newer generation, and is looking for an upgrade
- Titan X or, at least, a 980 (preferably in SLI)
- a 4K monitor (at least one), or some multi-monitor configuration

The same poster is:
- highly environment-aware, even though the PSU is 800+W and every 10-30W of extra usage counts as unacceptable
- liking and needing Iris Pro
- highly dependent on the yet-unreleased DirectX 12, even to the point where some small inability of the graphics card to hardware-accelerate a feature means life or death, NOW

The configuration mentioned easily surpasses $2000, and may need replacement immediately, or very soon, because, well, some of the games released after the famous June 29th won't have every hardware optimization of every feature which would questionably be used in those games - software emulation is out of the question, naturally.

Well, MY old system is very capable of supporting the games I play, and that's not Solitaire - a number of AAA titles are in there.

- 4K: though I've seen enough gaming on those, I've decided that I don't need one yet
- Multi-monitor gaming: it looks... bad, in my book (though I do have a second monitor, I use it for different purposes entirely)
- Iris Pro is inherently... unneeded; a person spending that much on a CPU who is a passionate gamer buys a REAL graphics card, even if it's just an average one, and gets 5-10 times better results out of it, minimum (the whole concept of having Iris Pro eludes me; it's suitable for a budget CPU and budget gaming, yet it bears a premium price)

The main features of DX12, as my understanding goes, are to better utilize CPU cores and the GPU, existing ones as well as yet-unreleased ones. Other than this, there are a bunch of new features - but then again, there are PhysX (and Flex and Environmental and so on), Fur, God Rays, HairWorks - also TrueAudio, TressFX, Mantle... Neither green- nor red-favoring games profited hugely from having, or missing, those - yes, they are noticeable, especially in demos, and probably more so than some cryptic "Feature Level 12_x".

That being said, I wouldn't worry too much about it for the time being - and neither should others. Buying a new GPU because it supports something as vague as "Feature Level 12_x" as the primary reason is a pure waste of money. You can always do the same when and IF this starts to affect your experience. When it happens, chances are that prices will be lower and products more mature and better overall, to the point of there being a whole new generation.

I plan to upgrade my computer in 2016, if something important doesn't change - and this goes both ways.

(Oh, and I did have the privilege of seeing several expensive components and how they affect my configuration - for me, they weren't worth buying, but mileage may vary; those who need them probably have them already)

Sorry for the wall of text, but I felt that I had to write this, since the whole discussion about replacing a GPU or profiting from having "Feature Level 12_x" seems so pointless to me.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
15,925 (3.63/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
Ouch, can anyone say controversial?

There's no love lost between these two, that's for sure lol.
 