
Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Indeed, but AMD is the worst company in the world to be taking a gamble like that. I think the console market had more to do with AMD's decision than discrete GPUs.
Possible, but the console APUs are Sony/MS turf. AMD is the designer; any deviation in development is ultimately Sony/MS's decision.
Fiji is just their viability test platform.
A test platform that represents AMD's only new GPU in the last year (and for the next six months at least). Without Fiji, AMD's lineup is straight-up rebrands with some mildly warmed-over SKUs added into the mix, whose top model is the 390X (and presumably, without Fiji, there would be a 395X2). Maybe not a huge gulf in outright performance, but from a marketing angle AMD would get skinned alive. Their market share without Fiji was in a nose dive.
Maybe AMD expects to ship second-generation APUs for Xbox One and PlayStation 4 with a die shrink and HBM, and expects to be able to pocket the savings instead of Sony and Microsoft.
Devinder Kumar intimated that the APU die shrink would mean AMD's net profit would rise, so it is a fair assumption that any saving in manufacturing cost aids AMD, but even with the APU die and packaging shrink, Kumar expected gross margins to break $20/unit from the $17-18 they presently reside at. Console APUs are still a volume commodity product, and I doubt that Sony/MS would tolerate any delivery slippage due to process/package deviation unless the processes involved were rock solid - especially if the monetary savings are going into AMD's pocket rather than the risk/reward being shared.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Their market share without Fiji was in a nose dive.
It still is, because Fiji is priced as a premium product and the bulk of discrete card sales are midrange and low end. Those cards are all still rebrands.

Devinder Kumar intimated that the APU die shrink would mean AMD's net profit would rise, so it is a fair assumption that any saving in manufacturing cost aids AMD, but even with the APU die and packaging shrink, Kumar expected gross margins to break $20/unit from the $17-18 they presently reside at. Console APUs are still a volume commodity product, and I doubt that Sony/MS would tolerate any delivery slippage due to process/package deviation unless the processes involved were rock solid - especially if the monetary savings are going into AMD's pocket rather than the risk/reward being shared.
Sony/Microsoft would save in other areas like the power transformer, cooling, and space. They can make a physically smaller console, which translates to a materials saving. Everyone wins--AMD the most, because instead of just getting paid for the APU, they'd also get paid for the memory (most of which would be passed on to the memory manufacturer, but it is still something AMD can charge more for).
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
It still is, because Fiji is priced as a premium product and the bulk of discrete card sales are midrange and low end. Those cards are all still rebrands.
Well, for definitive proof you'd need to see the Q3 market share numbers, since Fiji barely arrived before the close of Q2.
Sales of the top end aren't generally the only benefit. They indirectly influence sales of lower parts through the halo effect. Nvidia probably sells a bunch of GT (and lower-end GTX) 700/900 series cards thanks to the same halo effect from the Titan and 980 Ti - a little reflected glory, if you like.
AMD obviously didn't foresee GM200 scaling as well as it did (clocks largely unaffected by the increased die size) when it laid down Fiji's design, and had Fiji been unreservedly the "world's fastest GPU" as they'd intended, it would have boosted sales of the lower tiers. AMD's mistake was not taking into account that the opposition also has a capable R&D division, but when AMD signed up for HBM in late 2013, it had to make a decision based on estimates and the information available.
Sony/Microsoft would save in other areas like the power transformer, cooling, and space. They can make a physically smaller console, which translates to a materials saving. Everyone wins--AMD the most, because instead of just getting paid for the APU, they'd also get paid for the memory (most of which would be passed on to the memory manufacturer, but it is still something AMD can charge more for).
Hynix's own rationale seemed to be to keep pace with Samsung (who had actually already begun rolling out 3D NAND tech by this time). AMD's involvement surely stemmed from HSA and hUMA in general - consoles certainly leverage the same tech, but I think they were only part of the whole HSA implementation strategy.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Sorry to be completely OT here, but human, that quote from that douche canoe Charlie is PRICELESS.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Sorry to be completely OT here, but human, that quote from that douche canoe Charlie is PRICELESS.
I'd say that you simply can't buy insight like that, but you can. For a measly $1,000-a-year subscription, the thoughts and ramblings of Chairman Charlie can be yours! Charlie predicts...he dices...he slices, juliennes, and mashes, all for the introductory low, low price!
 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB and Toshiba N300 NAS 10TB HDD
Display(s) 2X LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
So, basically what I just said.

Don't kid yourself, the only reason they aren't here is because they're too busy eating their crayons.
bon appétit

I think rvalencia is attempting to bridge that divide.
1. With Fables, you asserted an unsupported claim.

2. Your personality-based attacks show you are a hypocrite, i.e. not much different from WCCFTech's comment section.


Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
 
Joined
Mar 24, 2011
Messages
2,356 (0.50/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
I have no idea. All I know is RMA'ing 280(X) graphics cards is trendy right now. Specifically, Gigabyte 280X and XFX 280.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
1. With Fables, you asserted an unsupported claim.
The only assertion I made was that different games use different resources to different extents. My claim is no more unsupported than yours that they use identical resources to the same extent. I have historical precedent on my side (different game engines, different coders etc), you have hysterical supposition on yours.
Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
What has one to do with the other? Oh, that's right....nothing!
I'd suggest you calm down; you're starting to sound just like the loons at WTFtech...assuming your violent defense of them means you aren't already a fully paid-up Disqus member.
The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
Quoted for truth.
 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB and Toshiba N300 NAS 10TB HDD
Display(s) 2X LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
The only assertion I made was that different games use different resources to different extents. My claim is no more unsupported than yours that they use identical resources to the same extent. I have historical precedent on my side (different game engines, different coders etc), you have hysterical supposition on yours.
Have you played the new Fables DX12?


What has one to do with the other? Oh, that's right....nothing!
I'd suggest you calm down; you're starting to sound just like the loons at WTFtech...assuming your violent defense of them means you aren't already a fully paid-up Disqus member.
You started it. You calm down.


Quoted for truth.
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1200#post_24356995


The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1200#post_24356995


I have no idea. All I know is RMA'ing 280(X) graphics cards is trendy right now. Specifically, Gigabyte 280X and XFX 280.
That's a double-standard viewpoint. http://forums.evga.com/GTX-970-Black-Screen-Crash-during-game-SOLVED-RMA-m2248453.aspx
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I'm sure the async compute features of GCN are intrinsically linked to Mantle. Because AMD supported Mantle and NVIDIA couldn't be bothered to even look into it, AMD has a huge advantage when it comes to DirectX 12 and Vulkan. It makes sense. The question is how long it will take NVIDIA to catch up. Pascal? Longer?
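For anyone wondering what "async compute" actually looks like at the API level, here's a minimal, purely illustrative D3D12 sketch (not taken from any engine; the function name is made up) - the whole mechanism is just a second command queue of COMPUTE type submitted alongside the graphics queue:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Illustrative sketch only. DX12 exposes "async compute" by letting the
// application create a second command queue of type COMPUTE; on hardware with
// independent compute schedulers (e.g. GCN's ACEs) it can run alongside the
// graphics (DIRECT) queue instead of being serialised behind it.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy
    device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC comp = {};
    comp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&comp, IID_PPV_ARGS(&computeQueue));
}
```

Whether the GPU genuinely overlaps work on the two queues, or quietly serialises it, is exactly what the Maxwell vs GCN argument in this thread is about.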


I made no mention of NVIDIA in the context of 280(X).
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Have you played the new Fables DX12?
No, but I can read, and more to the point, I obviously understand the content of the links you post better than you do.
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1200#post_24356995
.....[and again...same link twice in the same post yet you still failed to make the connection]
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/...ingularity-dx12-benchmarks/1200#post_24356995

Allow me to point out the obvious (for most people) from the post you linked to twice...consecutively
Our use of Async Compute, however, pales with comparisons to some of the things which the console guys are starting to do. Most of those haven't made their way to the PC yet, but I've heard of developers getting 30% GPU performance by using Async Compute. Too early to tell, of course, but it could end being pretty disruptive in a year or so as these GCN built and optimized engines start coming to the PC. I don't think Unreal titles will show this very much though, so likely we'll have to wait to see. Has anyone profiled Ark yet?
You keep linking to the words of the Oxide developer, so you obviously place some store in what he's saying - why keep linking otherwise? Yet the developer doesn't see Unreal using the same levels of async compute - and it's almost a certainty that Nitrous and UE4 aren't identical.

Here's the kicker in case you don't understand why I'm singling out the Unreal Engine 4 part of his post....Fable Legends USES Unreal Engine 4
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
It should be noted that Oxide isn't going to know much about Unreal Engine development. Considering Unreal Engine 4 is used on PlayStation 4 and Xbox One, and both of those support async compute, I think it is quite silly to believe Epic wouldn't employ async compute where possible. The only way it wouldn't is if their engine can't be made to use it without pushing out major architectural changes. In which case, Unreal Engine 5 will be coming sooner rather than later.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
It should be noted that Oxide isn't going to know much about Unreal Engine development. Considering Unreal Engine 4 is used on PlayStation 4 and Xbox One, and both of those support async compute, I think it is quite silly to believe Epic wouldn't employ async compute where possible. The only way it wouldn't is if their engine can't be made to use it without pushing out major architectural changes. In which case, Unreal Engine 5 will be coming sooner rather than later.
It's up to individual developers which version of UE4 (or any other UE engine, including the DX12-patched UE3 that Gears of War: Ultimate Edition uses) they use. As I noted earlier in post #76, UE4.9 supports DX12 and a number of DX12 features - including async compute and ROVs. The point I was attempting to make - and seemingly failed at - is that depending upon which version is used, and how much effort the developer puts into coding, the feature set used is going to differ from engine to engine, from engine version to engine version, and from game to game. Larger studios with larger development teams are likely to exercise a greater degree of control over the final product. Smaller game studios might either use fewer features to save expenditure, or rely upon third-party dev teams to incorporate elements of their gaming SDK into the final product. rvalencia is putting forward the notion that all DX12 releases are going to be performance clones of AotS - which I find laughable in the extreme given how the gaming industry actually works.
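To make that concrete, here's a rough, hypothetical sketch (not Epic's or Oxide's actual code; function name invented for illustration) of the sort of capability check an engine performs before enabling optional DX12 features such as ROVs - one reason the feature set actually used varies from title to title:

```cpp
#include <d3d12.h>

// Illustrative sketch only: query a few optional DX12 capabilities before
// deciding which render paths to enable. Checks like this are part of why the
// feature set actually used differs from engine to engine and game to game.
void QueryOptionalFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        const bool rovs = options.ROVsSupported;   // Rasterizer Ordered Views
        const bool conservativeRaster =
            options.ConservativeRasterizationTier != D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
        const D3D12_RESOURCE_BINDING_TIER bindingTier = options.ResourceBindingTier;
        // ...gate shader/render paths on what the adapter reports...
        (void)rovs; (void)conservativeRaster; (void)bindingTier;
    }
}
```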

Now, given that Epic haven't exactly hidden their association with Nvidia's game development program:
"Epic developed Unreal Engine 4 on NVIDIA hardware, and it looks and runs best on GeForce." - Tim Sweeney, founder, CEO and technical director of Epic Games.

What are the chances that Unreal Engine 4 (and patched UE3 builds) operates exactly the same way as Oxide's Nitrous Engine, as rvalencia asserts? A game engine that was overhauled (if not developed) as a demonstrator for Mantle.
It should be noted that Oxide isn't going to know much about Unreal Engine development.
Well, that would make them pretty lazy considering UE4 builds are easily sourced, and Epic runs a pretty extensive forum. I would have thought that a game developer might have more than a passing interest in one of the most widely licensed game engines considering evaluation costs nothing.
I can appreciate that you'd concentrate on your own engine, but I'd find it difficult to imagine that they wouldn't keep an eye on what the competition are doing, especially when the cost is minimal and the information is freely available.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
I'm always honest. Nvidia is the Apple of GPUs, they are evil, they are greedy, there is almost nothing you could like about them, but they make very good stuff, which works well and also performs well, so they win. If their stuff sucked, nobody would have kept buying their products for decades; the GPU market is not like the music industry. Only a very small, tech-savvy percentage of the population buys dedicated GPUs, so no Justin Biebers can keep themselves afloat for long without actually delivering good stuff.

I really hate to feed the nonsensical, but I wonder how you define better...

It can't be in performance /watt...
It can't be in frame time in CFx v SLI...
It can't be in highest FPS/performance...

Bang for your buck? CHECK.
Utilizing technology to get (TOO FAR) ahead of the curve? CHECK.

I'm spent.

Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.

That's called brainwashing. I have never seen any technological competitive advantage in Apple's products compared to the competition. Actually, the opposite - they break like shit.

Anyways, you guys are so mean. I can't comprehend how it's even possible that such people exist.

Bang for your buck? CHECK.
Utilizing technology to get (TOO FAR) ahead of the curve? CHECK.

Yes, and image quality? CHECK. ;)
 
Joined
Mar 24, 2011
Messages
2,356 (0.50/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
A game engine that was overhauled (if not developed) as a demonstrator for Mantle.

It was developed with Mantle in mind, and it was pretty apparent when AMD dragged a rep from Oxide around like a puppy dog to every tech convention to talk about how superior it was to DirectX and OpenGL at the time.

I have never seen any technological competitive advantage in Apple's products compared to the competition.

Ease of Use (Ensured Compatibility, User-Friendly Software, Simple Controls\Navigation), Aesthetics (form over function), Reliability, Build Quality (Keyboards, Trackpads, Durable Materials), Ease of Development\Standardized Hardware--and those are just broad terms. If you want to get down to it Apple's iPhones and iOS by extension continue to be superior to their competitors, and offer features that are more well-rounded than their competitors. Companies like Samsung and HTC have been trying to chip away at Apple by offering handsets that are cheaper or have a single feature better than Apple's equivalent in the iPhone, but they almost never have offered a more well-rounded product. Apple's Cinema Displays are some of the best IPS's you can buy, and they have given up on lower resolutions even in their low-end laptops. They do a lot of things better than their competitors, that's why people keep buying them.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
I actually just read the extremetech review (from 17th Aug) - the one that places 980ti versus Fury X. What's the fuss about?

http://www.extremetech.com/gaming/2...-singularity-amd-and-nvidia-go-head-to-head/2

Effectively, Fury X (a £550 card) pretty much runs the same as (or 5-10% better than, workload dependent - ironically, with a heavy workload the 980 Ti does better) a 980 Ti (a £510 card). *stock design prices

This is using an engine Nvidia say has an MSAA bug yet the 4xMSAA bench at 1080p:



and 4k:



So is this whole shit fest about top end Fiji and top end Maxwell being.... EQUAL, OMG, stop the freaking bus.... (I should've read these things earlier).

This is actually hilarious. All the AMD muppets saying all the silly things about Nvidia and all the Nvidia muppets saying the same about AMD when the reality of DX12 is.......

They're the same.

wow.

Why aren't we all hugging and saying how great this is? Why are we fighting over parity?

Oh, one caveat from extremetech themselves:

but Ashes of the Singularity and possibly Fable Legends are the only near-term DX12 launches, and neither is in finished form just yet. DX11 and even DX9 are going to remain important for years to come, and AMD needs to balance its admittedly limited pool of resources between encouraging DX12 adoption and ensuring that gamers who don’t have Windows 10 don’t end up left in the cold.

That bit in bold is very important.... DX12 levels the field, even 55-45 in AMD's favour but DX11 is AMD's Achilles heel, worse in DX9.

lol.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
Ease of Use (Ensured Compatibility, User-Friendly Software, Simple Controls\Navigation), Aesthetics (form over function), Reliability, Build Quality (Keyboards, Trackpads, Durable Materials), Ease of Development\Standardized Hardware--and those are just broad terms. If you want to get down to it Apple's iPhones and iOS by extension continue to be superior to their competitors, and offer features that are more well-rounded than their competitors. Companies like Samsung and HTC have been trying to chip away at Apple by offering handsets that are cheaper or have a single feature better than Apple's equivalent in the iPhone, but they almost never have offered a more well-rounded product. Apple's Cinema Displays are some of the best IPS's you can buy, and they have given up on lower resolutions even in their low-end laptops. They do a lot of things better than their competitors, that's why people keep buying them.

About resolutions, I agree. But their disadvantage is the extremely high price tag, so they don't qualify in terms of cost of investment.

All the other so-called "broad terms" you list are subjective. About materials specifically, I told you that you can NOT rely on an iPhone, because hitting the ground once or twice will be enough to break the screen.

* There is an anecdote in my country that people buy super-expensive iPhones for 500-600-700 euros and then don't have 2-3 euros to sit in the cafe. :laugh:
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
18,914 (2.86/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
VR HMD Acer Mixed Reality Headset
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
I think DX12 should have a faster adoption rate than earlier versions. For one thing, it's pushed by consoles, isn't it? And secondly, if they can get higher performance out of it - I mean significant performance - wouldn't that make it more interesting to use as well? And Win10 is free for many people, so currently it's not associated with money, as it was with, say, DX10. People have probably made the same argument before.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
About resolutions, I agree. But their disadvantage is the extremely high price tag, so they don't qualify in terms of cost of investment.
That's so dumb. It's like saying that anything doesn't qualify unless it's cheap. You know the old adage, "You get what you pay for." That's true of Apple to a point. Their chassis are solid, they've integrated everything to a tiny motherboard so most of Apple's laptops are battery, not circuitry. Having locked down hardware enables devs to have an expectation with respect to what kind of hardware is under the hood. Simple fact is that there are a lot of reasons why Apple is successful right now. Ignoring that is just being blind.
All the other so-called "broad terms" you list are subjective. About materials specifically, I told you that you can NOT rely on an iPhone, because hitting the ground once or twice will be enough to break the screen.
Drop any phone flat on the screen hitting something and I bet you the screen will crack. With that said, my iPhone 4S has been dropped a lot without a case and it is still perfectly fine...

Lastly, this is a thread about AMD. Why the hell are you talking about Apple? Stick to the topic and stop being a smart ass. AMD offers price and there have been arguments in the past about IQ settings. Simple fact is that nVidia cards can look just as good, they're just tuned for performance out of the box. Nothing more, nothing less.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
That's so dumb. It's like saying that anything doesn't qualify unless it's cheap. You know the old adage, "You get what you pay for." That's true of Apple to a point. Their chassis are solid, they've integrated everything to a tiny motherboard so most of Apple's laptops are battery, not circuitry. Having locked down hardware enables devs to have an expectation with respect to what kind of hardware is under the hood. Simple fact is that there are a lot of reasons why Apple is successful right now. Ignoring that is just being blind.

Drop any phone flat on the screen hitting something and I bet you the screen will crack. With that said, my iPhone 4S has been dropped a lot without a case and it is still perfectly fine...

Lastly, this is a thread about AMD. Why the hell are you talking about Apple? Stick to the topic and stop being a smart ass. AMD offers price and there have been arguments in the past about IQ settings. Simple fact is that nVidia cards can look just as good, they're just tuned for performance out of the box. Nothing more, nothing less.

This is a thread about Nvidia and AMD, and somebody brought up the comparison of Nvidia to Apple.

There is no need for it to hit something on the ground - a flat surface like asphalt is enough. And that's not true - there are videos you can watch showing that many other brands offer phones which don't break when they hit the ground.

Oh, and Apple is successful because it's an American company, and people in the USA support it on a nationalistic basis.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I actually just read the extremetech review (from 17th Aug) ...
You may want to give this review a look-see. Very comprehensive.
Oh, one caveat from extremetech themselves:
That bit in bold is very important.... DX12 levels the field, even 55-45 in AMD's favour but DX11 is AMD's Achilles heel, worse in DX9.
The review I just linked to has a similar outlook, and one that I alluded to regarding finances/interest in game devs coding for DX12
Ultimately, no matter what AMD, Microsoft, or Nvidia might say, there's another important fact to consider. DX11 (and DX10/DX9) are not going away; the big developers have the resources to do low-level programming with DX12 to improve performance. Independent developers and smaller outfits are not going to be as enamored with putting in more work on the engine if it just takes time away from making a great game. And at the end of the day, that's what really matters. Games like StarCraft II, Fallout 3, and the Mass Effect series have all received rave reviews, with nary a DX11 piece of code in sight. And until DX11 is well and truly put to rest (maybe around the time Dream Machine 2020 rolls out?), things like drivers and CPU performance are still going to be important.
This is actually hilarious. All the AMD muppets saying all the silly things about Nvidia and all the Nvidia muppets saying the same about AMD when the reality of DX12 is.......They're the same.
Pretty much. Like anything else graphics game engine related, it all comes down to the application and the settings used. For some people, if one metric doesn't work, try another...and another...and another...and when you find that oh so important (maybe barely) discernible difference, unleash the bile that most sane people might only exhibit on finding out their neighbour is wanted by the International Criminal Court for crimes against humanity.
I think DX12 should have a faster adoption rate than earlier versions. For one thing, it's pushed by consoles, isn't it? And secondly, if they can get higher performance out of it - I mean significant performance - wouldn't that make it more interesting to use as well? And Win10 is free for many people, so currently it's not associated with money, as it was with, say, DX10. People have probably made the same argument before.
DX12 might be available to the OS, but game developers still have to code (and optimize that code) for their titles. That isn't necessarily a given, as game devs have pointed out themselves. DX12, as has been quoted many times, puts more control into the hands of the developer - and the developer still needs to get to grips with that control. I can see many smaller studios not wanting to expend the effort, and many more may prefer to use a simplified engine of known quantity (DX9 or 11) for time-to-market reasons. Consoles taking up DX12 is all well and good, but porting a console game to PC isn't a trivial matter.
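As an illustration of what "more control in the hands of the developer" means in practice, here's a minimal, hypothetical sketch (function name invented): under DX12 even keeping an async compute pass ordered against the graphics queue is the application's job, expressed with explicit fences rather than left to the driver.

```cpp
#include <d3d12.h>

// Illustrative sketch only: under DX12 the application must order work across
// queues itself. Here the graphics queue is told to wait until the compute
// queue has signalled a fence - scheduling the driver used to handle for you.
void SyncComputeToGraphics(ID3D12CommandQueue* computeQueue,
                           ID3D12CommandQueue* graphicsQueue,
                           ID3D12Fence* fence,
                           UINT64& fenceValue)
{
    ++fenceValue;
    computeQueue->Signal(fence, fenceValue);  // GPU-side: mark compute work complete
    graphicsQueue->Wait(fence, fenceValue);   // GPU-side: hold graphics here until it is
}
```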
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
This is a thread about Nvidia and AMD, and somebody brought up the comparison of Nvidia to Apple.
Actually, you removed something that you said that started that argument. See this post. Don't try to change history then lie about it.
Oh, and Apple is successful because it's an American company, and people in the USA support it on a nationalistic basis.
I don't support Apple; I just said that they have a quality product that you pay for. Also, making claims about the American people in general is a really bad idea. I have an iPhone because work pays for the monthly bill, and I have a MacBook Pro because work gave me one.

This is the nice way of me telling you to shut up and stop posting bullshit, but it appears I needed to spell that out for you.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
Actually, you removed something that you said that started that argument. See this post. Don't try to change history then lie about it.

What are you talking about? What did I remove and what am I trying to change? :(

This is the line which introduced apple:

I'm always honest. Nvidia is the Apple of GPUs, they are evil, they are greedy, there is almost nothing you could like about them
 