Monday, August 31st 2015

Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

It turns out that NVIDIA's "Maxwell" architecture has an Achilles' heel after all, one that tilts the scales in favor of AMD's competing Graphics CoreNext (GCN) architecture as the better prepared for DirectX 12. "Maxwell" lacks support for async compute, one of the three highlight features of Direct3D 12, even as the GeForce driver "exposes" the feature to apps. This came to light when game developer Oxide Games alleged that it was pressured by NVIDIA's marketing department to remove certain features from its "Ashes of the Singularity" DirectX 12 benchmark.

Async Compute is a standardized API-level feature added to Direct3D by Microsoft, which allows an app to better exploit a GPU's number-crunching resources by breaking its rendering workload into graphics and compute tasks that can execute concurrently. Since the NVIDIA driver tells apps that "Maxwell" GPUs support it, Oxide Games simply built its benchmark with async compute support, but when it attempted to use the feature on Maxwell, the result was an "unmitigated disaster." During the course of its developer correspondence with NVIDIA to try and fix the issue, Oxide learned that "Maxwell" doesn't really support async compute at the bare-metal level, and that the NVIDIA driver bluffs its support to apps. NVIDIA instead started pressuring Oxide to remove the parts of its code that use async compute altogether, the developer alleges.
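To make the concept concrete, here is a deliberately simplified toy model of why a second, asynchronous compute queue can shave frame time (plain Python, not real GPU code; every number below is made up purely for illustration): graphics passes leave some shader capacity idle, and independent compute work can fill those gaps instead of running serially afterwards.

```python
# Toy model of async compute (illustration only; all figures hypothetical).
# A frame consists of graphics passes that leave some shader capacity idle,
# plus compute work (e.g. lighting, post-processing) that is independent
# of the graphics passes.

graphics_passes_ms = [3.0, 2.0, 4.0]  # hypothetical GPU time per pass
idle_fraction = 0.3                   # assumed idle shader capacity during graphics
compute_work_ms = 2.5                 # hypothetical independent compute work

# Without async compute: the compute work runs after the graphics passes.
serial_ms = sum(graphics_passes_ms) + compute_work_ms

# With async compute: a second queue lets the compute work soak up the idle
# capacity inside the graphics passes, hiding some (or all) of its cost.
absorbed_ms = min(compute_work_ms, sum(graphics_passes_ms) * idle_fraction)
async_ms = sum(graphics_passes_ms) + (compute_work_ms - absorbed_ms)

print(serial_ms)  # 11.5
print(async_ms)   # 9.0
```

In this toy frame the compute work is hidden entirely; the gain evaporates if the hardware cannot actually execute the two queues concurrently, which is the crux of the Maxwell allegation.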
"Personally, I think one could just as easily make the claim that we were biased toward NVIDIA as the only "vendor" specific-code is for NVIDIA where we had to shutdown async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that. The only other thing that is different between them is that NVIDIA does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don't think it ended up being very significant. This isn't a vendor specific path, as it's responding to capabilities the driver reports," writes Oxide, in a statement disputing NVIDIA's "misinformation" about the "Ashes of Singularity" benchmark in its press communications (presumably to VGA reviewers).
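Oxide's statement above amounts to capability-driven rendering with a single vendor-specific override. A hypothetical sketch of that logic (the function and table names are invented for illustration; only the PCI vendor IDs are real):

```python
# Hypothetical sketch of the logic Oxide describes: trust the capabilities
# the driver reports, except where a vendor-specific override (added after
# a feature proved broken in practice) shuts that feature down.
# The PCI vendor IDs are real; everything else is invented for illustration.
VENDOR_NVIDIA = 0x10DE
VENDOR_AMD = 0x1002

# Features force-disabled per vendor, regardless of what the driver claims.
FORCED_OFF = {VENDOR_NVIDIA: {"async_compute"}}

def feature_enabled(feature, driver_reports_support, vendor_id):
    """Enable a feature only if the driver reports it AND no override bans it."""
    if not driver_reports_support:
        return False
    return feature not in FORCED_OFF.get(vendor_id, set())

# Maxwell's driver reports async compute, but the override wins:
print(feature_enabled("async_compute", True, VENDOR_NVIDIA))  # False
print(feature_enabled("async_compute", True, VENDOR_AMD))     # True
```

The point of the sketch is that this is still a capability-driven path; the Vendor ID check exists only to veto a capability the driver reported but could not deliver.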

Given its growing market share, NVIDIA could use similar tactics to keep game developers away from industry-standard API features that it doesn't support, and which rival AMD does. NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1; we wonder how much of that support is faked at the driver level, like async compute. The company is already drawing flak for borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware can experience for the same $59 everyone spends on a particular game. Sources: DSOGaming, WCCFTech

196 Comments on Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

#126
FordGT90Concept
"I go fast!1!11!1!"
HumanSmoke
Their market share without Fiji was in a nose dive.
It still is, because Fiji is priced as a premium product and the bulk of discrete card sales are midrange and low end. Those cards are all still rebrands.

HumanSmoke
Devinder Kumar intimated that the APU die shrink would mean AMD's net profit would rise, so it is a fair assumption that any saving in manufacturing cost aids AMD, but even with the APU die and packaging shrink, Kumar expected gross margins to break $20/unit from the $17-18 they are presently residing at. Console APUs are still a volume commodity product, and I doubt that Sony/MS would tolerate any delivery slippage due to process/package deviation unless the processes involved were rock solid - especially if the monetary savings are going into AMD's pocket rather than the risk/reward being shared.
Sony/Microsoft would save in other areas like the power transformer, cooling, and space. They can make a physically smaller console, which translates to materials savings. Everyone wins--AMD the most, because instead of just getting paid for the APU, they'd also get paid for the memory (most of which would be passed on to the memory manufacturer, but it is still something AMD can charge more for).
#127
HumanSmoke
FordGT90Concept
It still is, because Fiji is priced as a premium product and the bulk of discrete card sales are midrange and low end. Those cards are all still rebrands.
Well, for definitive proof you'd need to see the Q3 market share numbers, since Fiji barely arrived before the close of Q2.
Sales of the top-end cards aren't generally the only benefit they bring; they indirectly boost sales of lower parts due to the halo effect. Nvidia probably sells a bunch of GT (and lower-end GTX) 700/900 series cards thanks to the same halo effect from the Titan and 980 Ti - a little reflected glory, if you like.
AMD obviously didn't foresee GM200 scaling as well as it did (clocks largely unaffected by the increased die size) when it laid down Fiji's design, and had Fiji been unreservedly the "world's fastest GPU" as they'd intended, it would have boosted sales of the lower tiers. AMD's mistake was not taking into account that the opposition also has capable R&D divisions, but when AMD signed up for HBM in late 2013, it had to make a decision based on the estimates and information available.
FordGT90Concept
Sony/Microsoft would save in other areas like the power transformer, cooling, and space. They can make a physically smaller console, which translates to materials savings. Everyone wins--AMD the most, because instead of just getting paid for the APU, they'd also get paid for the memory (most of which would be passed on to the memory manufacturer, but it is still something AMD can charge more for).
Hynix's own rationale seemed to be to keep pace with Samsung ( who had actually already begun rolling out 3D NAND tech by this time). AMD's involvement surely stemmed from HSA and hUMA in general - of which, consoles leverage the same tech to be sure, but I think were only part of the whole HSA implementation strategy.
#128
EarthDog
Sorry to be completely OT here, but human, that quote from that douche canoe Charlie is PRICELESS.
#129
HumanSmoke
EarthDog
Sorry to be completely OT here, but human, that quote from that douche canoe Charlie is PRICELESS.
I'd say that you simply can't buy insight like that, but you can. For a measly $1,000-a-year subscription, the thoughts and ramblings of Chairman Charlie can be yours! Charlie predicts...he dices...he slices, juliennes, and mashes, all for the introductory low, low price!
#130
rvalencia
HumanSmoke
So, basically what I just said.

Don't kid yourself, the only reason they aren't here is because they're too busy eating their crayons.
bon appétit

I think rvalencia is attempting to bridge that divide.
1. With Fables, you asserted an unsupported claim.

2. Your personality-based attacks show you are a hypocrite, i.e. not much different from WCCFTech's comment section.


FordGT90Concept
Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
#131
xenocide
The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
#132
FordGT90Concept
"I go fast!1!11!1!"
rvalencia
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
I have no idea. All I know is RMA'ing 280(X) graphics cards is trendy right now. Specifically, Gigabyte 280X and XFX 280.
#133
HumanSmoke
rvalencia
1. With Fables, you asserted a unsupported claim .
The only assertion I made was that different games use different resources to different extents. My claim is no more unsupported than yours that they use identical resources to the same extent. I have historical precedent on my side (different game engines, different coders etc), you have hysterical supposition on yours.
rvalencia
FordGT90Concept
Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.
Do you have a view that Gigabyte and XFX Maxwellv2s are trouble free? Your assertion shows you are a hypocrite.
What has one to do with the other? Oh, that's right....nothing!
I'd suggest you calm down, you're starting to sound just like the loons at WTFtech...assuming your violent defense of them means you aren't a fully paid up Disqus member already.
xenocide
The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
Quoted for truth.
#134
rvalencia
HumanSmoke
The only assertion I made was that different games use different resources to different extents. My claim is no more unsupported than yours that they use identical resources to the same extent. I have historical precedent on my side (different game engines, different coders etc), you have hysterical supposition on yours.
Have you played the new Fables DX12?


HumanSmoke
What has one to do with the other? Oh, that's right....nothing!
I'd suggest you calm down, you're starting to sound just like the loons at WTFtech...assuming your violent defense of them means you aren't a fully paid up Disqus member already.
You started it. You calm down.


HumanSmoke
Quoted for truth.
As posted earlier in this thread, the WCCFTech post was from Oxide; read the full post at
http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995


xenocide
The comment section on WCCFTech is literally a wasteland of human intellect. I suppose it's fitting for a site that publishes every stray theory and tweet from an engineer as breaking news. They are second only to S|A on the shortlist of tech sites I cannot stand seeing cited.
As posted earlier in this thread, the WCCFTech post was from Oxide; read the full post at
http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995


FordGT90Concept
I have no idea. All I know is RMA'ing 280(X) graphics cards is trendy right now. Specifically, Gigabyte 280X and XFX 280.
That's a double-standard viewpoint. http://forums.evga.com/GTX-970-Black-Screen-Crash-during-game-SOLVED-RMA-m2248453.aspx
#136
FordGT90Concept
"I go fast!1!11!1!"
I'm sure the async compute features of GCN are intrinsically linked to Mantle. Because AMD supported Mantle and NVIDIA couldn't be bothered to even look into it, AMD has a huge advantage when it comes to DirectX 12 and Vulkan. It makes sense. The question is how long will it take for NVIDIA to catch up? Pascal? Longer?


rvalencia
That's a double-standard viewpoint. http://forums.evga.com/GTX-970-Black-Screen-Crash-during-game-SOLVED-RMA-m2248453.aspx
I made no mention of NVIDIA in the context of 280(X).
#137
HumanSmoke
rvalencia
Have you played the new Fables DX12?
No, but I can read, and more to the point, I obviously understand the content of the links you post more than you do.
rvalencia
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995
.....[and again...same link twice in the same post yet you still failed to make the connection]
As posted earlier in this thread, the WCCFTech post was from Oxide i.e. read the full post from
http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/1200#post_24356995
Allow me to point out the obvious (for most people) from the post you linked to twice...consecutively
Our use of Async Compute, however, pales with comparisons to some of the things which the console guys are starting to do. Most of those haven't made their way to the PC yet, but I've heard of developers getting 30% GPU performance by using Async Compute. Too early to tell, of course, but it could end being pretty disruptive in a year or so as these GCN built and optimized engines start coming to the PC. I don't think Unreal titles will show this very much though, so likely we'll have to wait to see. Has anyone profiled Ark yet?
You keep linking to the words of the Oxide developer, so you obviously place some store in what he's saying - why keep linking otherwise? Yet the developer doesn't see Unreal titles using the same levels of async compute - and it's almost a certainty that Nitrous and UE4 aren't identical.

Here's the kicker in case you don't understand why I'm singling out the Unreal Engine 4 part of his post....Fable Legends USES Unreal Engine 4
#138
FordGT90Concept
"I go fast!1!11!1!"
It should be noted that Oxide isn't going to know much about Unreal Engine development. Considering Unreal Engine 4 is used on PlayStation 4 and Xbox One, and both of those support async compute, I think it is quite silly to believe Epic wouldn't employ async compute where possible. The only way it wouldn't is if their engine can't be made to use it without pushing out major architectural changes. In which case, Unreal Engine 5 will be coming sooner rather than later.
#139
HumanSmoke
FordGT90Concept
It should be noted that Oxide isn't going to know much about Unreal Engine development. Considering Unreal Engine 4 is used on PlayStation 4 and Xbox One, and both of those support async compute, I think it is quite silly to believe Epic wouldn't employ async compute where possible. The only way it wouldn't is if their engine can't be made to use it without pushing out major architectural changes. In which case, Unreal Engine 5 will be coming sooner rather than later.
It's up to individual developers which version of UE4 (or any other UE engine, including the DX12-patched UE3 that Gears of War Unlimited uses) they use. As I noted earlier in post #76, UE4.9 supports DX12 and a number of DX12 features - including async compute and ROVs. The point I was attempting to make - and seemingly failed at - is that depending upon which version is used, and how much effort the developer puts into coding, the feature set used is going to differ from engine to engine, from game engine version to game engine version, and from game to game. Larger studios with larger development teams are likely to exercise a greater degree of control over the final product. Small game studios might either use fewer features to save expenditure, or rely upon third-party dev teams to incorporate elements of their gaming SDK into the final product. rvalencia is putting forward the notion that all DX12 releases are going to be performance clones of AotS - which I find laughable in the extreme given how the gaming industry actually works.

Now, given that Epic haven't exactly hidden their association with Nvidia's game development program:
"Epic developed Unreal Engine 4 on NVIDIA hardware, and it looks and runs best on GeForce." - Tim Sweeney, founder, CEO and technical director of Epic Games.
What are the chances that Unreal Engine 4 (and patched UE3 builds) operates exactly the same way as Oxide's Nitrous Engine, as rvalencia asserts? A game engine that was overhauled (if not developed) as a demonstrator for Mantle.
FordGT90Concept
It should be noted that Oxide isn't going to know much about Unreal Engine development.
Well, that would make them pretty lazy considering UE4 builds are easily sourced, and Epic runs a pretty extensive forum. I would have thought that a game developer might have more than a passing interest in one of the most widely licensed game engines considering evaluation costs nothing.
I can appreciate that you'd concentrate on your own engine, but I'd find it difficult to imagine that they wouldn't keep an eye on what the competition are doing, especially when the cost is minimal and the information is freely available.
#140
Sony Xperia S
Ikaruga
I'm always honest. Nvidia is the Apple of GPUs, they are evil, they are greedy, there is almost nothing you could like about them, but they make very good stuff, which works well and also performs well, so they win. If they would suck, nobody would buy their products for decades; the GPU market is not like the music industry, only a very small tech savvy percentage of the population buys dedicated GPUs, no Justin Biebers can keep themselves on the surface for a long time without actually delivering good stuff.
EarthDog
I really hate to feed the nonsensical but, I wonder how you define better...

It can't be in performance /watt...
It can't be in frame time in CFx v SLI...
It can't be in highest FPS/performance...

Bang for your buck? CHECK.
Utilizing technology to get (TOO FAR) ahead of the curve? CHECK.

I'm spent.
FordGT90Concept
Beware, I'm hearing about problems with R9 280(X) from all over the place. Specifically, Gigabyte and XFX come up.
That's called brainwashing. I have never seen any technological competitive advantages in Apple's products compared to the competition. Actually, the opposite - they break like shit.

Anyways, you guys are so mean. I can't comprehend how it's even possible that such people exist.

EarthDog
Bang for your buck? CHECK.
Utilizing technology to get (TOO FAR) ahead of the curve? CHECK.
Yes, and Image quality CHECK. ;)
#141
xenocide
HumanSmoke
A game engine that was overhauled (if not developed) as a demonstrator for Mantle.
It was developed with Mantle in mind, and it was pretty apparent when AMD dragged a rep from Oxide around like a puppy dog to every tech convention to talk about how superior it was to DirectX and OpenGL at the time.

Sony Xperia S
I have never seen any technological competitive advantages in Apple's products compared to the competition.
Ease of Use (Ensured Compatibility, User-Friendly Software, Simple Controls\Navigation), Aesthetics (form over function), Reliability, Build Quality (Keyboards, Trackpads, Durable Materials), Ease of Development\Standardized Hardware--and those are just broad terms. If you want to get down to it Apple's iPhones and iOS by extension continue to be superior to their competitors, and offer features that are more well-rounded than their competitors. Companies like Samsung and HTC have been trying to chip away at Apple by offering handsets that are cheaper or have a single feature better than Apple's equivalent in the iPhone, but they almost never have offered a more well-rounded product. Apple's Cinema Displays are some of the best IPS's you can buy, and they have given up on lower resolutions even in their low-end laptops. They do a lot of things better than their competitors, that's why people keep buying them.
#142
the54thvoid
I actually just read the ExtremeTech review (from 17th Aug) - the one that pits the 980 Ti against the Fury X. What's the fuss about?

http://www.extremetech.com/gaming/212314-directx-12-arrives-at-last-with-ashes-of-the-singularity-amd-and-nvidia-go-head-to-head/2

Effectively, the Fury X (a £550 card) pretty much runs the same as (or 5-10% better than, workload dependent; ironically, heavy workload = 980 Ti better) a 980 Ti (a £510 card). *stock design prices

This is using an engine Nvidia say has an MSAA bug, yet here are the 4xMSAA benches at 1080p and at 4K: [benchmark charts not reproduced]
So is this whole shit fest about top end Fiji and top end Maxwell being.... EQUAL, OMG, stop the freaking bus.... (I should've read these things earlier).

This is actually hilarious. All the AMD muppets saying all the silly things about Nvidia and all the Nvidia muppets saying the same about AMD when the reality of DX12 is.......

They're the same.

wow.

Why aren't we all hugging and saying how great this is? Why are we fighting over parity?

Oh, one caveat from extremetech themselves:
but Ashes of the Singularity and possibly Fable Legends are the only near-term DX12 launches, and neither is in finished form just yet. DX11 and even DX9 are going to remain important for years to come, and AMD needs to balance its admittedly limited pool of resources between encouraging DX12 adoption and ensuring that gamers who don’t have Windows 10 don’t end up left in the cold.
That bit in bold is very important.... DX12 levels the field, even 55-45 in AMD's favour but DX11 is AMD's Achilles heel, worse in DX9.

lol.
#143
Sony Xperia S
xenocide
Ease of Use (Ensured Compatibility, User-Friendly Software, Simple Controls\Navigation), Aesthetics (form over function), Reliability, Build Quality (Keyboards, Trackpads, Durable Materials), Ease of Development\Standardized Hardware--and those are just broad terms. If you want to get down to it Apple's iPhones and iOS by extension continue to be superior to their competitors, and offer features that are more well-rounded than their competitors. Companies like Samsung and HTC have been trying to chip away at Apple by offering handsets that are cheaper or have a single feature better than Apple's equivalent in the iPhone, but they almost never have offered a more well-rounded product. Apple's Cinema Displays are some of the best IPS's you can buy, and they have given up on lower resolutions even in their low-end laptops. They do a lot of things better than their competitors, that's why people keep buying them.
About resolutions I agree. But their disadvantage is the extremely high price tag. So anyways - they don't qualify in terms of cost of investment.

All those other "broad terms" you list are subjective. About materials specifically, I told you that you can NOT rely on an iPhone, because dropping it on the ground once or twice will be enough to break the screen.

* There is an anecdote in my country. That people buy super expensive iphones for 500-600-700 euros and then they don't have 2-3 euros to sit in the cafe. :laugh:
#144
Frick
Fishfaced Nincompoop
I think DX12 should have a faster adoption rate than earlier versions. For one thing, it's pushed by consoles, isn't it? And secondly, if they can get higher performance out of it - I mean significant performance - would that make it more interesting to use as well? And Win10 is free for many people, so currently it's not associated with money, as it was with, say, DX10. People have probably made the same argument before.
#145
Aquinus
Resident Wat-man
Sony Xperia S
About resolutions I agree. But their disadvantage is the extremely high price tag. So anyways - they don't qualify in terms of cost of investment.
That's so dumb. It's like saying that anything doesn't qualify unless it's cheap. You know the old adage, "You get what you pay for." That's true of Apple to a point. Their chassis are solid, they've integrated everything to a tiny motherboard so most of Apple's laptops are battery, not circuitry. Having locked down hardware enables devs to have an expectation with respect to what kind of hardware is under the hood. Simple fact is that there are a lot of reasons why Apple is successful right now. Ignoring that is just being blind.
Sony Xperia S
All other so called by you "broad terms" are so subjective. About materials specifically I told you that you can NOT rely on iphone because one or two times on the ground will be enough to break the screen.
Drop any phone flat on the screen hitting something and I bet you the screen will crack. With that said, my iPhone 4S has been dropped a lot without a case and it is still perfectly fine...

Lastly, this is a thread about AMD. Why the hell are you talking about Apple? Stick to the topic and stop being a smart ass. AMD offers price and there have been arguments in the past about IQ settings. Simple fact is that nVidia cards can look just as good, they're just tuned for performance out of the box. Nothing more, nothing less.
#146
Sony Xperia S
Aquinus
That's so dumb. It's like saying that anything doesn't qualify unless it's cheap. You know the old adage, "You get what you pay for." That's true of Apple to a point. Their chassis are solid, they've integrated everything to a tiny motherboard so most of Apple's laptops are battery, not circuitry. Having locked down hardware enables devs to have an expectation with respect to what kind of hardware is under the hood. Simple fact is that there are a lot of reasons why Apple is successful right now. Ignoring that is just being blind.

Drop any phone flat on the screen hitting something and I bet you the screen will crack. With that said, my iPhone 4S has been dropped a lot without a case and it is still perfectly fine...

Lastly, this is a thread about AMD. Why the hell are you talking about Apple? Stick to the topic and stop being a smart ass. AMD offers price and there have been arguments in the past about IQ settings. Simple fact is that nVidia cards can look just as good, they're just tuned for performance out of the box. Nothing more, nothing less.
This is a thread about nvidia and AMD, and somebody brought the comparison that apple is nvidia.

There is no need to have something hit on the ground - just a flat surface like asphalt will be enough. And that's not true - there are videos which you can watch that many other brands offer phones which don't break when hitting the ground.

Oh, and apple is successful because it's an american company and those guys in usa just support it on nationalistic basis.
#147
HumanSmoke
the54thvoid
I actually just read the extremetech review (from 17th Aug) ...
You may want to give this review a look-see. Very comprehensive.
the54thvoid
Oh, one caveat from extremetech themselves:
That bit in bold is very important.... DX12 levels the field, even 55-45 in AMD's favour but DX11 is AMD's Achilles heel, worse in DX9.
The review I just linked to has a similar outlook, and one that I alluded to regarding finances/interest in game devs coding for DX12
Ultimately, no matter what AMD, Microsoft, or Nvidia might say, there’s another important fact to consider. DX11 (and DX10/DX9) are not going away; the big developers have the resources to do low-level programming with DX12 to improve performance. Independent developers and smaller outfits are not going to be as enamored with putting in more work on the engine if it just takes time away from making a great game. And at the end of the day, that’s what really matters. Games like StarCraft II, Fallout 3, and the Mass Effect series have all received rave reviews, with nary a DX11 piece of code in sight. And until DX11 is well and truly put to rest (maybe around the time Dream Machine 2020 rolls out?), things like drivers and CPU performance are still going to be important.
the54thvoid
This is actually hilarious. All the AMD muppets saying all the silly things about Nvidia and all the Nvidia muppets saying the same about AMD when the reality of DX12 is.......They're the same.
Pretty much. Like anything else graphics game engine related, it all comes down to the application and the settings used. For some people, if one metric doesn't work, try another...and another...and another...and when you find that oh so important (maybe barely) discernible difference, unleash the bile that most sane people might only exhibit on finding out their neighbour is wanted by the International Criminal Court for crimes against humanity.
Frick
I think DX12 should have a faster adoption rate than earlier versions. For one thing, it's pushed by consoles, isn't it? And secondly, if they can get higher performance out of it - I mean significant performance - would that make it more interesting to use as well? And Win10 is free for many people, so currently it's not associated with money, as it was with, say, DX10. People have probably made the same argument before.
DX12 might be available to the OS, but game developers still have to code (and optimize that code) for their titles. That isn't necessarily a given, as game devs have pointed out themselves. DX12, as has been quoted many times, puts more control into the hands of the developer. The developer still needs to get to grips with that control. I can see many smaller studios not wanting to expend the effort, and many more may prefer to use a simplified engine of a known quantity (DX9 or 11) for a time-to-market situation. Consoles taking up DX12 is all well and good, but porting a console game to PC isn't a trivial matter.
#148
Aquinus
Resident Wat-man
Sony Xperia S
This is a thread about nvidia and AMD, and somebody brought the comparison that apple is nvidia.
Actually, you removed something that you said that started that argument. See this post. Don't try to change history then lie about it.
Sony Xperia S
Oh, and apple is successful because it's an american company and those guys in usa just support it on nationalistic basis.
I don't support Apple, I just said that they have a quality product that you pay for. Also, making claims about the American people in general is a really bad idea. I have an iPhone because work pays for the monthly bill, and I have a MacBook Pro because work gave me one.

This is the nice way of me telling you to shut up and stop posting bullshit but, it appears that I needed to spell that out for you.
#149
Sony Xperia S
Aquinus
Actually, you removed something that you said that started that argument. See this post. Don't try to change history then lie about it.
What are you speaking about?? What did I remove and what am I trying to change? :(

This is the line which introduced apple:

Ikaruga
I'm always honest. Nvidia is the Apple of GPUs, they are evil, they are greedy, there is almost nothing you could like about them
#150
Aquinus
Resident Wat-man
Sony Xperia S
What are you speaking about ?? What did I remove and what am I trying to change ? :(

This is the line which introduced apple:
He said nVidia is the Apple of GPUs; he wasn't talking about Apple itself. He went on to say (talking about nVidia):
Ikaruga
they are evil, they are greedy, there is almost nothing you could like about them, but they make very good stuff, which works well and also performs well, so they win. If they would suck, nobody would buy their products for decades; the GPU market is not like the music industry, only a very small tech savvy percentage of the population buys dedicated GPUs
You're only digging yourself a deeper hole...