
DirectX 11 Won't Define GPU Sales: NVIDIA

Someone prepare the second violin for Nvidia; the wind's blowing towards ATI for now, even if only for a few months (or more? NV seems to be having birth issues with the GT300). This will go down in GPU history as an ATI win; hell, Wikipedia even mentions the few days the K6-III was king back in 1999.
There's no stopping the "horsepower", duh. To get more and better features you have to pack some...
So NV, stop barking at the moon and get your act together; we need you so prices return to normal levels.
 
You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
motivator6650984.jpg
 
I don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the day whenever a new generation of cards based on a new DirectX version or technology (hardware T&L) was released; the Radeon 9700, the GeForce 256, and the 8800 come to mind.
Although, before the 23rd we can't know for sure; maybe NV has the GT300 already waiting to punch out of the darkness on that date (mhmmm...).
 
Way to go, Nvidia.

It's clear now: they won't have the GT300 ready when ATI launches Evergreen.
 
I don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the day whenever a new generation of cards based on a new DirectX version or technology (hardware T&L) was released; the Radeon 9700, the GeForce 256, and the 8800 come to mind.
Although, before the 23rd we can't know for sure; maybe NV has the GT300 already waiting to punch out of the darkness on that date (mhmmm...).

I don't know what you're talking about -- the GeForce 5900 Ultra was a worthy competitor.

Now that was the kind of card that could keep you warm on cold nights.
 
Radeon 9700
Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features.

Besides advanced architecture, reviewers also took note of ATI's change in strategy. The 9700 would be the second of ATI's chips (after the 8500) to be shipped to third-party manufacturers instead of ATI producing all of its graphics cards, though ATI would still produce cards off of its highest-end chips. This freed up engineering resources that were channeled towards driver improvements, and the 9700 performed phenomenally well at launch because of this. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.[3]

The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. Furthermore, NVIDIA's response in the form of the GeForce FX 5800 was both late to market and somewhat unimpressive, especially when pixel shading was used. R300 would become one of the GPUs with the longest useful lifetimes in history, allowing playable performance in new games at least 3 years after its launch.[4]

The GeForce 256 and the Nvidia 8800 series were also uncontested winners in their time; no other player on the market had equivalent working technology.
 
You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
http://img211.imageshack.us/img211/9926/motivator6650984.jpg

We have no clue what Nvidia has in store because we don't SEE ANYTHING. Not a peep from Nvidia except a blatant attempt to write off DX11.

I am a big Nvidia fan (check specs)... but reality is reality. This is, for all intents and purposes, a HUGE ATI win. Nvidia has dominated for so long, and now they will lose the crown; they were SO far ahead... and now they are back where they were during the G7x series in relation to ATI. That is a win for ATI no matter how you spin it.

ATI has a DX11 part that will take the crown... and Nvidia is saying that DX11 won't matter?!? :roll::roll::roll:

These are the same muppets who told us all that PhysX matters. :nutkick: LOL. They haven't learned their lesson from DX10; they are just trying to convince their investors not to jump ship because they don't have a competing part. This is a business move, plain and simple... just trying to minimize the pain until they can compete.
 
Wow, talk about trying to pull the wool over our eyes. Let's go back to the G80 release, shall we? Because ATI was late to the party with a DX10 part (i.e. the HD 2900), Nvidia reaped the benefit of having the only DX10 card in town, and their market share did increase during ATI's absence, in spite of there being little to no DX10 games out. Not only did consumers prefer the G80 at its higher price, but it also caused ATI to lose a big piece of the discrete GPU market (as well as mobile, etc).

I believe their market share loss was initiated by the G80 release with no answer from ATI, and compounded by the HD 2900 release, more or less. Today, AMD is still trying to recover from "that". Now all of a sudden we are to forget what happened and say that DX11 is nice but not all that important. :shadedshu Yes, we know that market conditions then and now are completely different. However, if AMD is able to adapt and compensate for that, I see no reason why they wouldn't do well.
 
Remember that lots of stupid people buy graphics cards. If Nvidia says they have DX11-equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL greatly define graphics card sales.
 
Remember that lots of stupid people buy graphics cards. If Nvidia says they have DX11-equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL greatly define graphics card sales.

+1... exactly... they're just trying to pull a Baghdad Bob on their investors. "No, no... we ARE winning the war... ATI is cowering in fear, and our customers don't care about new tech at all... it's just not important." :roll:
 
Remember that lots of stupid people buy graphics cards. If Nvidia says they have DX11-equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL greatly define graphics card sales.

People feign ignorance all the time when it comes to products they prefer. It doesn't mean the masses simply don't know any better. I believe it's the positive word of mouth from friends, etc. that creates brand recognition, more so than just "not knowing better". Again, I'm talking about the masses, not individual cases.
 
+1... exactly... they're just trying to pull a Baghdad Bob on their investors. "No, no... we ARE winning the war... ATI is running in fear" :roll:

Exactly! Because we know ATI will have the first DX11 card out soon (the HD 5xxx series), so ATI will win! That means Nvidia will be in dire trouble.
 
in the meantime, you get all the other benefits anyway.

Oh, but if you wait 6 months before buying, you might as well wait another 6 months and get the 6 series from ATI, or another 6 after that for Nvidia's next offering... or another 6 after that...


It's an endless loop. If a card has a good price-to-performance ratio at any given time and people can afford it when they want to upgrade, they will buy it.

When these cards launch, Nvidia will have nothing comparable; ATI will be ahead in performance, features, and DX compatibility. You'd be stupid to buy a DX10.0 card over an 11.0 card.

Yes, my last post was a little narrow-minded.

By mid/end of next year DX11 will, COUGH, should be more worth it, as more games will be out for it. The promised boost makes it more tempting, but to tell you the truth, I've already played all the games I want to play, and the ones giving me issues are more CPU-bound than GPU-bound.

Stupid to buy a DX10 card now? That depends on what card they have now. But DX11 cards are going to be like $250-$300+, so they could get a DX10 card for around $150, and by mid/end of next year, if DX11 is more established, the price of getting one will be cheaper, and there will actually be a reason to get one.

I'm a gamer, so that's my view on it. I do very few benchmarks; that's not what I get faster hardware for.

Sure, if you have a lower-end card it's going to be more worth it, but if you already have a card like the 285 or the 4890, there's no need if you're a gamer.
 
Yes, my last post was a little narrow-minded.

By mid/end of next year DX11 will, COUGH, should be more worth it, as more games will be out for it. The promised boost makes it more tempting, but to tell you the truth, I've already played all the games I want to play, and the ones giving me issues are more CPU-bound than GPU-bound.

Stupid to buy a DX10 card now? That depends on what card they have now. But DX11 cards are going to be like $250-$300+, so they could get a DX10 card for around $150, and by mid/end of next year, if DX11 is more established, the price of getting one will be cheaper, and there will actually be a reason to get one.

I'm a gamer, so that's my view on it. I do very few benchmarks; that's not what I get faster hardware for.

Sure, if you have a lower-end card it's going to be more worth it, but if you already have a card like the 285 or the 4890, there's no need if you're a gamer.

If DX11 turns out like DX10, we won't need it! Every DX10 game had the ability to run in DX9 mode. Were DX10 supporting cards necessary? No.
 
If DX11 turns out like DX10, we won't need it! Every DX10 game had the ability to run in DX9 mode. Were DX10 supporting cards necessary? No.


Yes they were, to get extra-high quality :rolleyes:
 
Nvidia needs to cut the crap and make a DX11 card. If they don't, then they can STFU!
 
I kind of agree with nVidia. I don't believe that DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelves. What will make these cards sell is that they will be top dog for a good few months, and being DX11 pretty much equates to performing well in DX10 et al. (hell, there are still "will this run Crysis" threads).

What nVidia is saying to all the people who think DX11 will make a huge difference is: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards that still perform pretty damn well.
 
I kind of agree with nVidia. I don't believe that DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelves. What will make these cards sell is that they will be top dog for a good few months, and being DX11 pretty much equates to performing well in DX10 et al. (hell, there are still "will this run Crysis" threads).

What nVidia is saying to all the people who think DX11 will make a huge difference is: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards that still perform pretty damn well.

it's AMD:
AMDmarkham4.jpg
 
I've enjoyed reading this thread. You guys are making a lot of valid points.
 
You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
http://img211.imageshack.us/img211/9926/motivator6650984.jpg

That's not nerd rage, this is:

punch-small.jpg


He wasn't happy when he found out that his GTX 380M was based on the 40 nm G92c. That aside, let's get back on track.
 
ATI: We have DX11 WHQL driver.
nVidia: DX11? Phew, let's concentrate on what's important: the PowerPoint slides.
 
Does this driver have OpenCL?
 
"...framerate and resolution are nice, but today they are very high and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making the games more immersive, more playable, are beyond framerates and resolutions. Nvidia will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."

Um, no, going from 120 to 125 isn't worth anything, correct, but stopping performance from dropping from 60 to 30 IS worth something.

The compute side is all well and good, because without it 'special' visuals won't work efficiently, but to say that pure computing is what's necessary is a bit premature.

Hopefully he's hinting at what we want to see in the near future, which is real-time vector drawing rather than pre-rendered visuals. But that would require cards with massive computing flexibility, like the FireGL types used with AutoCAD programs.

But still, stop making cards that give you 125 fps over 120 fps, and start making ones that don't cower in fear at a few dynamic shadows in a 3D program. Then worry about 'compute' cards.
 
ATI was hardware accelerating before Nvidia.
 