Thursday, September 17th 2009

DirectX 11 Won't Define GPU Sales: NVIDIA

"DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons." This comes from the same company that, a few years ago, insisted there was every reason to opt for a DirectX 10 compliant graphics card to complete the Windows Vista experience, at a time when it was the first and only company out with compliant hardware. In the wake of rival AMD's ambitious Evergreen family of DirectX 11 compliant graphics cards being released, NVIDIA made it a point to tell the press that the development shouldn't really change anything in the industry.

Speaking at the Deutsche Bank Securities Technology Conference, NVIDIA's VP of investor relations said "DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in a GPU."

"Now, we know, people are doing a lot in the area of video, people are going to do more and more in the area of photography… I think that the things we are doing will allow the GPU to be a co-processor to the CPU and deliver a better user experience, better battery life, and make computers a little bit more optimized," added Mr. Hara.

NVIDIA, which until very recently held up raw graphics processing horsepower as the biggest selling point of new GPUs, has now switched its line on what it believes will drive the market forward. All of a sudden, software that relies on the raw computational power of GPUs (e.g. media transcoding software), and not the advanced visual effects a new-generation API brings with it (in games and CGI applications), is what will drive people to buy graphics processors, according to the company.

Mr. Hara concluded, saying "The graphics industry, I think, is at the point the microprocessor industry was at several years ago, when AMD made the public confession that frequency does not matter anymore and it is more about performance per watt. I think we are at the same crossroads with the graphics world: framerate and resolution are nice, but today they are very high, and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things that we are doing with Stereo 3D Vision and PhysX, about making games more immersive, more playable, go beyond framerates and resolutions. NVIDIA will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."

The timing of this comes when NVIDIA has no concrete product plans laid out, while AMD is working towards a head start with its next-generation GPUs, which are DirectX 11 compliant and also comply with industry-wide GPGPU standards such as DirectCompute 11 and OpenCL.
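As an aside for readers unfamiliar with the GPGPU standards mentioned above: what DirectCompute and OpenCL standardize is a data-parallel programming model, in which one small "kernel" function is applied independently across many data elements at once. The sketch below is purely illustrative, CPU-only Python (no actual GPU, vendor API, or real kernel syntax involved; all names are hypothetical), just to show the shape of that model:

```python
# Illustrative sketch of the data-parallel "kernel" model that GPGPU
# APIs such as OpenCL and DirectCompute expose. The kernel runs once
# per element; on a real GPU, thousands of these invocations would
# execute concurrently. Names here are invented for illustration.

def saturate_kernel(pixel, gain):
    """Per-element work item: scale a pixel value and clamp it to 8 bits."""
    return min(int(pixel * gain), 255)

def dispatch(kernel, data, *args):
    """Stand-in for a GPU dispatch: apply the kernel to every element.
    A real driver would split this work across the GPU's compute units."""
    return [kernel(x, *args) for x in data]

frame = [10, 100, 200, 250]                   # a tiny mock "frame" of pixel values
print(dispatch(saturate_kernel, frame, 1.5))  # -> [15, 150, 255, 255]
```

On real hardware, the dispatch step is what the driver fans out across hundreds of shader cores, which is why embarrassingly parallel tasks like video transcoding map onto GPUs so well.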
Source: Xbit Labs

194 Comments on DirectX 11 Won't Define GPU Sales: NVIDIA

#51
TheMailMan78
Big Member
You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
Posted on Reply
#52
toyo
I don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the time, whenever a new generation of cards based on a new DirectX version or a new technology (hardware T&L) was released (the Radeon 9700, NVIDIA GeForce 256, and 8800, among others).
Although, before the 23rd we cannot know for sure; maybe NV has the GT300 already waiting to punch from the darkness on that date (mhmmm....)
Posted on Reply
#53
Unregistered
way to go nvidia,

it's clear now, they won't have the GT300 ready when ATI launches Evergreen
Posted on Reply
#54
mdm-adph
toyoI don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the time, whenever a new generation of cards based on a new DirectX version or a new technology (hardware T&L) was released (the Radeon 9700, NVIDIA GeForce 256, and 8800, among others).
Although, before the 23rd we cannot know for sure; maybe NV has the GT300 already waiting to punch from the darkness on that date (mhmmm....)
I don't know what you're talking about -- the GeForce 5900 Ultra was a worthy competitor.

Now that was the kind of card that could keep you warm on cold nights.
Posted on Reply
#55
toyo
Radeon 9700
Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features.

Besides advanced architecture, reviewers also took note of ATI's change in strategy. The 9700 would be the second of ATI's chips (after the 8500) to be shipped to third-party manufacturers instead of ATI producing all of its graphics cards, though ATI would still produce cards off of its highest-end chips. This freed up engineering resources that were channeled towards driver improvements, and the 9700 performed phenomenally well at launch because of this. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.[3]

The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. Furthermore, NVIDIA's response in the form of the GeForce FX 5800 was both late to market and somewhat unimpressive, especially when pixel shading was used. R300 would become one of the GPUs with the longest useful lifetime in history, allowing playable performance in new games at least 3 years after its launch.[4]

The GeForce 256 and the Nvidia 8800 series were also uncontested winners in their time; no other player on the market had equivalent functional technologies.
Posted on Reply
#56
phanbuey
TheMailMan78You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
img211.imageshack.us/img211/9926/motivator6650984.jpg
We have no clue what Nvidia has in store because we don't SEE ANYTHING. Not a peep from Nvidia except a blatant attempt to write off DX11.

I am a big Nvidia fan (check specs)... but reality is reality. This is, for all intents and purposes, a HUGE ATI win. Nvidia has dominated for so long, and now they will lose the crown; they were SO far ahead... and now they are back where they were during the G7x series in relation to ATI. That is a win for ATI no matter how you spin it.

ATI has a DX11 part that will take the crown... and Nvidia is saying that DX11 won't matter?!? :roll::roll::roll:

These are the same muppets that told us all that PhysX matters. :nutkick: LOL. They haven't learned their lesson from DX10; they are just trying to convince their investors not to jump ship because they don't have a competing part. This is a business move, plain and simple... just trying to minimize the pain until they can compete.
Posted on Reply
#57
EastCoasthandle
Wow, talk about trying to pull the wool over our eyes. Let's go back to the G80 release, shall we? Because ATI was late to the party with its DX10 part (i.e. the HD 2900), Nvidia reaped the benefit of being the only DX10 card in town, and its market share increased during ATI's absence, in spite of there being little to no DX10 games out. Not only did consumers prefer the G80 at its higher price, but ATI lost a big piece of the discrete GPU market (as well as mobile, etc.).

I believe that their market share loss was initiated by the G80 releasing with no answer from ATI, compounded by the HD 2900 release, more or less. Today, AMD is still trying to recover from that. Now all of a sudden we are to forget what happened and say that DX11 is nice but not all that important. :shadedshu Yes, we know that market conditions then and now are completely different. However, if AMD is able to adapt and compensate for that, I see no reason why they wouldn't do well.
Posted on Reply
#58
PVTCaboose1337
Graphical Hacker
Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.
Posted on Reply
#59
phanbuey
PVTCaboose1337Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.
+1... exactly... they're just trying to pull a Baghdad Bob on their investors. "No no... we ARE winning the war... ATI is cowering in fear, and our customers don't care about new tech at all... it's just not important." :roll:
Posted on Reply
#60
EastCoasthandle
PVTCaboose1337Remember that lots of stupid people buy graphics cards. If Nvidia says that they have DX11 equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL define graphics card sales greatly.
People feign ignorance all the time when it comes to products they prefer. It doesn't mean the masses simply don't know any better. I believe it's the positive word of mouth from their friends, etc., that creates brand recognition, more so than just "not knowing better". Again, I'm talking about the masses, not individual cases.
Posted on Reply
#61
PVTCaboose1337
Graphical Hacker
phanbuey+1... exactly... they're just trying to pull a Baghdad Bob on their investors. "No no... we ARE winning the war... ATI is running in fear" :roll:
Exactly! Cause we know that ATI will have the first DX11 card out soon (the HD 5xxx series) so ATI will win! That means Nvidia will be in dire trouble.
Posted on Reply
#62
AsRock
TPU addict
Musselsin the meantime, you get all the other benefits anyway.

Oh but if you wait 6 months before buying, you might as well wait another 6 months and get the 6 series from ATI. or another 6 after that for nvidias next offering... or another 6 after that...


its an endless loop. if it has good performance to price ratio at any given time and people can afford it when they want to update, they will buy it.

when these cards launch, Nvidia will have nothing comparable - ATI will be ahead in performance, features, and DX compatibility - you'd be stupid to buy a DX10.0 card over an 11.0 card
Yes, my last post was a little narrow minded.

By mid/end of next year DX11 will, COUGH, should be more worth it, as more games will be out for it. The promised boost makes it more tempting, but to tell you the truth I've played all the games I want to play already, and what is giving me issues is more CPU bound than GPU.

Stupid to buy a DX10 card now? That depends on what card they have now. DX11 cards are going to be like $250-$300+, so they could get a DX10 card for around $150 now, and by mid/end of next year, if DX11 is more accepted, the price of getting one will be cheaper, and there will be a reason to get one.

I'm a gamer, so that's my view on it. I do very few benchmarks, as that's not what I get faster hardware for.

Sure, if you have a lower end card it's going to be more worth it, but if you already have a card like the 285 or the 4890 there is no need if you're a gamer.
Posted on Reply
#63
PVTCaboose1337
Graphical Hacker
AsRockYes, my last post was a little narrow minded.

By mid/end of next year DX11 will, COUGH, should be more worth it, as more games will be out for it. The promised boost makes it more tempting, but to tell you the truth I've played all the games I want to play already, and what is giving me issues is more CPU bound than GPU.

Stupid to buy a DX10 card now? That depends on what card they have now. DX11 cards are going to be like $250-$300+, so they could get a DX10 card for around $150 now, and by mid/end of next year, if DX11 is more accepted, the price of getting one will be cheaper, and there will be a reason to get one.

I'm a gamer, so that's my view on it. I do very few benchmarks, as that's not what I get faster hardware for.

Sure, if you have a lower end card it's going to be more worth it, but if you already have a card like the 285 or the 4890 there is no need if you're a gamer.
If DX11 turns out like DX10, we won't need it! Every DX10 game had the ability to run in DX9 mode. Were DX10 supporting cards necessary? No.
Posted on Reply
#64
[I.R.A]_FBi
PVTCaboose1337If DX11 turns out like DX10, we won't need it! Every DX10 game had the ability to run in DX9 mode. Were DX10 supporting cards necessary? No.
Yes they were to get extra high quality :rolleyes:
Posted on Reply
#65
mtosev
nvidia needs to cut the crap and make a DX11 card. if they don't then they can STFU!
Posted on Reply
#66
WhiteLotus
I kind of agree with nVidia - I don't believe that DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelf. What will cause these cards to sell is that they will be top dog for a good few months. And them being DX11 pretty much equates to performing well in DX10 et al (hell, there are still "will this run Crysis" threads).

What nVidia is doing is saying to all those people who think DX11 will make a huge difference: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards, which still perform pretty damn well.
Posted on Reply
#67
mtosev
WhiteLotusI kind of agree with nVidia - I don't believe that DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelf. What will cause these cards to sell is that they will be top dog for a good few months. And them being DX11 pretty much equates to performing well in DX10 et al (hell, there are still "will this run Crysis" threads).

What nVidia is doing is saying to all those people who think DX11 will make a huge difference: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards, which still perform pretty damn well.
it's AMD.
Posted on Reply
#68
HossHuge
I've enjoyed reading this thread. You guys are making a lot of valid points.
Posted on Reply
#69
btarunr
Editor & Senior Moderator
TheMailMan78You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out and we can talk. Until then all I see is mob rule and bandwagon jumpers.

I'll forum fight all you bastards!
img211.imageshack.us/img211/9926/motivator6650984.jpg
That's not nerd rage, this is:

[image removed]

He wasn't happy when he found out that his GTX 380M was based on 40 nm G92c. That aside, let's get back on track.
Posted on Reply
#70
extrasalty
ATI: We have DX11 WHQL driver.
nVidia: DX11? Phew, let's concentrate on what's important: the PowerPoint slides.
Posted on Reply
#72
newconroer
"...framerate and resolution are nice, but today they are very high and going from 120fps to 125fps is not going to fundamentally change end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making the games more immersive, more playable is beyond framerates and resolutions. Nvidia will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."

Um, no, going from 120 to 125 isn't worth anything, correct, but keeping performance from dropping from 60 to 30 IS worth something.

The compute side is all well and good, because without it 'special' visuals won't work efficiently, but to say that pure compute is what's needed is a bit premature.

Hopefully he's hinting at what we want to see in the near future, which is real-time vector drawing rather than pre-rendered visuals. But that would require cards with massive computing flexibility, like the FireGL types used in AutoCAD programs.

But still, stop making cards that give you 125fps over 120fps, and start making ones that don't cower in fear at a few dynamic shadows in a 3D program. Then worry about 'compute' cards.
Posted on Reply
#73
Steevo
ATI was hardware accelerating before Nvidia.
Posted on Reply
#75
Benetanegia
I think most angry fans are completely missing the point the Nvidia rep was making. He is not saying DX11 won't matter; he is not saying it is worthless. All he is saying is that it won't drive sales as much as other factors: performance and, YES, GPGPU capabilities. The number of impressed non-gamers is increasing in forums like CGSociety, and even among the YouTube junkies who upload lots of videos every day. All these people, who couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast by just adding a small GPU instead of the Intel IGP most of them have.

Don't be naive and pretend that the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like. Nvidia is talking about that. The capable software is here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.

At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards because of DX11, because we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for something else, but not DX11.
Posted on Reply