Thursday, September 17th 2009

DirectX 11 Won't Define GPU Sales: NVIDIA

"DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons." This comes from the same company that, a few years ago, insisted there was every reason to opt for a DirectX 10 compliant graphics card to complete the Windows Vista experience, back when it was the first and only company shipping compliant hardware. In the wake of rival AMD releasing its ambitious Evergreen family of DirectX 11 compliant graphics cards, NVIDIA made a point of telling the press that the development shouldn't really change anything in the industry.

Speaking at the Deutsche Bank Securities Technology Conference, NVIDIA's VP of investor relations said: "DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that is no longer the only reason, we believe, consumers would want to invest in a GPU."

"Now, we know people are doing a lot in the area of video, and people are going to do more and more in the area of photography… I think that the things we are doing will allow the GPU to be a co-processor to the CPU and deliver a better user experience, better battery life, and make computers a little bit more optimized," added Mr. Hara.

NVIDIA, which until very recently was a firm believer in raw graphics horsepower as the biggest selling point of new GPUs, has now switched its line on what it believes will drive the market forward. All of a sudden, software that relies on the raw computational power of GPUs (e.g., media transcoding software), and not the advanced visual effects a new-generation API brings with it (in games and CGI applications), is what will drive people to buy graphics processors, according to the company.

Mr. Hara concluded, saying: "The graphics industry, I think, is at the point the microprocessor industry was several years ago, when AMD made the public confession that frequency does not matter anymore and that it is more about performance per watt. I think we are at the same crossroads with the graphics world: framerate and resolution are nice, but today they are very high, and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things we are doing with Stereo 3D Vision and PhysX, about making games more immersive, more playable, go beyond framerates and resolutions. NVIDIA will show with its next-generation GPUs that the compute side is now becoming more important than the graphics side."

The timing is notable: NVIDIA has no concrete product plans laid out, while AMD is working towards a headstart with its next-generation GPUs, which are DirectX 11 compliant and also comply with industry-wide GPGPU standards such as DirectCompute 11 and OpenCL.
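For readers unfamiliar with the GPGPU standards named above: DirectCompute and OpenCL both expose the same basic data-parallel model, in which one small "kernel" routine runs independently over every element of a dataset, mapping naturally onto thousands of GPU threads. Below is a minimal pure-Python sketch of that idea; it is illustrative only (real kernels are written in C-like shading languages and dispatched through the driver, not looped over on the CPU).

```python
# Conceptual sketch of the data-parallel model shared by GPGPU APIs such as
# DirectCompute and OpenCL: a small "kernel" function is applied independently
# to every element of the input, so on real hardware each element can run on
# its own GPU thread. Pure Python, for illustration only.

def brighten_kernel(pixel):
    """Per-element kernel: brighten an 8-bit value by 20%, clamped to 255."""
    return min(pixel * 120 // 100, 255)

def dispatch(kernel, data):
    """Stand-in for a GPU dispatch: apply the kernel to every element.
    A driver would schedule these element-wise invocations in parallel."""
    return [kernel(x) for x in data]

frame = [10, 100, 200, 250]              # a toy "scanline" of pixel values
print(dispatch(brighten_kernel, frame))  # [12, 120, 240, 255]
```

Because each element is processed independently, the same kernel scales from an IGP to a high-end card without change, which is exactly what makes these APIs attractive for workloads like video encoding.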
Source: Xbit Labs

194 Comments on DirectX 11 Won't Define GPU Sales: NVIDIA

#76
TheMailMan78
Big Member
Benetanegia: I think that most angry fans are completely missing the point the NVIDIA rep was making. He is not saying DX11 won't matter, and he is not saying it is worthless. All he is saying is that it won't drive sales as much as other factors: performance and, yes, GPGPU capabilities. The number of impressed non-gamers is growing in forums like CGSociety, and even among the YouTube junkies who upload lots of videos every day. All these people, who couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast just by adding a small GPU instead of the Intel IGP most of them have.

Don't be naive and pretend that the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like. NVIDIA is talking about that. The capable software is here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.

At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards because of DX11, because we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for something else, not DX11.
A dedicated GPU card speeds up Photoshop? That's news to me. :wtf:
Posted on Reply
#77
toyo
Yeah, CS4 has a few accelerated apps. Nvidia even put out a dedicated Quadro CX card for Adobe CS4...
Posted on Reply
#78
Steevo
Yes it does, and ATI was doing GPU acceleration first: F@H-style computing, transcoding, better video acceleration...


Then Nvidia jumped on, and they do some of it better with the G200 series. However, the move to DX11 will establish a common platform that Nvidia can't throw "TWIMTBP" money at, leveling the playing field and providing a better consumer experience for all.


DX11 is just the first step to computers that are faster at everything, and a hugely better experience; it is the DX7 of our time. The beginning of something better.
Posted on Reply
#80
extrasalty
The rep from nVidia is VP of investor relations. The quarterly loss must be near.
Posted on Reply
#81
toyo
TheMailMan78: No it doesn't. I use CS4 10 hours a day. All a dedicated GPU does is help out with some of the scaling, and that can be done with a good IGP with 3.0 shaders.

Look.
kb2.adobe.com/cps/404/kb404898.html
There's more to it than scaling, and in my opinion this is GPU acceleration... I feel I'm not understanding what you mean somewhere :)
Posted on Reply
#82
KainXS
Of course DX11 doesn't matter right now. What I see in the 5870 is a pretty awesomely priced card with performance that seems to top everything else available, and a standby power state that tops everything else available. Nvidia is just shaking in their boots right now, because they know they are screwed until next year.
Posted on Reply
#83
TheMailMan78
Big Member
toyo: There's more to it than scaling, and in my opinion this is GPU acceleration... I feel I'm not understanding what you mean somewhere :)
No there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.
Posted on Reply
#84
Benetanegia
Steevo: ATI was hardware accelerating before Nvidia.
Correction: Stanford was hardware accelerating before Nvidia, and they used ATI cards to accelerate their Brook parallel computing libraries. Essentially, ATI X19xx cards were better for that purpose, but ATI had very little to do with the project apart from the technical support they had to give.

Curiously, the chief scientist behind the Brook projects, Bill Dally, and one of the project directors (whose name I don't remember now) both work for Nvidia. Bill Dally is Nvidia's chief scientist and a VP, and the other one runs Nvidia's parallel computing division.
Posted on Reply
#85
toyo
TheMailMan78: No there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.
Ah, I get you now. I don't have any idea about IGP vs. GPU performance in PS, and at this stage maybe it isn't much more than a gimmick, but it's the right gimmick to pick. It's the first accelerated PS, and the differences versus the CPU are visible where it works. I hope Adobe continues on this path, and if they can make any IGP run it, that's even more value. I also hope they prioritize bug fixing and coherence between their apps :ohwell:
Posted on Reply
#86
erocker
*
extrasalty: The rep from nVidia is VP of investor relations. The quarterly loss must be near.
And that says it all folks. :toast:
Posted on Reply
#87
phanbuey
Benetanegia: I think that most angry fans are completely missing the point the NVIDIA rep was making. He is not saying DX11 won't matter, and he is not saying it is worthless. All he is saying is that it won't drive sales as much as other factors: performance and, yes, GPGPU capabilities. The number of impressed non-gamers is growing in forums like CGSociety, and even among the YouTube junkies who upload lots of videos every day. All these people, who couldn't care less about gaming, let alone DX11, do find GPGPU quite useful, because it means they can encode their videos twice as fast just by adding a small GPU instead of the Intel IGP most of them have.

Don't be naive and pretend that the gaming crowd is anywhere close to that installed base of users wanting some acceleration in video encoding, Photoshop and the like. NVIDIA is talking about that. The capable software is here already and it does make a difference, and much more is coming in the near future. The GPU is going to be more than a mere gaming device, and that will sell more cards, simply because, as I said, the non-gamer crowd is much, much bigger than the gamer one. And considering the WoW and Sims crowd, who don't even know what DX is to begin with, you can pretty much disqualify half the gamer crowd as people waiting for DX11.

At the end of the day only enthusiasts care and know about DX11, and probably only half of them will buy the new cards because of DX11, because we know it will mean squat, at least in the first titles and in multi-platform titles. So that leaves us with a number of around 2%. That's the percentage of people who will buy a card caring about DX11. The rest will buy the hardware for something else, not DX11.
Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and now they're saying that DX11 doesn't really matter.

Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons. But NVIDIA changed the tune of their song. That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX and even CUDA to a big extent). Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.

Yet now they're trying to write off DX11... yeah... de Nile is a river in Africa... And they are talking to investors.

Honestly, think about it: if DX11 WAS a major reason people would buy graphics cards, hypothetically, would NVIDIA really go out to investors and say "hey, this is a huge feature and we got NOTHIN'! This will sell, but we don't have it yet... sorry, our bad"?
Posted on Reply
#88
Benetanegia
TheMailMan78: No there's not. Read the article from Adobe. Read the system requirements for the GPU to be of any use. It's a gimmick. 99% of the computers made in the last 2 years can run this on their IGP. A dedicated GPU card will make no difference in Photoshop.
But the higher the performance of the GPU (be it an IGP or a dedicated card), the better. An IGP will let you apply some masks and filters faster than with the CPU alone. A faster card will let you apply them on the fly, even on photos with two- and three-digit megapixel counts, something an IGP can't do. Granted, it was my fault to include CS4 as a GPGPU app, considering what GPGPU officially means (like what HD is as opposed to merely high resolution), and considering that any DX10-capable card can accelerate CS4. But the point still stands: the better the card, the faster CS4 will run those things, and gimmick or not, it could drive sales even better than DX11. There are far more apps getting acceleration anyway, and their convenience is undeniable.
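The megapixel scaling argument here can be made concrete with a back-of-the-envelope model: a filter that touches every pixel does work proportional to the pixel count, so arithmetic throughput decides whether the operation feels interactive. A rough sketch follows; the throughput and ops-per-pixel figures are illustrative assumptions, not measurements of any real CPU, IGP, or GPU.

```python
# Back-of-the-envelope model: time for a per-pixel filter is
# pixels * ops_per_pixel / throughput. All figures are assumptions
# chosen only to illustrate the scaling, not benchmarks.

def filter_time_s(megapixels, ops_per_pixel, gflops):
    """Seconds to run a per-pixel filter at a given arithmetic throughput."""
    return (megapixels * 1e6 * ops_per_pixel) / (gflops * 1e9)

OPS = 200  # assumed arithmetic ops per pixel for a moderately heavy filter

# A 100-megapixel image at three assumed throughput levels:
for name, gflops in [("CPU core", 10), ("IGP", 50), ("dedicated GPU", 500)]:
    print(f"{name}: {filter_time_s(100, OPS, gflops):.2f} s")
```

Under these assumptions the same filter drops from a couple of seconds on a CPU core to a few hundredths of a second on a dedicated card, which is the difference between a progress bar and an on-the-fly preview.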
Posted on Reply
#89
erocker
*
phanbuey: Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and now they're saying that DX11 doesn't really matter.

Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons. But NVIDIA changed the tune of their song. That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX and even CUDA to a big extent). Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.

Yet now they're trying to write off DX11... yeah... de Nile is a river in Africa... And they are talking to investors.

Honestly, think about it: if DX11 WAS a major reason people would buy graphics cards, hypothetically, would NVIDIA really go out to investors and say "hey, this is a huge feature and we got NOTHIN'! This will sell, but we don't have it yet... sorry, our bad"?
...and why should we believe (technologically speaking) anything a VP of investor relations has to say about a product he most likely doesn't understand? This is nothing but spin and bullshit, and why should it even matter to us? I want to hear what Mr. "Investor Relations" said coming out of one of Nvidia's engineers' mouths.

Regardless, we all know what's going to happen anyway. In a few months (more or less) Nvidia will launch a DX11 card that will most likely perform a bit better than ATi's and will most definitely be more expensive. Until then we are going to hear a bunch of crap about not buying the competitor's products. Whoop de doo!
Posted on Reply
#90
Mussels
Freshwater Moderator
TheMailMan78: A dedicated GPU card speeds up Photoshop? That's news to me. :wtf:
The latest Photoshop (CS4) actually has GPU acceleration. It uses GPU RAM and renders the canvas as a 3D image, speeding up editing/effects and other fancy crap I don't know about, since I just use it to resize my photos.


While it doesn't make HUGE differences, even an IGP can do some things faster than some CPUs (or at least, the two combined are faster than the CPU alone).
Posted on Reply
#91
erocker
*
Mussels: The latest Photoshop (CS4) actually has GPU acceleration. It uses GPU RAM and renders the canvas as a 3D image, speeding up editing/effects and other fancy crap I don't know about, since I just use it to resize my photos.
Didn't they implement that in CS3? I dunno, I'm still using CS2.
Posted on Reply
#92
PVTCaboose1337
Graphical Hacker
OK, so let's just use the X8xx series as an example (old, I know). Remember that this card did not have shader support for BioShock? So Zek upgraded to the 2400 because he needed shader support, even though the 2400 was slower than his X850. Features do matter.

Also, on Photoshop: it does have that GPU shit or whatnot. It's just that my Vista hates my amazing modified drivers...
Posted on Reply
#93
Mussels
Freshwater Moderator
erocker: Didn't they implement that in CS3? I dunno, I'm still using CS2.
I went from Photoshop 7.0 to CS4, so I had a big leap :D

It has a list of features there that are only enabled with hardware acceleration on.
Posted on Reply
#94
phanbuey
erocker: ...and why should we believe (technologically speaking) anything a VP of investor relations has to say about a product he most likely doesn't understand? This is nothing but spin and bullshit, and why should it even matter to us? I want to hear what Mr. "Investor Relations" said coming out of one of Nvidia's engineers' mouths.
they keep those people locked up far, far away from the investors... :roll:

But honestly, I think he does understand that NV is going to be f***ed very quickly if the market share for DX11 cards goes to ATI... as do the investors. There really isn't any need for tech knowledge to connect those dots.
Posted on Reply
#95
Benetanegia
phanbuey: Yeah, BUT... they touted DX10 and CUDA and PhysX as reasons why their GPUs would sell... and now they're saying that DX11 doesn't really matter.

Of course you can argue it and spin it any way you want, and say that DX11 does or doesn't matter while giving valid reasons. But NVIDIA changed the tune of their song. That's the point... they used to be the ones touting benefits that didn't exist (lol, PhysX and even CUDA to a big extent). Like the Photoshop "speedup", which only affects a sliver of the features in Photoshop.

Yet now they're trying to write off DX11... yeah... de Nile is a river in Africa... And they are talking to investors.

Honestly, think about it: if DX11 WAS a major reason people would buy graphics cards, hypothetically, would NVIDIA really go out to investors and say "hey, this is a huge feature and we got NOTHIN'! This will sell, but we don't have it yet... sorry, our bad"?
Yeah, and everybody thought DX10 was going to change the field of gaming. It did not, as we all learned. It's stupid to stumble over the same rock again. I know that, you know that, Nvidia knows that and AMD knows that. If anything, the only dishonest voice regarding DX11 is AMD, with all the "know the future" BS and whatnot. Even though it does change how games could be made (i.e., it is a huge technology advancement), it won't change the gaming reality in any near future. Even if they released DX12 today with nuclear technology in it, games would still be "tweaked" DX9 games. That is the reality: DX11 means as much as DX10 did, which is zero.
Posted on Reply
#96
laszlo
what I know for sure:

-people who like new stuff will jump to buy
-benchmark fans also
-gamers with a lot of cash also
-e-peen people also
--------------------------------
-people who don't give a shit won't
-gamers who try to squeeze all from older cards won't
-those who have the cash but don't consider it necessary yet won't

and the lists can be bigger...
Posted on Reply
#97
PVTCaboose1337
Graphical Hacker
Although DX10 did not change the field of gaming, it sold cards. That means DX11 will sell cards too. People are SCARED that their card will not work in the future. Fear drives sales. People FEAR they will not have a card that can handle DX11. That will drive sales more than the hardware will.

ALSO: I love how we all immediately called BS on this Nvidia statement. Nice one, guys!
Posted on Reply
#98
Benetanegia
PVTCaboose1337: Although DX10 did not change the field of gaming, it sold cards. That means DX11 will sell cards too.
Extreme example: Hitler won the elections legally, which means that most people voted for him. Is that going to happen again? No. Definitely not.

And DX10 didn't really sell a lot of cards by itself. The 8800 sold, but what about the 8600 or HD 2600 when they were released? The X19xx and 79xx cards vastly outsold them, because their performance was better. Hence the 8800 sold a lot because it offered unprecedented performance, like the ability to play every game at 1920x1200 with 4xAA, something that not even SLI or Crossfire setups could do at the time in newer games.
Posted on Reply
#99
phanbuey
Benetanegia: Yeah, and everybody thought DX10 was going to change the field of gaming. It did not, as we all learned. It's stupid to stumble over the same rock again. I know that, you know that, Nvidia knows that and AMD knows that. If anything, the only dishonest voice regarding DX11 is AMD, with all the "know the future" BS and whatnot. Even though it does change how games could be made (i.e., it is a huge technology advancement), it won't change the gaming reality in any near future. Even if they released DX12 today with nuclear technology in it, games would still be "tweaked" DX9 games. That is the reality: DX11 means as much as DX10 did, which is zero.
We agree there... but what about PhysX? I have it... and it's bullsh**. Yet at the same time that they downplayed DX11, they talked up PhysX and the "immersive experience"... you see?

They are both full of sh**. But only one is a hypocrite.
Posted on Reply
#100
PVTCaboose1337
Graphical Hacker
Benetanegia: Extreme example: Hitler won the elections legally, which means that most people voted for him. Is that going to happen again? No. Definitely not.

And DX10 didn't really sell a lot of cards by itself. The 8800 sold, but what about the 8600 or HD 2600 when they were released? The X19xx and 79xx cards vastly outsold them, because their performance was better. Hence the 8800 sold a lot because it offered unprecedented performance, like the ability to play every game at 1920x1200 with 4xAA, something that not even SLI or Crossfire setups could do at the time in newer games.
Extreme example 2: Well, we elected Obama...

But back on topic: remember DX10.1 vs. DX10? People bought the DX10.1-equipped cards because they thought they would otherwise not be able to play the latest and greatest games.
Posted on Reply