Thursday, September 17th 2009

DirectX 11 Won't Define GPU Sales: NVIDIA

"DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons." This comes from the same company that, a few years ago, insisted there was every reason to opt for a DirectX 10 compliant graphics card to complete the Windows Vista experience, back when it was the first and only company out with compliant hardware. In the wake of the release of rival AMD's ambitious Evergreen family of DirectX 11 compliant graphics cards, NVIDIA made it a point to tell the press that the development shouldn't really change anything in the industry.

Speaking at the Deutsche Bank Securities Technology Conference, NVIDIA's VP of investor relations Michael Hara said "DirectX 11 by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that is no longer the only reason, we believe, consumers would want to invest in a GPU."

"Now, we know, people are doing a lot in the area of video, people are going to do more and more in the area of photography… I think that the things we are doing will allow the GPU to be a co-processor to the CPU and deliver a better user experience, better battery life, and make computers a little bit more optimized," added Mr. Hara.

NVIDIA, which until very recently held up raw graphics processing horsepower as the biggest selling point of new GPUs, has now switched its line on what it believes will drive the market forward. All of a sudden, software that relies on the raw computational power of GPUs (e.g., media transcoding software), and not the advanced visual effects a new-generation API brings with it (in games and CGI applications), is what will drive people to buy graphics processors, according to the company.

Mr. Hara concluded, saying "The graphics industry, I think, is at the point the microprocessor industry was at several years ago, when AMD made the public confession that frequency does not matter anymore and it is more about performance per watt. I think we are at the same crossroads with the graphics world: framerate and resolution are nice, but today they are very high, and going from 120 fps to 125 fps is not going to fundamentally change the end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making games more immersive, more playable, go beyond framerates and resolutions. NVIDIA will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."

The timing is telling: NVIDIA has no concrete product plans laid out, while AMD is working towards a head start with next-generation GPUs that are DirectX 11 compliant and also support industry-wide GPGPU standards such as DirectCompute 11 and OpenCL.
Source: Xbit Labs

194 Comments on DirectX 11 Won't Define GPU Sales: NVIDIA

#26
Mussels
Freshwater Moderator
AphexDreamerPlease don't consider this Fanboyism, but Nvidia are the biggest BSers I know of in the Computer Industry. I mean most companies say shit but Nvidia is ridiculous.
i lost all trust in nvidia after seeing GTX280 mobility cards using a cut down G92 core.

nvidias been struggling lately - if they werent, they'd have DX10.1 and 11 cards out, and they wouldnt need to rename products to keep in the news every time ATI release a new card.
#27
TheMailMan78
Big Member
mdm-adphPointing out hypocrisy is not flame baiting -- it's a public service to the community at large. :laugh:

And anyway, for your information, I personally think the G300 is going to be a much faster chip than any R800's. It's just going to be a good while before they come out, and when they do, they're going to cost an arm and a leg.
You know WTF your doing man. Cut it out. We all know about the renaming thing but at the same time ATI does it too. The HD2900 and 3870 come to mind. Same damn performance just a different fabrication.

Both cards at the time they were released were ATI's top tier card. At least Nvidias cards gave you a performance boost with the new named cards.
#28
heky
TheMailMan78You know WTF your doing man. Cut it out. We all know about the renaming thing but at the same time ATI does it too. The HD2900 and 3870 come to mind. Same damn performance just a different fabrication.

Both cards at the time they were released were ATI's top tier card. At least Nvidias cards gave you a performance boost with the new named cards.
And what performance boost would that be???
#29
Benetanegia
I don't know why all the buzz about this, and I certainly don't relate these comments to Ati's DX11 or GT300. They've been saying that GPGPU would become more important than graphics when it comes to selling GPUs since they released the 8800GTX and CUDA. It's not as if this was new from them, it's not as if they suddenly changed their minds about this because they have no DX11 cards before Ati. They even spent 10% of GT200's die area on GPGPU, even when that meant less gaming performance per die area. They even made Intel angry over this 3 years back.

This is not new.
This is not related to DX11.
This is not related to GT300 or RV870.
#30
mdm-adph
TheMailMan78You know WTF your doing man. Cut it out. We all know about the renaming thing but at the same time ATI does it too. The HD2900 and 3870 come to mind. Same damn performance just a different fabrication.

Both cards at the time they were released were ATI's top tier card. At least Nvidias cards gave you a performance boost with the new named cards.
What the who? I didn't say anything about any renaming... :wtf:

I was just pointing out the hypocrisy of a company saying that new technologies aren't worth buying a new card for, when at one time, that was all I ever heard about Nvidia. I still think it's a very valid point.
#31
phanbuey
Musselstriple monitor/eyefinity.

seriously, look how cheap LCD's are lately and tell me you dont see a new wave of gamers using 3 19-24" monitors for FPS gaming.
tbh im not impressed with having a screen border cut the image.. especially the 6 monitor one where the center of the "screen" has a line running through it. I think they were playing an RPG and you couldn't even see the character - just the head and the feet lol.

It's not that I dont see a wave of gamers using multiple screens... Its just that I definitely wouldn't buy it... the borders are way too distracting.
#32
HalfAHertz
TheMailMan78You know WTF your doing man. Cut it out. We all know about the renaming thing but at the same time ATI does it too. The HD2900 and 3870 come to mind. Same damn performance just a different fabrication.

Both cards at the time they were released were ATI's top tier card. At least Nvidias cards gave you a performance boost with the new named cards.
The 3xxx series added 10.1 support, new version of UVD for better HD playback and some other minor tweaks to the core...
#33
TheMailMan78
Big Member
HalfAHertzThe 3xxx series added 10.1 support, new version of UVD for better HD playback and some other minor tweaks to the core...
And the same exact performance in games.
mdm-adphWhat the who? I didn't say anything about any renaming... :wtf:

I was just pointing out the hypocrisy of a company saying that new technologies aren't worth buying a new card for, when at one time, that was all I ever heard about Nvidia. I still think it's a very valid point.
Somebody said something about someone naming something and I'm pissed!
#34
Easo
Imho NVidia just had to say something cause of the ATi cards launch...
#35
Mussels
Freshwater Moderator
the point isnt what you think it is mailman.

while the ATI cards had the same performance, at least changes WERE made. features were added.

Nvidia doesnt add anything to theirs, you can even BIOS flash them between each "variant" of the cards.
#36
phanbuey
TheMailMan78And the same exact performance in games.



Somebody said something about someone naming something and I'm pissed!
well lets face it... ATI did it ONCE and the GPU had substantial internal changes - and the card itself was quite different in terms of characteristics (power draw, memory bus etc etc)... Nvidia did it twice, on the same card. W/e that isnt the point.

Point is they don't have a dx11 part in the immediate future whereas AMD's 5870 is imminent - quite obvious why NOW, of all times, they are beating the DX11 doesn't matter drum. DX11 will start to matter only after their product comes out :laugh: then they will say that everyone needs to have for the best computing experience.
#37
AphexDreamer
phanbueywell lets face it... ATI did it ONCE and the GPU had substantial internal changes - and the card itself was quite different in terms of characteristics (power draw, memory bus etc etc)... Nvidia did it twice, on the same card. W/e that isnt the point.

Point is they don't have a dx11 part in the immediate future whereas AMD's 5870 is imminent - quite obvious why NOW, of all times, they are beating the DX11 doesn't matter drum. DX11 will start to matter only after their product comes out :laugh: then they will say that everyone needs to have for the best computing experience.
Yup exactly what I stand by. Nvidia does a good job at fooling its consumers especially the not so PC friendly ones. That's why most people I talk to consider Nvidia as being the best, all knowing, god like graphics line. I try to enlighten them, but Nvidia has brainwashed them.
#38
Mussels
Freshwater Moderator
its no different to the P4 era, when people bought inferior products to what AMD had at the time, simply because "intel is best" was stuck in their minds.
#39
AphexDreamer
Musselsits no different to the P4 era, when people bought inferior products to what AMD had at the time, simply because "intel is best" was stuck in their minds.
Yup, this is where marketing really does affect sales. I really don't care though, doesn't affect me.
#40
phanbuey
well... its not just new tech - its also performance... the 5870's will be the fastest cards on the market until the next gen NV comes out.

That crown is key... if the current benches are correct, then two $350 5870's will spank what is now two $500 gtx295 in quad SLI. And after all this time, I think that is a bitter pill for NV to swallow.
#41
tkpenalty
Nvidia deliberately talked to the bank just to get more shareholders onboard nvidia because they always listen to what the advisors, etc say. Even if this proves to be bullshit they'll see more shares bought.
#42
[I.R.A]_FBi
Somehow i knew this 'wang' guy would send them to say sumpn like this too bad someone already said sour grapes ....
#43
KainXS
I used to LOVE Nvidia but after they started playing the renaming game with the G92's then lying about just about all of their current mobile GPU's I just went like F it, in order for me to buy nvidia again they would have to truly wow me because I just see alot of fooling going around in their corner.
#44
[I.R.A]_FBi
although they may be correct that dx11 will not define gpu sales, the gpu's spawned by it will make the current stuff look like chopped liver if the speculations im seeing are correct
#45
polaromonas
Whether or not DX11 boosts GPU sales, AMD will have an advantage over Nvidia. By launching DX11 GPUs at the same time Windows 7 comes out, they can save some money on ads because MS will do that for them, *IF* AMD creates the image that their cards are the "only way" to bring the most out of your Windows 7 PC, or something like that.

PS. I really hope AMD doesn't screw up this time. And please bring out the legit driver for DX11Compute / OpenCL. I wanna know how long Nvidia can hold onto their CUDA thing.
#46
TheMailMan78
Big Member
All I'm saying is they may be being honest from their perspective. They were betting big on DX10 and it fell through. They foresee the same thing on DX11.

Heres my prediction. Since I'm ALWAYS wrong it will probably be just the opposite.
We are seeing roles reversed from the last generation. ATI will lead this time in over all performance and Nvidia will take their lunch in price/performance with the 300s.
#47
1Kurgan1
The Knife in your Back
TheMailMan78All I'm saying is they may be being honest from their perspective. They were betting big on DX10 and it fell through. They foresee the same thing on DX11.

Heres my prediction. Since I'm ALWAYS wrong it will probably be just the opposite.
We are seeing roles reversed from the last generation. ATI will lead this time in over all performance and Nvidia will take their lunch in price/performance with the 300s.
Here's the major difference between DX10 and DX11, DX11 isn't even out yet and there are already more titles that are announced to support it than DX10 has on the market since it was released. (Or at least very close)

DX9 has been a good platform, but it's time to retire at some point here, this is going to be it.
#48
wolf
Performance Enthusiast
phanbueywell... its not just new tech - its also performance... the 5870's will be the fastest cards on the market until the next gen NV comes out.

That crown is key... if the current benches are correct, then two $350 5870's will spank what is now two $500 gtx295 in quad SLI. And after all this time, I think that is a bitter pill for NV to swallow.
+1 Very well said, this is a big hit to Nvidia's ego.

Hey if prices on GT200 class cards plummet that's good news for us :)
#49
KainXS
the way I am seeing this is like this,

Nvidia is going to release its 300 series from what all sources point to next year sometime, now . . . . by that time ATI would already be about to release the HD6k(they always replace their cards in under a year, sometimes in 6 months) series and nvidia will be taking losses, that gonna be a short lunch.

either way you look at it, a next year launch on the 300 series is not looking good.
#50
toyo
Someone prepare the 2nd violin for Nvidia, the wind's blowing towards ATI for now, even if it will be only for a few months (or more? seems NV have birth issues with GT300). This will remain in GPU history as an ATI win, hell, Wikipedia even mentions the few days of the K6-III being the king back in 1999.
There's no stopping to the "horsepower", duh. To get more and better feats you have to pack some...
So NV, stop barking at the moon and get your act together, we need you so the prices go to normal levels.