
DirectX 11 Won't Define GPU Sales: NVIDIA

Discussion in 'News' started by btarunr, Sep 17, 2009.

  1. toyo New Member

    Joined:
    Feb 18, 2009
    Messages:
    225 (0.11/day)
    Thanks Received:
    32
    Someone prepare the second violin for Nvidia; the wind's blowing towards ATI for now, even if only for a few months (or more? NV seems to be having birth issues with the GT300). This will go down in GPU history as an ATI win; hell, Wikipedia even mentions the few days the K6-III was king back in 1999.
    There's no stopping the "horsepower", duh. To get more and better features you have to pack some...
    So NV, stop barking at the moon and get your act together; we need you so prices return to normal levels.
     
  2. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,151 (7.81/day)
    Thanks Received:
    7,678
    You guys calling this an ATI win have no clue what Nvidia has in store for us. No one does. I say wait until the 300s come out; then we can talk. Until then, all I see is mob rule and bandwagon jumpers.

    I'll forum fight all you bastards!
     
  3. toyo New Member

    Joined:
    Feb 18, 2009
    Messages:
    225 (0.11/day)
    Thanks Received:
    32
    I don't remember anyone waiting for competitors to release equivalent products before declaring the champion of the era, whenever a new generation of cards arrived based on a new DirectX version or a new technology (hardware T&L): the Radeon 9700, the GeForce 256, the 8800, among others.
    Although, before the 23rd we can't know for sure; maybe NV has the GT300 already waiting to punch out of the darkness on that date (mhmmm...).
     
  4. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.68/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
    Way to go, Nvidia.

    It's clear now: they won't have the GT300 ready when ATI launches Evergreen.
     
  5. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.89/day)
    Thanks Received:
    340
    Location:
    Your house.
    I don't know what you're talking about -- the GeForce 5900 Ultra was a worthy competitor.

    Now that was the kind of card that could keep you warm on cold nights.
     
  6. toyo New Member

    Joined:
    Feb 18, 2009
    Messages:
    225 (0.11/day)
    Thanks Received:
    32
    Radeon 9700
    Radeon 9700's advanced architecture was very efficient and, of course, more powerful compared to its older peers of 2002. Under normal conditions it beat the GeForce4 Ti 4600, the previous top-end card, by 15–20%. However, when anti-aliasing (AA) and/or anisotropic filtering (AF) were enabled it would beat the Ti 4600 by anywhere from 40–100%. At the time, this was quite astonishing, and resulted in the widespread acceptance of AA and AF as critical, truly usable features.

    Besides advanced architecture, reviewers also took note of ATI's change in strategy. The 9700 would be the second of ATI's chips (after the 8500) to be shipped to third-party manufacturers instead of ATI producing all of its graphics cards, though ATI would still produce cards off of its highest-end chips. This freed up engineering resources that were channeled towards driver improvements, and the 9700 performed phenomenally well at launch because of this. id Software technical director John Carmack had the Radeon 9700 run the E3 Doom 3 demonstration.[3]

    The performance and quality increases offered by the R300 GPU are considered to be among the greatest in the history of 3D graphics, alongside the achievements of the GeForce 256 and Voodoo Graphics. Furthermore, NVIDIA's response in the form of the GeForce FX 5800 was both late to market and somewhat unimpressive, especially when pixel shading was used. The R300 would become one of the GPUs with the longest useful lifetimes in history, allowing playable performance in new games at least three years after its launch.[4]

    The GeForce 256 and the Nvidia 8800 series were also uncontested winners in their time; no other player on the market had equivalent functional technology.
     
  7. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,205 (2.05/day)
    Thanks Received:
    975
    Location:
    Miami
    We have no clue what Nvidia has in store because we don't SEE ANYTHING. Not a peep from Nvidia except a blatant attempt to write off DX11.

    I am a big Nvidia fan (check specs)... but reality is reality. This is, for all intents and purposes, a HUGE ATI win. Nvidia has dominated for so long, and now they will lose the crown; they were SO far ahead, and now they're back where they were during the G7x series in relation to ATI. That is a win for ATI no matter how you spin it.

    ATI has a DX11 part that will take the crown... and Nvidia is saying that DX11 won't matter?!? :roll::roll::roll:

    These are the same muppets that told us all that PhysX matters. :nutkick: LOL. They haven't learned their lesson from DX10; they're just trying to convince their investors not to jump ship because they don't have a competing part. This is a business move, plain and simple... just trying to minimize the pain until they can compete.
     
    Last edited: Sep 17, 2009
    heky, department76 and extrasalty say thanks.
  8. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (1.98/day)
    Thanks Received:
    1,505
    Wow, talk about trying to pull the wool over our eyes. Let's go back to the G80 release, shall we? Because ATI was late to the party with its DX10 part (i.e. the HD 2900), Nvidia reaped the benefit of having the only DX10 card in town, and its market share increased during ATI's absence, in spite of there being little to no DX10 games out. Not only did consumers prefer the G80 at its higher price, but it caused ATI to lose a big piece of the discrete GPU market (as well as mobile, etc.).

    I believe that market share loss was initiated by the G80 launching with no answer from ATI, compounded by the HD 2900 release, more or less. Today, AMD is still trying to recover from that. Now all of a sudden we're supposed to forget what happened and say DX11 is nice but not all that important. :shadedshu Yes, we know that market conditions then and now are completely different; however, if AMD is able to adapt and compensate for that, I see no reason why it wouldn't do well.
     
    Last edited: Sep 17, 2009
  9. PVTCaboose1337

    PVTCaboose1337 Graphical Hacker

    Joined:
    Feb 1, 2006
    Messages:
    9,512 (2.98/day)
    Thanks Received:
    1,143
    Location:
    San Antonio, Texas
    Remember that lots of stupid people buy graphics cards. If Nvidia says they have DX11-equipped cards before ATI, then Nvidia will do worlds better than ATI!! Why? Well, people are dumb and think DX11 makes the card faster, etc. In truth, DX11 WILL greatly define graphics card sales.
     
    heky and phanbuey say thanks.
  10. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,205 (2.05/day)
    Thanks Received:
    975
    Location:
    Miami
    +1... exactly... they're just trying to pull a Baghdad Bob on their investors. "No no... we ARE winning the war... ATI is cowering in fear, and our customers don't care about new tech at all... it's just not important." :roll:
     
  11. EastCoasthandle

    EastCoasthandle New Member

    Joined:
    Apr 21, 2005
    Messages:
    6,889 (1.98/day)
    Thanks Received:
    1,505
    People feign ignorance all the time when it comes to products they prefer. It doesn't mean the masses simply don't know any better. I believe it's the positive word of mouth from friends, etc., that drives brand recognition, more so than just "not knowing better". Again, I'm talking about the masses, not individual cases.
     
  12. PVTCaboose1337

    PVTCaboose1337 Graphical Hacker

    Joined:
    Feb 1, 2006
    Messages:
    9,512 (2.98/day)
    Thanks Received:
    1,143
    Location:
    San Antonio, Texas
    Exactly! 'Cause we know ATI will have the first DX11 card out soon (the HD 5xxx series), so ATI will win! That means Nvidia will be in dire trouble.
     
  13. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    11,022 (4.10/day)
    Thanks Received:
    1,733
    Location:
    US
    Yes, my last post was a little narrow-minded.

    By mid to end of next year, DX11 will, COUGH, should be more worth it, as more games will be out for it. The promised boost makes it more tempting, but to tell you the truth, I've already played all the games I want to play, and the ones giving me issues are more CPU-bound than GPU-bound.

    Stupid to buy a DX10 card now? That depends on what card you have now. DX11 cards are going to be like $250-$300+, so you could get a DX10 card for around $150, and by mid to end of next year, if DX11 is more widely accepted, it'll be cheaper to get one. And there'll be a reason to get one.

    I'm a gamer, so that's my view on it. I do very few benchmarks; that's not what I get faster hardware for.

    Sure, if you have a lower-end card it's going to be more worth it, but if you already have a card like the 285 or the 4890, there's no need if you're a gamer.
     
  14. PVTCaboose1337

    PVTCaboose1337 Graphical Hacker

    Joined:
    Feb 1, 2006
    Messages:
    9,512 (2.98/day)
    Thanks Received:
    1,143
    Location:
    San Antonio, Texas
    If DX11 turns out like DX10, we won't need it! Every DX10 game had the ability to run in DX9 mode. Were DX10-supporting cards necessary? No.
     
  15. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.82/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe

    Yes, they were, to get extra-high quality :rolleyes:
     
  16. mtosev

    mtosev New Member

    Joined:
    Mar 21, 2005
    Messages:
    1,463 (0.42/day)
    Thanks Received:
    145
    Location:
    Maribor, Slovenia
    Nvidia needs to cut the crap and make a DX11 card. If they don't, then they can STFU!
     
  17. WhiteLotus

    WhiteLotus

    Joined:
    Jul 30, 2007
    Messages:
    6,551 (2.47/day)
    Thanks Received:
    857
    I kind of agree with nVidia. I don't believe DX11 ITSELF will cause AMD (or is it still ATi? I get confused. Anyway...) cards to fly off the shelf. What will cause these cards to sell is that they'll be top dog for a good few months. And them being DX11 parts pretty much equates to performing well in DX10 et al (hell, there are still "will this run Crysis" threads).

    What nVidia is saying to all those people who think DX11 will make a huge difference is: hey, you don't need DX11 just yet; here, we'll (well, I would think they would) cut the price on our cards, which still perform pretty damn well.
     
  18. mtosev

    mtosev New Member

    Joined:
    Mar 21, 2005
    Messages:
    1,463 (0.42/day)
    Thanks Received:
    145
    Location:
    Maribor, Slovenia
    it's AMD.
     
  19. HossHuge

    HossHuge

    Joined:
    Jun 26, 2008
    Messages:
    2,048 (0.88/day)
    Thanks Received:
    503
    Location:
    EDM, AB, CAN
    I've enjoyed reading this thread. You guys are making a lot of valid points.
     
  20. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,677
    Location:
    Hyderabad, India
    That's not nerd rage, this is:

    [​IMG]

    He wasn't happy when he found out that his GTX 380M was based on 40 nm G92c. That aside, let's get back on track.
     
  21. extrasalty

    Joined:
    May 13, 2009
    Messages:
    176 (0.09/day)
    Thanks Received:
    25
    ATI: We have a DX11 WHQL driver.
    nVidia: DX11? Phew, let's concentrate on what's important: the PowerPoint slides.
     
  22. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.82/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    Does this driver have OpenCL?
     
  23. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,178 (1.18/day)
    Thanks Received:
    344
    "...framerate and resolution are nice, but today they are very high and going from 120fps to 125fps is not going to fundamentally change end-user experience. But I think the things that we are doing with Stereo 3D Vision, PhysX, about making the games more immersive, more playable is beyond framerates and resolutions. Nvidia will show with the next-generation GPUs that the compute side is now becoming more important than the graphics side."

    Um, no. Going from 120 to 125 isn't worth anything, correct, but stopping performance from dropping from 60 to 30 IS worth something.

    The compute side is all well and good, because without it 'special' visuals won't work efficiently, but to say that pure computing is necessary is a bit premature.

    Hopefully he's hinting at what we want to see in the near future, which is real-time vector drawing rather than pre-rendered visuals. But that would require cards with massive computing flexibility, like the FireGL types used in AutoCAD programs.

    But still, stop making cards that give you 125fps over 120fps, and start making ones that don't cower in fear at a few dynamic shadows in a 3D program. Then worry about 'compute' cards.
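    The frame-time arithmetic behind that point can be sketched quickly (a minimal illustration; the fps figures are just the ones quoted above):

    ```python
    def frame_time_ms(fps: float) -> float:
        """Time each frame stays on screen, in milliseconds."""
        return 1000.0 / fps

    # 120 -> 125 fps: the per-frame improvement is a fraction of a millisecond.
    high_fps_gain = frame_time_ms(120) - frame_time_ms(125)  # ~0.33 ms

    # 60 -> 30 fps: each frame lingers over 16 ms longer, which is very noticeable.
    drop_cost = frame_time_ms(30) - frame_time_ms(60)        # ~16.67 ms

    print(f"120->125 fps saves {high_fps_gain:.2f} ms per frame")
    print(f"60->30 fps costs {drop_cost:.2f} ms per frame")
    ```

    In other words, the gains at the top end are imperceptible precisely because frame time shrinks hyperbolically with fps, while a drop from 60 to 30 doubles it.
    
    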
     
  24. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,380 (2.55/day)
    Thanks Received:
    1,230
    ATI was hardware accelerating before Nvidia.
     