
NVIDIA GeForce 4XX Series Discussion

Discussion in 'NVIDIA' started by qubit, Sep 25, 2009.

Thread Status:
Not open for further replies.
  1. jmcslob

    Joined:
    Mar 14, 2009
    Messages:
    2,901 (1.45/day)
    Thanks Received:
    459
    Location:
    Internet Heaven
    I can't wait for the GTX300's launch even though I will never buy an Nvidia product again. I hope they crush ATI's performance and dominate. And why would I want to see a terrific Nvidia product I will never buy? Competition, of course. If they crush the 5870, prices will fall, then ATI will release an even better card, then Nvidia will crush that, and so on. I love competition; it's so much better when companies compete rather than collude with one another. I can't wait to see video cards with more power than I need at a sub-$250 price point. YAY for all of us!
  2. kid41212003

    kid41212003

    Joined:
    Jul 2, 2008
    Messages:
    3,584 (1.59/day)
    Thanks Received:
    533
    Location:
    California
    The GTX280 is as fast as, or just a little slower than, the 9800GX2 in most games.

    And I'm expecting the GTX300 series to do the same, which means a single-GPU card would be as fast as (or faster than) the GTX295.
  3. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    Well, I wouldn't say he's baseless at all. While his ranting style can get wearisome, he's usually (but not always) quite accurate.

    For example, he did us all a service by exposing the bumpgate mess that nvidia was trying to hide and blame on everyone else. He wrote many, many articles on it and made sure that nvidia couldn't sweep the problem under the carpet until everyone forgot about it. That article on The Inquirer where he exposed the dodgy video chips by cutting up a brand spanking new Apple notebook and looking at the chip with an electron microscope was an awesome bit of journalism; someone had to dig deep into their pockets to buy the laptop to cut up and hire the microscope. I haven't seen an article like that anywhere else.

    nvidia had flatly denied this problem on the Macs until he exposed it, and exposed nvidia as liars. Kudos Charlie. :)
  4. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    It was twice as fast at launch, but you needed a Core2 to unleash all of its potential. Here's a pre-release review (it got even better after launch with new drivers):

    http://techreport.com/articles.x/11211/14

    At 1600x1200 4xAA, a common resolution at the time, it was twice as fast in Oblivion, GRAW and Quake 4, the most demanding games of that era, and in HL2:Ep1 it was 80% faster, 50% in FEAR. A very different picture from what we see with the HD5870. The gap was even more notable when you factored in the prices of the time: the $400 8800 GTS was significantly faster than the $500 X1950XTX. The somewhat cheaper XT wasn't a match in perf/price either, and neither were Nvidia's 79xx cards. Nothing could touch the 8800 in price/performance while offering the same kind of performance at the time. Now, the GTX275/HD4890/GTX260/HD4870 all literally destroy the HD58xx cards in perf/price. The 8800 competed with $300+ cards; the HD58xx has to compete even with $150 cards, so there's no contest between the 8800 and the HD58xx in that regard.

    EDIT: It's also important to note that the 8800 review used a stock-clocked Core2, so there was still some room for improvement, while Wizzard uses a heavily overclocked i7, and you can't find anything faster right now, nor will you for quite some time.

    Yeah, and that's the only thing he has ever been right about, and even then it wasn't happening the way he said it was: he found something that was genuinely wrong and then made up the rest, as he usually does, to portray it as the end of the world. He and every person who believes him bring up that "article" to claim he is right most of the time. One "article" doesn't make you right "almost always"; it makes you right once.

    Some of his contributions:

    - less than 30% yields for GT200 --- FALSE
    - GT200 will be late --- FALSE
    - Nvidia can't make GTX cards below $300 --- FALSE
    - GT200 will probably be slower than RV770 --- FALSE
    - Nvidia can't make dual GT200 card even at 55nm --- FALSE

    Wait, I think the tape is finished, turn it over.

    - less than 20% yields for GT300 --- FALSE
    - 2% yields for GT300 --- FALSE
    - GT300 will be a flop because it's MIMD --- ?
    - GT300 will not be able to compete with evergreen --- ?
    Last edited: Sep 26, 2009
    newtekie1 says thanks.
  5. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.70/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
    So where is this rumored "monster GPU"? No information yet?

    What I see in this thread is just a bunch of fanboys defending their beloved company, and still no actual information has surfaced.

    And Nvidia said they wanted to steal ATI's thunder by releasing benchmarks of an ES card, but so far not a word from Nvidia, just BS talk about how useless DX11 is.

    I want to see GT300 arrive fast, so ATI can lower their prices and I can grab an HD 5870 1 GB. No need for more expensive hardware; that one is already overkill.
  6. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    True. However, an article like that lends a lot of credence to his other work, too. Now, I can't remember all the stories he's written over all this time, but they seemed generally right to me.

    He was like a dog with a bone over the chip renaming fiasco, for example, which nvidia is still pulling, and I think not letting go of something like that is a good thing.

    If you can find an article where he was actually wrong - significantly wrong - I'd be very interested. I don't think there is one. Please reply about Charlie in a new thread, see below.

    And finally everyone, this thread is supposed to be about nvidia's nextgen graphics chips - please keep the discussion about that.

    I know that details are thin on the ground at the moment and we all want to know, but please let's not allow this to become a flamefest over Charlie's articles, which are controversial and off-topic. If you want to discuss them, please start another thread.
  7. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    You have a list in my last post. I could have included what he said about the G92 too, but really, there's no point: just swap GT200 for G92 and you're good to go.

    IMO, from the very moment Demerjian is cited in relation to GT300, which is on-topic, discrediting him as he deserves becomes just as on-topic.
  8. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,863 (6.20/day)
    Thanks Received:
    5,968
    Not really, I put my computer to sleep when it idles, as do 90% of people who own computers. Modern operating systems are great at handling this.

    Besides that, the GTX295 is only what, 55w at idle? Not a whole lot more, and remember we are talking about CF here, so double the 20w to get 40w. That 15w is not going to kill you; in fact you aren't even going to notice the difference in your electric bill. And the fact of the matter is that people buying these high-end cards are NEVER worried about idle power consumption. They don't care. Period. You are talking about people putting out a minimum of $450 in graphics cards; do you really think they are worried about the extra $10 a year one is going to cost over the other because of slightly higher idle power consumption? NO, they aren't.

    However, power consumption under load is far more important to the people buying these cards. Why? Because it determines what power supply can be used. The difference between a GTX295 and CF HD5870s might mean the difference between an 850w power supply and a 750w power supply.

    When you can explain to me why the renaming was bad for the consumer, I'll call it a "fiasco". Renaming SKUs so they fall in line better with the current performance-based numbering scheme only helps the consumer. There was no way a consumer would come out worse off by renaming the older SKUs to fill the mid and low end of the market. And if you really examine the situation, the consumers he claims it affected would probably have been worse off had nVidia not renamed the G92 SKUs.
    Last edited: Sep 26, 2009
    Crunching for Team TPU More than 25k PPD
  9. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    Please use the Charlie D thread to post about Charlie

    As I've said in big bold red letters above, this thread isn't about Charlie Demerjian and discussing him here is off topic and derails the thread and comments will only get unwanted moderator attention.

    Therefore, I won't reply to anything said about him on this thread.

    So, please direct all your replies (and flames! It's ok :) ) about him to my new thread on General Nonsense, The Charlie Demerjian flame thread

    Now, I can't say fairer than that!
  10. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    Benetanegia & newtekie1, I've replied to you there. ;)
  11. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,541 (2.07/day)
    Thanks Received:
    842
    I'm hoping GT300 packs a huge wallop; more than GTX295 average performance from a single GPU would be... whoa. I had hopes a single 5870 could do it... I guess there's still time.

    MARS or higher performance... seems bold, but didn't we recently have predictions (by nvidia) of GPU power increasing a few hundred times over the next several years? At that rate you'd hope 2x GTX285 performance isn't too much of a stretch... I can dream, right? :D

    As for what's best to buy now (like 5870 CF vs GTX295), my money is on 2x GTX 260s or 275s (if the price is right).

    260s should be about on par with a 295 on average and 275s will be faster, both combinations way cheaper than 5870s (and the 295); heck, 2x 260s might even be around the cost of 1x 5870, with better overall performance.
  12. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    +1 there Wolf. I was disappointed that the 5870 didn't beat the x2 cards too.

    However, as it's not all that far behind and nvidia are totally obsessed with the performance crown, I expect that nvidia's latest and greatest will comprehensively beat it, and will therefore beat all the x2s as well.

    I will then upgrade to it several months down the line, when the revised version (eg 280 to 285 revision) comes out and the prices have dropped to almost reasonable.
    wolf says thanks.
  13. DaedalusHelios

    DaedalusHelios

    Joined:
    Feb 21, 2008
    Messages:
    4,920 (2.06/day)
    Thanks Received:
    812
    Location:
    Greensboro, NC, USA
    Usually right? Pfft, mindless editorial ranting without sources makes YouTube look like an encyclopedia.

    Awful sources. It's spam, to say the least.
  14. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (3.99/day)
    Thanks Received:
    3,480
    Please see posts 56 & 59.
  15. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,541 (2.07/day)
    Thanks Received:
    842
    Cheers, any single GPU that can consistently beat a GTX295 has my name written all over it. I've had a 295 before (running 260 SLI now) and I wouldn't replace my graphics subsystem with any single GPU that can't at least do what I get now.

    I really want to see how a 5870 does with memory clocked at 5.5+ GHz and maybe a higher core, but I feel memory bandwidth is the limiter on that card.

    I suspect that, like most of Nvidia's big hitters, GT300 will pack A LOT of shaders and A LOT of memory bandwidth.
  16. SNiiPE_DoGG New Member

    Joined:
    Apr 2, 2009
    Messages:
    582 (0.29/day)
    Thanks Received:
    135
    Don't dismiss power savings as no big difference... that's just being ignorant of progress in innovation. If you leave your computer on 24/7, the idle power usage means a LOT.

    Let's say we run the numbers on this chart and use the assumed 20w idle for the 5870 to say the base system is 150w at idle; there you can see the 295 uses 89w at idle... yeah, that's frickin' terrible (compared to the 5870 ;)), and even 2x 5870 is less than half that power consumption.
    [image: idle power consumption chart]


    And don't tell me small numbers are meaningless, because half is half no matter how you look at it, and those small numbers add up BIG TIME over 6 months of 24/7 use.
  17. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,982 (3.20/day)
    Thanks Received:
    1,752
    Location:
    PA, USA
    So not only are we looking at the performance of the GT300, but also the power and heat. This is important to remember, but in the end it doesn't say which is the better card unless power consumption is wayyyyy out of proportion to the performance.
  18. DaedalusHelios

    DaedalusHelios

    Joined:
    Feb 21, 2008
    Messages:
    4,920 (2.06/day)
    Thanks Received:
    812
    Location:
    Greensboro, NC, USA
    We won't know how ATi vs. Nvidia will go this round because Nvidia has not entered the ring yet.
  19. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,863 (6.20/day)
    Thanks Received:
    5,968
    Yes, but we are talking about Crossfire here! That means twice the power usage of a single card! How is that hard to understand?

    A single HD5870 uses 20w when idle; look at W1z's review if you don't believe me. The card alone, regardless of the rest of the system, uses 20w. Double that and you get 40w. The GTX295 uses 55w. 40w is nowhere near half of 55w, so where are you pulling these numbers from?

    The 15w difference is next to nothing. You will never even notice the difference. We are talking maybe $1.50 a month, if you leave the computer on 24/7. These are $450+ cards, at least, so in the big picture idle power consumption really doesn't matter.

    Just to give you an idea:
    The current national average for electricity is $0.10120 per kWh.
    The difference is 15w, or 0.015kW.
    That means 0.36kWh per day.
    That's 131.4kWh of difference for an entire year of 24/7 idle use.
    That's $13.30 of difference for an entire year of 24/7 idle use.

    So yes, for a $450-600+ graphics card setup, $13.30 a year means nothing and can safely be ignored.
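    The arithmetic in that post can be sketched as a quick script. The 15w delta and the $0.10120/kWh rate come straight from the post; the helper name is just for illustration.

```python
# Estimate the yearly cost of an idle-power difference between two cards.
def idle_cost_per_year(delta_watts, rate_per_kwh=0.10120, hours_per_day=24.0):
    kwh_per_day = delta_watts / 1000 * hours_per_day   # 15 W -> 0.36 kWh/day
    kwh_per_year = kwh_per_day * 365                   # -> 131.4 kWh/year
    return kwh_per_year * rate_per_kwh                 # -> dollars per year

print(f"${idle_cost_per_year(15):.2f}")  # $13.30
```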
    Last edited: Sep 26, 2009
    Crunching for Team TPU More than 25k PPD
  20. SNiiPE_DoGG New Member

    Joined:
    Apr 2, 2009
    Messages:
    582 (0.29/day)
    Thanks Received:
    135
    According to canucks, the GTX 295 uses 89w, not 55.

    And I was talking about CF because 40w (2x 5870) < 1/2 × 89w (GTX295).
  21. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    But it makes more sense to believe Wizzard's numbers, for 3 reasons:

    1- He measures the power consumption of the card and not the whole system, and he does so with a more precise method and tools.

    2- His reviews are the best ones around.

    3- You are posting on TPU. :)
  22. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.70/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
    I think SNiiPE_DoGG is quite correct (although it's not really important, like newtekie said), and the new HD 5870 uses a new method in CF: the secondary card gets shut down to sub-20 watts.
    Here's the bit I quoted from Tom's Hardware:

    "Speaking of CrossFire, when you have two 5870s running concurrently at idle, ATI says that secondary board will drop into an ultra-low power state (purportedly sub-20W). "

    and here are the results:
    [image: power consumption results chart]

    So it's rather inaccurate to calculate CF HD 5870 power consumption by taking one card's idle figure and doubling it, and I think Wizz should do it like Tom's Hardware, because we don't know how future cards will behave in CF or SLI (or how the power management works with 2 or more cards in one system) and what features they may have to reduce the second card's power.
  23. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    19,863 (6.20/day)
    Thanks Received:
    5,968
    Ok, so even if we assume the second card uses 0w when idle (I doubt it will be that low, but let's assume):

    The power savings is only about $31 over the course of a year. Again, nothing that anyone buying these cards is going to be concerned with.
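    That $31 figure follows from the same kilowatt-hour arithmetic, assuming the GTX295's 55w idle against a single HD5870's 20w (with the second card at 0w) and the $0.10120/kWh rate quoted earlier in the thread:

```python
# Yearly cost of a 35 W idle-power difference at the quoted national rate.
delta_w = 55 - 20                       # GTX295 idle minus one HD5870 idle (second card at 0 W)
rate = 0.10120                          # dollars per kWh
kwh_year = delta_w / 1000 * 24 * 365    # 306.6 kWh over a year of 24/7 idle
print(f"${kwh_year * rate:.2f}")        # $31.03
```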
    Crunching for Team TPU More than 25k PPD
  24. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.70/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
    ^
    ^ Yeah, I know, whoever buys CF HD 5870s will never think about power consumption, but hey, a little efficiency isn't going to hurt you, you know, and it's a step in the right direction.

    cheers :toast:
  25. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.48/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Did you realise that the Crossfire setup is consuming 25W more than a single HD5870 in that graph? That's measured from the AC source, so the second card is consuming around 25W, despite what AMD says.
