
NVIDIA GeForce 4XX Series Discussion

Discussion in 'NVIDIA' started by qubit, Sep 25, 2009.

Thread Status:
Not open for further replies.
  1. Bo_Fox

    Bo_Fox New Member

    Joined:
    May 29, 2009
    Messages:
    480 (0.26/day)
    Thanks Received:
    57
    Location:
    Barack Hussein Obama-Biden's Nation
    8x AA is nice, even at HD resolutions. However, to support your statement, I would still prefer 4x AA with Adaptive AA or TR-SSAA rather than 8x AA without AdAA or TRSSAA.

    Try playing Company of Heroes with 8xAA and it's jaw-dropping. The detail looks impeccable.. you just cannot get enough of this eye-candy, even with a magnifying glass.

    32x AA.. that, I think is a waste. I would rather see 32x AF than 16x AA.
  2. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    20,850 (8.03/day)
    Thanks Received:
    7,384
    My max resolution is 1680x1050 and 4x is all I have ever needed.

2560x1600 and you need more than 4x AA? Something is either wrong with your rig or your eyes.

    I'm talking between 4x and 8x.

    4x AA with Adaptive AA is all you need. You get the same effect as 8x without adaptive AA.
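For a sense of why higher AA modes get expensive at high resolutions, here's a rough back-of-envelope sketch (my own numbers, not from any post here) of the uncompressed VRAM a multisampled render target needs. Real drivers use framebuffer compression, so treat these as upper bounds:

```python
# Uncompressed MSAA render-target size:
# width * height * samples * bytes-per-sample.
# Assumption: 8 bytes/sample = 32-bit colour + 32-bit depth/stencil.

def msaa_buffer_mib(width, height, samples, bytes_per_sample=8):
    """Upper-bound VRAM cost of an MSAA colour+depth target, in MiB."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for w, h in [(1680, 1050), (2560, 1600)]:
    for s in (4, 8):
        print(f"{w}x{h} at {s}x MSAA: {msaa_buffer_mib(w, h, s):.0f} MiB")
```

At 2560x1600 the jump from 4x to 8x costs roughly another 125 MiB before compression, which is why the big-monitor crowd feels it more.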
  3. Bo_Fox

    Bo_Fox New Member

    Joined:
    May 29, 2009
    Messages:
    480 (0.26/day)
    Thanks Received:
    57
    Location:
    Barack Hussein Obama-Biden's Nation
    Not exactly.. A few of the newer games (Fallout 3, Devil May Cry 4, Resident Evil 5, etc..) already include alpha-aliasing by default if you enable AA in the in-game menu. Some games only have a few transparent textures, while others use a lot, so it depends.

    For some games, 8xAA makes it that much more jaw-dropping, trust me on this. Do you know how detailed Company of Heroes is? With 4x AA, it's nice, but with 8x AA, it's "OH MY GOSH!"
    Last edited: Jan 4, 2010
  4. AddSub

    AddSub

    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.35/day)
    Thanks Received:
    152

That could be it. I must admit, though, I do run 32xAA on several titles. I found that if I force SLI 32xAA through the nVidia drivers on older DirectX 6/7/8 games, which are hardcoded to 1280x1024 or sometimes even lower (along with maxing out every other IQ setting), I can bring these older low-res titles up to modern standards. Well, not really, but the results can be pretty surprising. For instance, doing this with older titles in the Need for Speed series (NFS:HS & NFS:HPII), one of which is DirectX 6 and the other DirectX 8, the results are pretty amazing when every possible driver setting is maxed out, SLI 32xAA included. It certainly freshens up the visuals, by a few years at least.

    But yeah, at 1920x1200 in most modern titles I can see very little difference above 4xAA.
  5. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,743 (4.56/day)
    Thanks Received:
    6,767
    Location:
    Edmonton, Alberta
Nah, it's the low-res textures in a lot of games. Most seem optimized for 1920x1080/1200, though, and knowing how few others run 2560x1600, I can understand why.

I mean really, 2xAA is the MINIMUM needed for most apps @ 2560x1600, if not 4x, even on title screens, where the typeface is jaggy as all heck.

Frankly, I've never regretted anything more than buying this damn 30-inch monitor.

    You don't have one obviously, or you'd know all about it. No biggie. Nice how you assume something is wrong...lol.
  6. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (3.16/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    Imagine 2560 x 1600 at 24" that would be a nice picture.
  7. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    20,850 (8.03/day)
    Thanks Received:
    7,384
    No I have a 52" ;)
  8. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,743 (4.56/day)
    Thanks Received:
    6,767
    Location:
    Edmonton, Alberta
You bet. The dot-pitch would be awesome. Hence my using 3x23-inch 1920x1080 for Eyefinity...it's all about pixels-per-inch.


    What does that have to do with 2560x1600? Try sitting less than a foot away from THAT PPI!:laugh: YUCK!!
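Since the dot-pitch debate above comes down to pixels per inch, here's the quick way to compute it (the specific monitor sizes are just the ones mentioned in the thread, used illustratively):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# 30" 2560x1600 panel vs 23" 1920x1080 panel vs 52" 1920x1080 TV
for w, h, d in [(2560, 1600, 30), (1920, 1080, 23), (1920, 1080, 52)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.0f} PPI')
```

A 52" 1080p TV lands around 42 PPI versus roughly 100 PPI for the 30" 2560x1600 panel, which is exactly why sitting a foot from the TV looks rough.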
  9. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (3.16/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    Exactly. This 32" has terrible image quality compared to my 22" at 1680 x 1050. Also my 42" is 1368 x 768 so imagine the pain :(
  10. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    20,850 (8.03/day)
    Thanks Received:
    7,384
    You lost. Love it.
  11. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,743 (4.56/day)
    Thanks Received:
    6,767
    Location:
    Edmonton, Alberta
    :wtf:

    Lost what?

    E-Peen? lol. You can have it. :nutkick:

    :shadedshu

Fact of the matter is, when I got this monitor, it was one of very few S-IPS panels on the market, and it was bought for colour accuracy. In the end, that's all it's good for.:laugh:

    Would be nice if Fermi would make gaming nice on it though...been waiting some time for good gaming with awesome colour @ 30-inch.
  12. subhendu

    subhendu New Member

    Joined:
    Jan 26, 2009
    Messages:
    488 (0.24/day)
    Thanks Received:
    33
    HOT news/rumour :cool:

    GF100 outperforms ATi's 5870 by 46% on average
    GF100 outperforms ATi's 5970 by 8% on average

    The GF100 gets 148 fps in DiRT2
    The GF100 gets 73 fps in Crysis2
    The GF100 gets 82 fps in AvP3

    *GTX misnomers removed due to Business NDA*

    GF100's maximum load temperature is 55 C.

    The release week of GF100 is Mar 02nd



    Extra spoilers!
    GF100 and GF104 will feature a new 32x Anti Aliasing mode for enthusiast setups.
    GF100 and GF104 can do 100% hardware based decoding for 1080p BluRay and H264 playback.
    GF100 and GF104 feature full SLi capability and a new scaling method for rendering!
    GF100 and GF104 will provide full on chip native C++ operation for Windows and Linux environments. This will be further augmented with CUDA and OpenCL.
    GF104 will feature new technology designed for UHD OLED monitors!
    GF100 promises to deliver at least 40% more performance than the GTX295 for less money. GF104 promises double that.

    link:
    http://www.guildwarsguru.com/forum/rahjas-freeeeeeee-t10420384.html?

    AvP3 and Crysis2 can be taken with a grain of salt, since they are not finalized and drivers aren't finalized yet. DiRT2 may improve or decline slightly, depends on final drivers.

    Those tests were run using a Corei7 920, 6GBs of DDR3 1333, and an Intel 64GB SSD paired with a single GF100 card. The tests were run at 1920x1200 with 4x SSAA and 16xAF.
    Bo_Fox says thanks.
  13. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    20,850 (8.03/day)
    Thanks Received:
    7,384
    BU...BU....BU....BU......BULL SHIT BREAKER!
  14. yogurt_21

    yogurt_21

    Joined:
    Feb 18, 2006
    Messages:
    4,302 (1.40/day)
    Thanks Received:
    541
    Location:
    AZ
    sniff sniff...sniff...sniff...sniff.

    hmmm...

    I smell BS.

    your source is a forum post without a source. brilliant.

    still waiting for the stone cold nv to release some actual news.
  15. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,198 (11.43/day)
    Thanks Received:
    13,569
    Location:
    Hyderabad, India
[IMG]
    Bo_Fox says thanks.
  16. Bo_Fox

    Bo_Fox New Member

    Joined:
    May 29, 2009
    Messages:
    480 (0.26/day)
    Thanks Received:
    57
    Location:
    Barack Hussein Obama-Biden's Nation
I would strongly recommend using 4x4 SSAA (or a combination of 2x2 SSAA with 4x MSAA if you'd rather have the edges be as sharp as true 16x AA) by using the nHancer program. 2x2 also doubles the AF to 32x, and 4x4 quadruples it to 64x effective AF. This boosted image quality is pretty much as good as it can get for those older games, and 32x AA is pretty much useless for those simple-looking games with a few polygons. Sometimes SSAA can make things a tad bit blurry, but you could adjust the LOD to sharpen it, or just leave it at that. I usually find it looks better a bit blurry at 4x4 with those older games, so that the polygons do not stand out with sharp knife-edges in such a simple game. Half Life 2 would look more "cartoony" and artistic.. just overwhelmingly beautiful! Take a look at the pictures on http://www.nhancer.com/help/AASamples.htm
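The effective-AF and LOD numbers in the paragraph above follow from rendering at N times the resolution per axis. A small sketch of that arithmetic, assuming ordered-grid NxN supersampling and a 16x base AF setting (the standard -log2(N) sharpening bias is the usual rule of thumb, not anything nHancer-specific):

```python
import math

def ssaa_effects(grid_n, base_af=16):
    """Effective anisotropic filtering and LOD bias for NxN ordered-grid SSAA.

    Rendering at N times the resolution per axis multiplies the texel
    sampling rate by N, so the base AF level acts like N * base_af; the
    matching LOD sharpening bias is -log2(N)."""
    return grid_n * base_af, -math.log2(grid_n)

for n in (2, 4):
    eff_af, lod = ssaa_effects(n)
    print(f"{n}x{n} SSAA: effective AF {eff_af}x, LOD bias {lod:+.1f}")
```

So 2x2 SSAA turns 16x AF into an effective 32x (bias -1.0), and 4x4 into 64x (bias -2.0), matching the figures quoted above.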

    Then try using 8xAA instead of 4xAA. You will truly appreciate it on a 52" (especially if it's LCD, not plasma). The jaggies are still somewhat noticeable on my 24" screen when using 4xAA, during regular gameplay.

    40% faster than a GTX 295, but only 46% faster than a 5870? Well, a GTX 295 is more like 20% faster than a 5870, so the numbers do not make perfect sense yet. Perhaps it will actually be only 25% faster than a GTX 295, which is still good. It was so disappointing when the 5870 was only like 25% faster than a GTX 285, after we had such high expectations for a new generation.
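The sanity check in that paragraph is just chained ratios; spelled out below (the 1.20 GTX 295 factor is the poster's own estimate, not a benchmark result):

```python
# If the rumour says GF100 = 1.46 x HD 5870, and a GTX 295 is roughly
# 1.20 x HD 5870, then GF100 over GTX 295 is simply 1.46 / 1.20.
gf100_vs_5870 = 1.46    # rumoured
gtx295_vs_5870 = 1.20   # poster's estimate
gf100_vs_gtx295 = gf100_vs_5870 / gtx295_vs_5870
print(f"GF100 vs GTX 295: +{(gf100_vs_gtx295 - 1) * 100:.0f}%")  # about +22%
```

That works out to roughly +22% over a GTX 295, which is why the rumoured "at least 40% more than the GTX 295" line doesn't square with the 5870 figure.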

    ROFL!! A cool smiley! I'd like that added to the smilies list for TPU forums! :laugh:
    cadaveca and qubit say thanks.
  17. qubit

    qubit Overclocked quantum bit

    Joined:
    Dec 6, 2007
    Messages:
    9,821 (4.08/day)
    Thanks Received:
    3,479
    Thanks Bo, I didn't know about the nhancer utility. :toast:

    Also, I'll +1 you on that smiley.
  18. 20mmrain

    20mmrain

    Joined:
    Oct 6, 2009
    Messages:
    2,765 (1.59/day)
    Thanks Received:
    824
    Location:
    Midwest USA
I smell BS..... Or we could all wait two days and find out in Las Vegas. Sure, they might be displaying the card in SLI (possibly), but we could just divide from there and get an approximate answer.

I attribute the 25% faster speed over the GTX 285 to a couple of different reasons. For one, the 5870 didn't release at $650, unlike Nvidia's cards most likely will. Also, the performance of the last generation of cards on both sides was very impressive, and the fact that they got there that much quicker to begin with is amazing. It is very hard to double the performance every year. But even with that said, ATI vs. ATI, they did: the 5870 is almost as powerful as two 4870's.
So to double their performance and price the 5870 as they did, I don't think it is disappointing at all. Not to mention that they lowered the power consumption and heat too.

Now, with Nvidia's offerings, while I do think the GF100 will be more powerful than the 5870, I don't think it will double the performance of a GTX 285 either. Nor do I think it will be close to 50% more powerful than a 5870.

Remember, those are just rumors...... in all reality, the GF100 will be about as powerful as a single GTX 295, and the GTX 395 will probably be about 15% more powerful than a 5970. Kind of like last time.

There are always overhyped rumors like this every time a new card releases. Fermi will be awesome, but I don't expect it to be the greatest thing since sliced bread either!
  19. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (3.29/day)
    Thanks Received:
    1,751
    Location:
    PA, USA
    This post has been a long time formulating, and I welcome any criticisms.

How many of us have gotten at least 3 different claims as to the performance or release of this card? Cynically, I've decided that I'm not going to bat my eyelashes at any claims that come out of CES. There's bound to be a little more truth circling the bowl, but most people will excuse me if I assume the cycle of bullsh!t has yet to flush. I'm not sure if the majority of posters/readers will excuse my overall indifference, because that isn't very exciting. Likewise, it's not hard to speculate that NV may have a true performer to take a crown in 2010, but ATI has a firm place in this generation's line-up, which could mean good or bad things in the future. With the downturn of the global economy, there is enough of a depressant force present in a number of software companies to recycle old engines, or adopt some sort of broad design utility. The mainstream GPUs will see more action than chopsticks during Chinese New Year. I think it's wise to assume that it's getting dangerously close to a point in time where GPUs must offer stellar performance in a new API, because Microsoft not only authors the DX runtimes, but is also a console competitor. Realistically (and correct me if I'm wrong), they're going to merge development of their runtimes with console development. The paradigm shift will be when enough of the software industry is willing to move.

    If you accept any of these ideas then I offer a summary of my thoughts.

    -the GT300/GT100 series cards are going to take a crown in performance, but this generation will offer little more than a spitting contest between ATI/NV.
    -3D environment software development will become further compartmentalized, and game developers will buy into a smart, economic, standard before leaning head-on into a new API which is not yet mature/affordable in hardware support.
    -Microsoft (3v!L3) will most likely decide which generation of GPU will hold the standard for a life determined by their next console.

    I'm a bit off topic, and a little on topic. It's pretty obvious, but I figured this is a nice mix of topic all rooted around the importance of the GT300.
    Last edited: Jan 18, 2010
    20mmrain and cadaveca say thanks.
  20. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.26/day)
    Thanks Received:
    39
I agree; for 90% of games, PC games are limited to what is on the consoles, so powerful cards are just generally overkill at the moment, seeing as most games don't even need an 8800GTX yet (the same gen the 360 was derived from, IIRC). However, I expect MS is in full swing with regards to the Xbox 720 or whatever they call it, so I'd imagine it'll be DX11-derived.

That would mean DX11 cards are the way to go, and if the GTX 380 is an amazing card, it will likely suffice until the Xbox 1080 :) So the stagnant tech of the console market is a double-edged sword: the days of having to upgrade yearly are gone for high-end buyers, at the expense of fairly mediocre increases in graphical quality. Let's face it, apart from Crysis, are there really any games that push the GPUs at all? And if the 720 weren't to run on a DX11 derivative, why are bigger companies wanting to use DX11? I just don't buy the 'it's because we like PC gamers' ruse :)

    However, if the cryengine 3 turns out to be extremely scalable to the different platforms as it makes out, it may be the pc's salvation for a few years with regards to graphics. If it is adopted by more than just crytek of course, since all cross platform games are limited by the weakest console. The 360 didn't have any 'next gen' games until GoW because of the ps2, they were all just tarted up with textures and resolution boosts, kind of like what we're putting up with now.
    Last edited: Jan 6, 2010
  21. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    13,743 (4.56/day)
    Thanks Received:
    6,767
    Location:
    Edmonton, Alberta
    Personally, I think that this decision has already been made.

The rest of your comment resounds loudly in my head...but knowing that ATi hardware was used for development of DX9, DX10, and now DX11, I find it hard to believe that nVidia has any chance in the console market, as they've snubbed M$ too many times.

At the same time, these performance "samples" we have heard hint at either tight NDAs and fishing for leakers, an attempt to push ATi to finalize the R8xx refresh clockspeeds, or general nonsense from certain parties trying to "save face".

I find it hard to believe that the transistor density nV is going for will really excel so much further than ATi in graphics...gen after gen, nV has more inside, but the numerical difference does not equal an FPS difference, especially considering the extra logic for things like audio and tessellation, the ringbus, and the like, that ATi has been doing for some time now.

Hopefully Fermi will be everything hoped for, but I find it hard to remain optimistic...even though I want them to completely trounce R8xx, by like 40%. I think at most we might see 10-20%, based on the past.
    KainXS says thanks.
  22. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.26/day)
    Thanks Received:
    39
Nvidia will most likely develop for the PS4, unless they put the PSP out of business ^^
  23. E.M.R

    E.M.R New Member

    Joined:
    Jan 6, 2010
    Messages:
    16 (0.01/day)
    Thanks Received:
    1
How much are these cards going to cost?
  24. KainXS

    KainXS

    Joined:
    Sep 25, 2007
    Messages:
    5,600 (2.26/day)
    Thanks Received:
    501
Nvidia might just develop the chip for the next-gen PSP, you never know.

But will they develop for the next Xbox? . . . . I am very sure they won't.

It definitely isn't going to be a G300 or an HD6000, because by the time those consoles come out, those cards will be old.
  25. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.26/day)
    Thanks Received:
    39
ATi developed the 360 chip, and they worked a lot with M$ on DX11, so I'm assuming an HD5/6xxx-derived chip in the '720'. Apparently the 720 will be out in 2010/11, though, and it makes sense considering how old the 360 is now.
