Is GTX 480 / Fermi a game changer for you?

Discussion in 'TPU Frontpage Polls' started by W1zzard, Mar 27, 2010.


Poll closed Jun 29, 2010.
  1. Yes, competition is great for the customer

    1,470 vote(s)
    9.7%
  2. I can't pick a clear winner

    448 vote(s)
    2.9%
  3. I got tired of waiting and bought an ATI card

    1,940 vote(s)
    12.8%
  4. Power/Heat/Noise is important, no Fermi for me

    7,651 vote(s)
    50.3%
  5. I'm happy with what I have

    2,227 vote(s)
    14.7%
  6. I'm going to buy one

    744 vote(s)
    4.9%
  7. Yes, it's the better card

    720 vote(s)
    4.7%
  1. Black Hades

    Black Hades

    Joined:
    Sep 11, 2007
    Messages:
    300 (0.12/day)
    Thanks Received:
    31
    Location:
    Ambugaton
    I know that constantly waiting for the next big thing won't lead anywhere, especially with video cards, which get a new cycle every 6 months or so. But I DO think that waiting for 2nd gen DirectX 11 may not be a bad idea if you think about it.

    A funny parallel to the automotive industry, and I'm sure it's been used before, but with this generation of video cards it applies: ATi as the Japanese supercars vs. nVidia as the gas-guzzling American muscle cars...

    Edit:
    Power consumption/heat aside, and coming from an ATi user: you people DO see from all the reviews that Fermi is the DX11 king... right?
    That's something that in the long run is quite relevant, especially if ATi/AMD won't grasp the same level of parallelization/modularity by the time 2nd gen DX11 cards are out. I keep my fingers crossed for that one. I honestly hope neither ATi nor nVidia gets the upper hand for too long from now on.

    Not too long ago I heard you guys bicker about how nVidia had no DX11 and no GDDR5 and so on and so forth. Well, now that Fermi (seems) to have achieved its goal of dominating next-gen techs like tessellation, you STILL bash it around a bit... remember the 8800 Ultra's power consumption/temps when it came out?

    On the other hand I'm a bit uneasy myself thinking about buying a card that adds up to 50% (if not more) of the power consumption of the entire PC.
     
    Last edited: Mar 28, 2010
  2. Frick

    Frick Fishfaced Nincompoop

    Joined:
    Feb 27, 2006
    Messages:
    10,809 (3.41/day)
    Thanks Received:
    2,350
    I'm quite happy with what I've got.

    With that said, I'm still eager to see this in CUDA action, and under some heavy overclocking with pots and whatnot. ;)
     
  3. MikeX New Member

    Joined:
    Jun 15, 2006
    Messages:
    125 (0.04/day)
    Thanks Received:
    10
    Not to mention how far we can overclock the 58xx cards, up to a GHz. On nVidia's side, the drivers for the new antialiasing techniques have improved a lot. If ATi manages to pull off good drivers anytime soon (i.e. new antialiasing techniques), Fermi is going to drop in price sooner or later anyway.
    Either way, buying a new card is now just about prices; unless you're researching how far you can clock-mod the card :roll:.
     
  4. CDdude55

    CDdude55 Crazy 4 TPU!!!

    Joined:
    Jul 12, 2007
    Messages:
    8,179 (3.07/day)
    Thanks Received:
    1,277
    Location:
    Virginia
    Definitely not a game changer. It doesn't offer a significant performance boost over the 5 series, while adding more heat, power consumption and price. Keep in mind, though, this could all change with the drivers; it's still too early to make a call on these cards since they're still in a very premature state. Those with a 5 series card have nothing to worry about right now, but as the drivers improve, that could change. I may get a Fermi later on.
     
  5. pantherx12

    pantherx12 New Member

    Joined:
    Jan 2, 2009
    Messages:
    9,714 (4.57/day)
    Thanks Received:
    1,699
    Location:
    ENGLAND-LAND-LAND

    Guess you've not been looking around the net; the 5870 may as well be tickling the 480's balls, it's so close behind it in FPS in the new AvP under DirectX 11.

    It depends on game/bench mark etc.
     
  6. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.08/day)
    Thanks Received:
    25
    Really? Antialiasing? How many people play with 32xAA? Even 8xAA is overkill.
    Antialiasing is a little dirty trick to fool some people into buying, like the memory: here you go, a GTS250 with 2 GB of RAM, wow, it must be good. And DX12 with gold-plated stickers that improve performance and productivity; it doesn't even have to make sense.
    Or CUDA, to bring you the most powerful computing shit you've ever seen, it will blow your head off and paint the walls with your brains in real time and with real-world physics. Translate the bullshit: CUDA means some video/photo editing software will be accelerated to a very limited degree (encoding, with some limitations and problems that might discourage professionals from using it, and some effects rendering), and PhysX means some usually bad games with exaggerated physics effects.
    With ATi being adopted so widely, I think developers will think twice before accepting nVidia's help.
     
    saikamaldoss says thanks.
  7. saikamaldoss

    saikamaldoss

    Joined:
    Jan 8, 2009
    Messages:
    200 (0.09/day)
    Thanks Received:
    27
    Location:
    India (Chennai)
    What do you mean by this? I don't want to assume :)
     
  8. saikamaldoss

    saikamaldoss

    Joined:
    Jan 8, 2009
    Messages:
    200 (0.09/day)
    Thanks Received:
    27
    Location:
    India (Chennai)
    8x is overkill for your 4850 and my 5770, but not for a 5870 or a 480. For a 5870, 8x is like running 2x on our cards, man... :)
     
    Last edited: Mar 28, 2010
  9. pantherx12

    pantherx12 New Member

    Joined:
    Jan 2, 2009
    Messages:
    9,714 (4.57/day)
    Thanks Received:
    1,699
    Location:
    ENGLAND-LAND-LAND

    I think what he means to say is, you can't see the difference :laugh:
     
  10. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,821 (13.16/day)
    Thanks Received:
    14,196
    8X MSAA is perfect for me. I can easily tell a difference. Add super sampling and it looks even better.
     
  11. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,380 (2.55/day)
    Thanks Received:
    1,230
    nay.


    Maybe next spin, and I don't mean PR bullshit fanboi spin.
     
    10 Million points folded for TPU
  12. Fishymachine New Member

    Joined:
    May 25, 2009
    Messages:
    208 (0.10/day)
    Thanks Received:
    9
    Exactly, and there's an even bigger turn-off.
    For a bit more than a single GTX480 you can buy two MSI R5850s http://www.newegg.com/Product/Product.aspx?Item=N82E16814127500&cm_re=5850-_-14-127-500-_-Product (or at home http://www.emag.ro/placi_video/plac...0-1024mb-ddr5-256bit-pci-e--pR5850TwinFrozrII) and clock them to 1050 MHz, a frequency at which a single one should beat a 480 in most titles.
    As for a GTX495... even if they cherry-pick the chips it'll have a TDP of 380W :twitch: and will need a Norse god not to die in the summer.
     
  13. Sasqui

    Sasqui

    Joined:
    Dec 6, 2005
    Messages:
    7,731 (2.38/day)
    Thanks Received:
    1,462
    Location:
    Manchester, NH
    Those poll choices are rather subjective, but isn't everything? :)

    Let's see where the prices lie in a few months and whether the apparent pre-release issues have been ironed out. I'll decide then, not now. However, since I already spent $400 on a 5870, I'm not likely to budge! :laugh:
     
  14. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.24/day)
    Thanks Received:
    3,778
    You do realize that nVidia still has the largest market share, right?
     
  15. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,380 (2.55/day)
    Thanks Received:
    1,230
    Of DX11 cards? Really? They did have readily available inventory. :toast:
     
    10 Million points folded for TPU
  16. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.24/day)
    Thanks Received:
    3,778
    No, of cards period. DX11 doesn't matter right now, as that user base is still relatively small. DX9 and DX10 are still the most relevant APIs. Devs will still accept nv help because they have the larger user base. That's the point I'm trying to make.
     
  17. Dunceiam New Member

    Joined:
    Mar 29, 2010
    Messages:
    3 (0.00/day)
    Thanks Received:
    3
    I actually just registered on these forums to respond to this post, the ignorance annoyed me that much.

    Anti-aliasing makes a huge difference, and by no means is it a 'dirty trick'. Just take this image as an example (the image is 8xAA). You do notice those jagged edges, right? What's the purpose of having incredibly graphically intense games but jagged edges everywhere?

    And regarding the graphics card's onboard memory: for every frame, a graphics card has to produce 2,073,600 pixels for a 1080p image; once you raise this to 2560x1600, that number becomes 4,096,000 pixels. Say you're gaming at 60fps; that's 245,760,000 pixels every second. Each of these needs to be stored in the card's RAM, and the queue of upcoming commands needs to be stored in the onboard RAM too. You may not be gaming on a 24"-30" display, but others are, and they require the extra memory. Also consider CUDA, and to take a wild example, video editing. Processing video requires massive amounts of RAM and computing power; a 1GB card would be much slower than an equivalent 2GB card when it comes to video rendering. So no, it's not a damned marketing gimmick, it's an extremely useful trait of the newer graphics cards.
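    The pixel arithmetic above can be sanity-checked directly; here's a quick Python sketch (hypothetical helper names, just re-deriving the quoted figures):

```python
# Re-derive the pixel-throughput figures quoted above.

def pixels_per_frame(width: int, height: int) -> int:
    """Pixels the GPU must produce for a single frame at this resolution."""
    return width * height

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Total pixels pushed per second at a given frame rate."""
    return pixels_per_frame(width, height) * fps

print(pixels_per_frame(1920, 1080))       # 2073600  (1080p)
print(pixels_per_frame(2560, 1600))       # 4096000
print(pixels_per_second(2560, 1600, 60))  # 245760000
```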

    My God, have you even used CUDA in your lifetime? CUDA is for massively parallel problems. (Simplifying everything down: each of the GPU's shader cores is equivalent to a CPU 'core', essentially meaning the GPU has hundreds of low-power cores, allowing it to do a repetitious task hundreds of times faster than a CPU.) CUDA isn't even for physics; hell, CUDA isn't even on the graphics card. CUDA is an architecture allowing programmers to program FOR the GPU.
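    To make the "hundreds of cores doing a repetitious task" point concrete, here's a toy sketch of the data-parallel model CUDA exposes. This is plain Python executed serially, not real CUDA; on a GPU, each index would run on its own hardware thread:

```python
# Toy model of a SAXPY kernel (a*x + y), the classic data-parallel example.
# On a GPU every index i would execute concurrently on its own thread;
# here we just loop over the indices to show the per-element structure.

def saxpy_element(i: int, a: float, x: list, y: list) -> float:
    """What a single GPU 'thread' would compute: one element of a*x + y."""
    return a * x[i] + y[i]

n = 5
a = 3.0
x = [1.0] * n
y = [2.0] * n

result = [saxpy_element(i, a, x, y) for i in range(n)]
print(result)  # [5.0, 5.0, 5.0, 5.0, 5.0]
```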

    CUDA (or, more generally, GPGPU computing) is the absolute future, no question about it. By no means does it accelerate things only to a 'limited degree'; it speeds things up by massive magnitudes.

    An example:
    Virus ion placement: 110 CPU-hours (QX6700)
    Same calculation now takes 1.35 GPU-hours (8800GTX)
    27 minutes if three GPUs are used concurrently
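    For what it's worth, the speedups implied by those numbers can be worked out directly (the timings themselves are as quoted, not re-measured here):

```python
# Implied speedup from the quoted virus ion placement timings.
cpu_hours = 110.0  # QX6700, as quoted
gpu_hours = 1.35   # single 8800GTX, as quoted

speedup = cpu_hours / gpu_hours
print(round(speedup))  # ~81x on one GPU

# Three GPUs running concurrently, assuming near-linear scaling:
minutes_on_3_gpus = round(gpu_hours / 3 * 60, 1)
print(minutes_on_3_gpus)  # 27.0, matching the quoted figure
```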

    This is straight from Dr. Dobbs CUDA article:

    GeForce 8800GTX w/ CUDA 1.1, Driver 169.09
    Speedup vs. Intel QX6700 CPU

    Fluorescence microphotolysis - 12x
    Pairlist calculation - 10x to 11x
    Pairlist update - 5x to 15x
    N-body cutoff force calculations - 10x to 20x
    Cutoff electron density sum - 15x to 23x
    Cutoff potential summation - 12x to 21x
    Direct Coulomb summation - 44x

    Keep in mind, that's on an 8800GTX versus a QX6700, the GTX480 is much more powerful than the 8800, and so expect those numbers to double.

    Did you just call Batman: Arkham Asylum a bad game? That's just pure nonsense.

    I personally believe you need to get informed, and stop being a damned fanboy.

    (And to subdue the inevitable reply of, "You're just an Nvidia fanboy," I'll say it now: I've only owned ATI; my first card was a 4850, then a 4870, then a 4870x2.)

    Anyway, I apologize for my long post and thread hijacking, I just couldn't sit back and let the misinformed preach their ludicrous beliefs without some combat.
     
    NastyHabits, shevanel and Wile E say thanks.
  18. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.24/day)
    Thanks Received:
    3,778
    Wow. Excellent post. Welcome to the forums.
     
  19. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.08/day)
    Thanks Received:
    25
    It depends how big nVidia's market share is. If it's 90% of discrete cards, then ATi is fuc.ked and developers will play how nVidia wants; if it's 50.1%, then developers will think twice before accepting the devil's contract, since they'd be giving up almost half the market.
    Also you have to consider that most people don't buy games, they use pirated copies, and the ones who don't buy games don't really buy high-end video cards. So if a game developer does some research, they'll discover there's a big chance the people who bought an HD5850, HD5870 or HD5970 are likely to buy a game.
    But why talk with no numbers? Let's take Valve's hardware survey:
    http://store.steampowered.com/hwsurvey/
    No matter the graphics card (DX8, 9, 10, 11):
    nVidia 61.88% and ATi 30.92%
    Now if we talk about recent cards:
    http://store.steampowered.com/hwsurvey/videocard/
    This is not sorted for easy viewing, but read through it all and you get the picture: nVidia still has a big part of the market share, but with weak cards. The 8600 is considered a DX10 GPU, but I doubt it can run any recent game with full details. Or the 8800 cards, come on, most of those are probably 8800GT or GTS 320/640 cards, good cards but getting a bit old now and not suited for future games. Also the 9600 cards, do you see Metro 2033 running on a 9600GT with full details?
    On the other hand we see a good percentage of 4800 cards, and I bet most of them are 4850s, which are quite good; a good deal of 4870 cards can be in that percentage too. Comparing these cards to the 8800, 8600 and 9600 cards from nVidia, you get the picture of how the market looks for developers who want some eye candy in their games.
    I overlooked the 9800 cards, which are good, so that's good for nVidia, but still not enough for smart developers to throw away their work just so nVidia gets more sales.
    Take a look at the GTX 200 series, not a very successful generation against the 4800s from what I see in those charts.
     
  20. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.24/day)
    Thanks Received:
    3,778
    Recent card sales don't matter. The power of the card does not matter. All that matters is installed user base, and nVidia has a significantly higher installed user base. The majority of all gamers are still using DX9- and DX10-only capable systems, so devs still have to program for them. This makes most of those lower end cards still entirely relevant. Nobody ever said all gamers run max settings. In fact, quite the opposite is true: most gamers are just happy to run the games, period. Thus, devs are still going to take advantage of nVidia's free help whenever they can.

    You seem to be forgetting this is a heavy duty enthusiast site. We DO NOT represent the majority of PC gamers.
     
  21. pantherx12

    pantherx12 New Member

    Joined:
    Jan 2, 2009
    Messages:
    9,714 (4.57/day)
    Thanks Received:
    1,699
    Location:
    ENGLAND-LAND-LAND
    WAHH!?

    God damn, where are you getting this info from, Leonard? I'm pretty certain most people still actually buy their games.

    Look at the sales of Modern warfare 2 for example.
     
  22. TIGR

    Joined:
    Aug 17, 2008
    Messages:
    2,183 (0.96/day)
    Thanks Received:
    1,029
    Location:
    Minnesota, USA
    +1

    nVidia is probably going to be in a good position with their Fermi-based offerings in a year, but right now, I am simply disappointed.

    Curious to see Folding@home PPD though.
     
  23. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.08/day)
    Thanks Received:
    25
    I'm talking about games that accept PhysX and nVidia's whole deal to punish ATi. If you look at successful games that really make money, they have nothing to do with PhysX; they make the game for everyone to enjoy with relatively average/good hardware. Examples:
    WoW, Warhammer 2, BC2, Dragon Age Origins, TF2, GTA4, COD MW2. I don't see PhysX in these games, or ATi being excluded or running the game badly compared to nVidia. These are smart developers who don't give a fu.ck about nVidia's agenda.
    On the other hand we have some games that pass through our computers very briefly, like Crysis, Crysis Warhead, Cryostasis, Metro 2033. These games have absurdly high requirements for their respective times of launch and don't really make money like COD MW2 or BC2. I could call them tech demos, but they're not quite that.
    I'm talking about the developers who made these games, who will think twice before accepting nVidia's help. How much money do they get from nVidia to make up for losing a lot of potential customers (ATi owners) who have the hardware to run the game if it's properly optimized?
    Only one game with PhysX was successful, Batman, and I bet you they lost some ATi customers because they included PhysX.
    I see the present mainstream as something in the range of HD4870/HD4890/5770/GTX260/GTX275/GTX285. Who do you think has more market share here? In the high end I see the 5850/5870/5970/GTX295 and that's it; the GTX480 isn't even in stores and I don't think it will catch up to anything. So if ATi has more market share in the segment where more performance is available, where details can be activated and more eye candy shown, why would developers favor nVidia?
    Also, nVidia doesn't give free help; only stupid developers can see it as free. They come "helping", turning an easy-on-hardware game into a resource-hungry game that demands a better graphics card from customers, and this makes sales bad for the developer. They also force them not to cooperate with ATi. Don't give me statements from nVidia now, they are about as credible as the charts they give to the press. Again, the dev loses potential buyers. Look at the Metro 2033 requirements:
    Recommended:

    Any Quad Core or 3.0+ GHz Dual Core CPU

    DirectX 10 compliant graphics card (GeForce GTX 260 and above)

    2GB RAM

    and the optimum :D
    Optimum:

    Core i7 CPU

    NVIDIA DirectX 11 compliant graphics card (GeForce GTX 480 and 470)

    As much RAM as possible (8GB+)

    Fast HDD or SSD

    Do you think these people will make money off this game? Do you think the fellow with an 8800 will go buy this game? Or the fellow with the 9600GT? Even the one with a GTX260 will have a hard time convincing himself to buy this game after what he's read on the internet.
    It's a game that shines only when you activate all the eye candy; it sets the mood better, but it's not a great gameplay game like COD MW2 or BC2.
    Not all nVidia deals can be so hurtful to customers who have ATi cards. Some games accept nVidia's help on their own terms, like "help us and we'll put your logo there" and that's it. Those are few, and they can deal like this because they have a good game on their hands.
     
    Last edited: Mar 29, 2010
  24. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.08/day)
    Thanks Received:
    25
    I missed this post. I'm gonna answer the nVidia fanboy; while it's best to ignore them, I chose not to.
    It is a dirty trick when you come up with some absurd 32xAA. It's not usable in games that put up decent graphics; I wanna see you run Crysis or the recent Metro with 32xAA, or even the Batman you mentioned. Oh, you can't? But why? So in practice that feature won't be touched by most users. When I said 8xAA is overkill, I meant it doesn't make such a massive difference. Sure, you can see it, but are you playing the game or staring at jagged lines? In games that pack some action you rarely stop to look at lines. It's a stupid argument only a fanboy would give: look, nVidia can do 100xAA, better than ATi, me stupid fanboy gonna take whatever bullshit nVidia gives me.
    bla bla bla
    Dear fanboy, the GTS250 is a weak GPU that can't possibly use 2 GB of memory. That much memory can be helpful for an HD5970 or a GTX480, but not for a GTS250. If you don't understand this then the discussion ends here.
    Also, for video editing you don't require the memory to be on the video card; system memory is fine too. It's not like a game, where you have to run at 60 fps minimum and need the data in the video card's memory to avoid swapping. Video content is easy compared to what games demand from a video card, and the speed at which they demand it.
    It's mind-blowing what you say there. Obviously you haven't touched video editing, or you'd see how little CUDA means. Actually ATi has a FREE plugin that speeds up Adobe Premiere and encodes with the GPU, but it needs the whole system to be AMD, so praise nVidia for not doing the same.
    bla bla bla
    Why do I care for those figures? Does every nVidia fanboy split the atom? Research the cure for cancer? Build rockets? Sure, sure, you have a rocket in your computer but you can barely run a game. I don't care how many billions of operations nVidia can do; why can't the GTX480 beat the HD5970 in games?




    Wile E, are you registering with another name so you can flame me?
     
  25. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,386 (1.90/day)
    Thanks Received:
    1,618
    Location:
    Glasgow - home of formal profanity
    I sense great anger in this thread....

    Does CUDA not actually stand for 'Compute Unified Device Architecture'? Which really means it's a design formatively conceived to do 'everything', i.e. a single processor which contains the required hardware and software to 'multi-task' more effectively?

    And GPGPU is balls as a title, as it stands for 'General Purpose Graphics Processing Unit'. Yawn.

    And this is why Fermi is the letdown it is. It was conceived to do 'everything', which, given the 40nm process it's manufactured on, it just can't do properly. Jen-Hsun Huang (or whatever) wanted to develop Fermi as an HPC component first and a graphics unit second. One card with two purposes: a brave idea, but not yet effectively possible.

    It will happen, but 'when' is the question, and will ATI (who do high end cards too) get there first? What will Northern Islands hold in store? Ooh, I can't wait for all the fighting to start again in 2011. Oh, hold on, people are still fighting.

    lol.
     
