Next Gen ATI vs. NVidia

Discussion in 'Graphics Cards' started by BloodTotal, Dec 24, 2012.

  1. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    6,460 (6.47/day)
    Thanks Received:
    2,188
    Location:
    Concord, NH
    Never had a problem with crossfire on my 6870s...
    How so? Please explain this, otherwise you're trolling.

    Honestly, there is practically no data about upcoming GPUs, so I don't think we can discuss much of anything on the topic without going straight to speculation. I think we should sit tight and see what happens.
     
  2. james888

    james888

    Joined:
    Jun 27, 2011
    Messages:
    4,539 (3.74/day)
    Thanks Received:
    1,691
    We have almost nothing to go on but speculation. I would guess AMD and Nvidia will be as close in performance next gen as they are this gen.

    Personally I won't be upgrading from my 7970 until maybe Maxwell/the AMD equivalent, if games can actually use that much power by then.
     
    Crunching for Team TPU
  3. buildzoid

    buildzoid

    Joined:
    Oct 29, 2012
    Messages:
    1,273 (1.76/day)
    Thanks Received:
    340
    Location:
    CZ
    From everything I have seen on specs so far, I'm pretty sure the AMD 8970 will be around 25% more powerful, because it has the same clock but 25% more compute units. I wish they had actually gone to 3072 stream processors for 50% more compute power.
    Nvidia, I have no idea. The GK110 would be cool, but I think it'd be too damn expensive because of the number of CUDA cores, and either way I hate the boost functionality on Nvidia cards.
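    The scaling arithmetic in the post above can be sketched quickly. This is a rough back-of-the-envelope sketch only: it assumes GCN's 2 FLOPs (one FMA) per stream processor per clock, the HD 7970's stock 925 MHz, and the rumored stream-processor counts — none of which are confirmed specs for the "8970":

    ```python
    # Peak-throughput sketch: GCN issues 2 FLOPs (one fused multiply-add)
    # per stream processor per clock, so peak GFLOPS = 2 * shaders * clock_GHz.
    # The 2560-SP "8970" figure is a rumor, not a confirmed spec.

    def peak_gflops(shaders, clock_ghz):
        """Theoretical single-precision peak: 2 FLOPs per shader per clock."""
        return 2 * shaders * clock_ghz

    hd7970  = peak_gflops(2048, 0.925)  # shipping HD 7970
    rumored = peak_gflops(2560, 0.925)  # rumored "8970": 25% more units, same clock
    wished  = peak_gflops(3072, 0.925)  # the 50%-more-units wish above

    print(f"HD 7970:    {hd7970:.0f} GFLOPS")
    print(f"+25% units: {rumored:.0f} GFLOPS ({rumored / hd7970 - 1:.0%} more)")
    print(f"+50% units: {wished:.0f} GFLOPS ({wished / hd7970 - 1:.0%} more)")
    ```

    Of course this is only the theoretical peak; as the next reply notes, real game performance rarely scales linearly with unit count, since memory bandwidth, clocks, and drivers all factor in.
    
    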
     
  4. EarthDog

    EarthDog

    Joined:
    Dec 31, 2009
    Messages:
    3,316 (1.89/day)
    Thanks Received:
    683
    If the compute units worked like that you would be right...

    (some) AMD cards have boost too. ;)
     
  5. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,254 (1.82/day)
    Thanks Received:
    304
    I think the prices of cards nowadays are way too high for what they deliver, and problems are mounting, with more and more people getting bad cards. I personally have had temperature problems with AMD and got some weird artifacts with a 9500GT from Nvidia.
    I know it's down to chance, but seriously, for what you pay they are totally not worth it.

    Prices rise and you get less. That is what I mean by disappointing.
    Performance gains are slowing down, and what I see is a dark future, since you can already run a lot of games perfectly well on the card I have.

    Simply put, the more money they make, the darker the future of graphics will be.

    This just means we are already in the dark GPU era.

    For a ~200 mm² chip you pay $500 or even more, which is simply awful.
    The real price of these cards should be around $120 from my point of view (covering only the hard work).

    If you compare, for example, Nvidia's Tegra and the Mali-400, the difference is simply stupid, because the whole difference is HDMI support, nothing more. So I can buy a $70 tablet with the same quality (graphics, but other stuff too) as a $200 tablet or more.
    It just breaks your heart how many consumers lose money without knowing what they are doing.

    Note: I remember estimating the iPad's real price at about $80, and it turns out its build materials really do cost around that. So the tablet is actually a cheap thing, and it is not only Nvidia or AMD who make the pricey stuff, but all hardware companies. Here it sells for $1000, and 1000:80 is a big ratio, so whenever I see a tablet like that, I will immediately think the buyers are stupid.
     
    Last edited: Dec 27, 2012
  6. Graogrim New Member

    Joined:
    Jan 1, 2008
    Messages:
    308 (0.12/day)
    Thanks Received:
    31
    Location:
    East Coast US
    I don't foresee myself upgrading substantially before 2014. Granted, something unexpected and amazing could come out of nowhere on the hardware front, but what besides Crysis 3 (which reasonable existing hardware should handle competently anyway) is going to push it?
     