
Two R700s Churn-out X12515 in 3DMark Vantage

Discussion in 'News' started by btarunr, Jun 30, 2008.

  1. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.76/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    but to my wallet ... 4 < 3 ...
     
  2. Voyager

    Joined:
    Jun 18, 2008
    Messages:
    23 (0.01/day)
    Thanks Received:
    2
    :toast: Great!

    Will there be any 4600 series?
     
  3. zOaib New Member

    Joined:
    Sep 23, 2005
    Messages:
    985 (0.29/day)
    Thanks Received:
    32
    Location:
    FL
    4 = $1,000

    3 = $1,950

    If you get four for $950 less, isn't that a bargain? Plus you don't have to own your own power-generating station for the triple SLI.

    What I don't understand is people saying it isn't fair to use two GPUs to beat a single GPU. Well, I've got one word (actually it's a sentence): if my V8 costs less and beats an overpriced V6, which one should I get?
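    For what it's worth, the totals above check out; a quick sketch, assuming the per-card prices implied by them (roughly $250 per HD 4870 and $650 per GTX 280 — assumed figures, not stated in this post):

    ```python
    # Sanity check of the quoted totals, using the per-card street prices
    # implied by them (assumed: ~$250 per HD 4870, $650 per GTX 280).
    quad_crossfire = 4 * 250   # four HD 4870s
    triple_sli = 3 * 650       # three GTX 280s

    print(quad_crossfire)               # 1000
    print(triple_sli)                   # 1950
    print(triple_sli - quad_crossfire)  # 950, the savings mentioned above
    ```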
     
  4. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.88/day)
    Thanks Received:
    340
    Location:
    Your house.
    Yeah, what about us midrange people? :laugh:

    Seriously, I was actually going to CF two 3650s together just for the hell of it, but only if I can get a 3650 really cheap one day.
     
  5. DanTheBanjoman Señor Moderator

    Joined:
    May 20, 2004
    Messages:
    10,553 (2.73/day)
    Thanks Received:
    1,383
    That, and bta's financial argument, are bad for NV. They can't compete on power consumption or price. However, that 9800 GTX+ doesn't look bad, so I guess NV will be fine in the segments where the money is.
    As long as NV markets their card as the fastest and responds to the X2 with "but those are two cards", the world will still fall for it. And last time I checked, NV is better at marketing than ATI/AMD.

    Also, it's "capisce"; since it isn't the friendliest choice of words, you should at least spell it correctly.
     
  6. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.51/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    :toast: :roll:

    Which one would I rather buy: a $500 card (AA) that completely beats a $650 card (B) that barely beats a $300 card (A)? AA >> B > A, so $500 >> $650 > $300. It's a paradox that will never make sense, unless NV lowers their prices or people just blow sick amounts of cash on names instead of researching first. That's why kids can't afford things of this nature...
     
  7. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,965 (11.00/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    Even in the mainstream segments, NV is a bad deal. Sure, for $230 you get a 9800 GTX+, but for $69 more you get an HD 4870 that equals/beats the GTX 260 and the 9800 GX2 according to some reviews.

    Something hints that this 2x R700 bench was run on a machine running a Phenom X4.
     
  8. DanTheBanjoman Señor Moderator

    Joined:
    May 20, 2004
    Messages:
    10,553 (2.73/day)
    Thanks Received:
    1,383
    There's always the "for $x more you can have *insert item*" argument.
    Besides, $69 is 30% more, not exactly "a bit" more. Considering it isn't anywhere near 30% faster than a 9800 GTX, and the + is faster than the regular GTX, I hardly believe NV is that far behind.
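    The 30% figure is straightforward to verify; a quick sketch using the prices quoted earlier in the thread ($230 for the 9800 GTX+, "$69 more" for the HD 4870):

    ```python
    # Price premium of the HD 4870 over the 9800 GTX+, from the quoted prices.
    gtx_plus = 230          # 9800 GTX+ price (USD), as quoted in the thread
    hd4870 = gtx_plus + 69  # "for $69 more"

    premium = (hd4870 - gtx_plus) / gtx_plus
    print(f"{premium:.0%}")  # 30%
    ```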
     
  9. VanguardGX

    VanguardGX New Member

    Joined:
    Jun 30, 2008
    Messages:
    27 (0.01/day)
    Thanks Received:
    5
    Location:
    Jamaica
    Why do people use the fact that the X2 is a dual-chip card as if it's a bad thing? So what if it's a dual-chip card? It's gonna be faster than the GTX 280, not to mention cheaper.
    Another thing: people keep saying ATI will lead until the green team makes a 280 GX2. C'mon people, let's be serious, that's not gonna happen, well, not this generation. Do you want a GPU that burns 400+ watts? Didn't think so.
     
  10. MoA New Member

    Joined:
    Jun 24, 2008
    Messages:
    27 (0.01/day)
    Thanks Received:
    0
    Hahah, pretty obvious answer:
    because they need a reason to believe Nvidia is better :p
     
  11. vojc New Member

    Joined:
    Mar 29, 2008
    Messages:
    85 (0.03/day)
    Thanks Received:
    9
    It's what Intel does with quad cores, just sticking 2 + 2 cores together, and AMD/ATI does the same in the graphics market.
    Nvidia, on the other hand, doesn't know how to do a single-board dual GPU, so they can only stick two boards together.
     
  12. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,965 (11.00/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    It's not that they don't know. Never underestimate the engineering prowess of NVIDIA. It's just that the power and thermal characteristics of their GPUs don't allow sticking two of them onto one board.
     
  13. 0o0o0 New Member

    Joined:
    Apr 24, 2008
    Messages:
    6 (0.00/day)
    Thanks Received:
    0
    Uhum,
    do you really think Nvidia is going to make a GTX 280 X2? With 2x 500 mm² on one card you'd need an extreme fan to keep them cool; with a standard fan they could reach 100-110°C.
    And uh, one GTX 280 costs $650; a GTX 280 X2 would be $1,200 or so. Who would buy that?

    Congratz AMD, nice job :respect:
     
  14. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,132 (5.27/day)
    Thanks Received:
    1,457
    Location:
    Oklahoma T-Town
    http://www.xtremesystems.org/forums/showthread.php?t=191313
    k|ngp|n did that on all air. That's 3 GPUs vs. 4 GPUs. If Nvidia does come out with a dual card again, I think it could really put a hurting on the X2. (This was before the PhysX drivers, I think.)


    Then he turned around and did this. Wow.

    Maybe on the next die shrink.
     
  15. Morgoth

    Morgoth

    Joined:
    Aug 4, 2007
    Messages:
    3,795 (1.41/day)
    Thanks Received:
    250
    Location:
    Netherlands
    Now try to beat HD 4870 X2s in CrossFire with a Bloomfield at 4 GHz ;)
     
  16. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,965 (11.00/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    I wonder what a brutal overclocker such as k|p can do with these cards on an Intel platform. In similar publications by both TG Daily and Tom's Hardware, the slide (in the first post) uses the resource name "3dmarkonnextgenphenom", leading me to guess they ran it on a Bulldozer :confused:

     
  17. mlupple

    mlupple New Member

    Joined:
    Jun 27, 2008
    Messages:
    209 (0.09/day)
    Thanks Received:
    13
    Location:
    USA
  18. VanguardGX

    VanguardGX New Member

    Joined:
    Jun 30, 2008
    Messages:
    27 (0.01/day)
    Thanks Received:
    5
    Location:
    Jamaica
    Last edited: Jun 30, 2008
  19. DanTheBanjoman Señor Moderator

    Joined:
    May 20, 2004
    Messages:
    10,553 (2.73/day)
    Thanks Received:
    1,383
    For someone with two posts, one being fanboyism and the other a direct insult towards another member, you deserve a warning. Consider this it.
     
  20. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,132 (5.27/day)
    Thanks Received:
    1,457
    Location:
    Oklahoma T-Town
    When you compete in 3DMark, score is all that matters, not how much it costs. :)

    It would still cost a thousand dollars for two X2 cards, which is also a lot of money.

    The X2 is going to be a great card for the money, but when you want more than that, the 280 will be the way to go. I know it's considered a single card, but the fact is that it's 3 cores vs. 4.
     
  21. farlex85

    farlex85 New Member

    Joined:
    Mar 29, 2007
    Messages:
    4,829 (1.71/day)
    Thanks Received:
    638
    I don't think so, because as far as I know you still can't put four single cards in SLI. This means maxed-out ATI becomes the top dog (and costs much less doing so). The only way Nvidia can reclaim the performance crown now is if and when they are able to make a dual GT200, which may be a while.
     
  22. Morgoth

    Morgoth

    Joined:
    Aug 4, 2007
    Messages:
    3,795 (1.41/day)
    Thanks Received:
    250
    Location:
    Netherlands
    Lol, if that were true, then it's also true that Nehalem is again a superior architecture.
     
  23. DaMulta

    DaMulta My stars went supernova

    Joined:
    Aug 3, 2006
    Messages:
    16,132 (5.27/day)
    Thanks Received:
    1,457
    Location:
    Oklahoma T-Town
    I don't know if they can take the top 280 score in Vantage as it stands now. Yes, I think it will be a while before, if ever, we see a dual card with the 280. But they do have loads of cash they could throw at it, and they could have it on the market in record time.
     
  24. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,965 (11.00/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    Go to the TG Daily link in the first post. The Phenom 9950 manages an 11,000-odd Extreme CPU score. DaMulta's YF manages 40,000 in the Performance preset.

    I bet this would have been a better bench if they had run it on an Intel setup, though I'm just guessing they didn't.
     
  25. farlex85

    farlex85 New Member

    Joined:
    Mar 29, 2007
    Messages:
    4,829 (1.71/day)
    Thanks Received:
    638
    That's PhysX causing the 40k CPU score; Intel or AMD doesn't have as much to do with it. Without PhysX he still gets like 18k on the CPU score, though. I get 30k in Vantage CPU with a 6750 with PhysX. :laugh:
     
    Last edited: Jun 30, 2008

