
Possible Specifications of the GeForce GTX 350 Emerge

Discussion in 'News' started by btarunr, Jul 19, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,705 (11.16/day)
    Thanks Received:
    13,667
    Location:
    Hyderabad, India
    Hardspell has released a list of possible specifications for the GeForce GTX 350 graphics processor (GPU):


    • NVIDIA GeForce GTX 350
    • GT300 core
    • 55 nm process technology
    • 576 mm² die area
    • 512-bit GDDR5 memory controller
    • 2 GB GDDR5 memory, double that of the GTX 280
    • 480 stream processors
    • 64 raster operation units (ROPs), same as the GTX 280
    • 216 GB/s memory bandwidth
    • Default clock speeds: core 830 MHz, shader 2075 MHz, memory 3360 MHz (effective)
    • Pixel fill-rate: 36.3 Gpixels/s
    • Texture fill-rate: 84.4 Gtexels/s
    • DirectX 10, no DX 10.1 support yet
    Source: Hardspell
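
    For what it's worth, the quoted bandwidth figure is at least self-consistent: a quick back-of-the-envelope check (Python, using only the bus width and effective memory clock from the rumored list above) lands within a rounding error of the claimed 216 GB/s.

```python
# Sanity-check the rumored GTX 350's memory bandwidth from the
# bus width and effective GDDR5 clock given in the spec list.

bus_width_bits = 512        # rumored memory interface width
effective_clock_mhz = 3360  # rumored effective GDDR5 data rate

# bandwidth (GB/s) = bytes transferred per clock * effective clock in GHz
bandwidth_gbs = (bus_width_bits / 8) * (effective_clock_mhz / 1000)

print(f"{bandwidth_gbs:.2f} GB/s")  # 215.04 GB/s, ~the quoted 216 GB/s
```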
     
    Last edited: Jul 19, 2008
    calvary1980 says thanks.
  2. CrackerJack

    CrackerJack

    Joined:
    Dec 13, 2007
    Messages:
    2,707 (1.08/day)
    Thanks Received:
    450
    Location:
    East TN
    calvary1980 says thanks.
  3. indybird

    indybird New Member

    Joined:
    Sep 24, 2007
    Messages:
    176 (0.07/day)
    Thanks Received:
    9
    I dunno...
     
    calvary1980 says thanks.
  4. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.52/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    But now it is news.

    That is one massive (fill-in-the-blank). It shows NV's mighty will to keep shelling out these uber GPUs while the dual monsters are still rearing in the drink.
    That's just too respectable. Now if only they would stop charging so much for it, but I think they learned their lesson when it comes to that, hopefully :pimp:
     
    calvary1980 says thanks.
  5. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,705 (11.16/day)
    Thanks Received:
    13,667
    Location:
    Hyderabad, India
    Wolf is not a news poster. It's fun when you post news in the forum, but even more fun when you submit it. Sure, there's been a delay, but there's very little chance of us missing something; sooner or later, we post everything. So we're not an incompetent news staff. Submit news instead of posting it in the forums; as the forum guidelines state, leave news posting to the staff.
     
    calvary1980 says thanks.
  6. indybird

    indybird New Member

    Joined:
    Sep 24, 2007
    Messages:
    176 (0.07/day)
    Thanks Received:
    9
    If it is $600 or more, then this will not sell. So is this essentially a doubled GTX 280, except with only one GPU?

    -Indybird
     
    calvary1980 says thanks.
  7. From_Nowhere New Member

    Joined:
    Jun 13, 2008
    Messages:
    661 (0.28/day)
    Thanks Received:
    77
    This must be the rumored 55 nm GTX 280 GX2. It could be possible if they cut power consumption enough.
     
    Last edited: Oct 17, 2009
    calvary1980 says thanks.
  8. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.52/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    Bah, they're just trying to make something that will play stock Crysis all the way through at high res (1920-2560) without dying. The GTX 280 still can't :cry:

    Don't rain on NV's single-GPU parade. It's a single chip.
     
    calvary1980 says thanks.
  9. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.37/day)
    Thanks Received:
    233
    omg, this thing is gonna pwn. Just don't take it to the Arctic; too many of these will accelerate global warming more than any CO2 ever will
     
    calvary1980 says thanks.
  10. farlex85

    farlex85 New Member

    Joined:
    Mar 29, 2007
    Messages:
    4,830 (1.75/day)
    Thanks Received:
    638
    Another monolithic chip? This thing is epic if those are indeed the specs. The cost, and thus the price, would likely be epic as well. That would make it an easy counter for ATI the way things are now, I would think, and a bad move from NVIDIA, unless it were prepared to trump the R700 soon after its release, in which case I guess it would be a win. Still a little silly, though.
     
    calvary1980 says thanks.
  11. jimmy246 New Member

    Joined:
    Feb 27, 2008
    Messages:
    3 (0.00/day)
    Thanks Received:
    2
    Can I say that buying this dreamlike VGA card is equal to buying another high-wattage power supply to support it? That's not good, and it's contrary to the spirit of energy saving and carbon-dioxide reduction :D
     
    calvary1980 says thanks.
  12. vojc New Member

    Joined:
    Mar 29, 2008
    Messages:
    85 (0.04/day)
    Thanks Received:
    9
    GT300 is a big lol. A 570 mm² chip size? wtf... it is as big as 4x R870 chips (let's say it will be quad-core and much faster ;) )
     
    calvary1980 says thanks.
  13. Cold Storm

    Cold Storm Battosai

    Joined:
    Oct 7, 2007
    Messages:
    15,014 (5.84/day)
    Thanks Received:
    2,999
    Location:
    In a library somewhere on this earth
    It's going to be interesting to see what happens with this. The possible specs sound pretty dang good to me.
     
    calvary1980 says thanks.
  14. vojc New Member

    Joined:
    Mar 29, 2008
    Messages:
    85 (0.04/day)
    Thanks Received:
    9
    I think this GPU is going to pwn about as much as the Radeon 2900 XT did :)
     
    calvary1980 says thanks.
  15. Live OR Die

    Live OR Die

    Joined:
    May 19, 2007
    Messages:
    3,991 (1.47/day)
    Thanks Received:
    401
    Sold
     
    calvary1980 says thanks.
  16. candle_86 New Member

    Joined:
    Dec 28, 2006
    Messages:
    3,916 (1.37/day)
    Thanks Received:
    233
    Where do you get that impression?

    This thing, if correct, has the shader power of SLI'd GTX 280s in one core; that alone makes it deadly
     
    calvary1980 says thanks.
  17. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.52/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    Uh, the die is the same size on a smaller manufacturing process. That thing could have 1.5-1.8 billion transistors. HOLY... :eek:
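    A rough sketch of that estimate, scaling from the 65 nm GT200 (GTX 280), which packs about 1.4 billion transistors into the same 576 mm² area. The scaling factors here are idealized, not a statement about what the actual GT300 shrink achieved:

```python
# Estimate transistor count for a 576 mm^2 die at 55 nm, scaling
# from the GT200 (~1.4 billion transistors, 576 mm^2, 65 nm).

gt200_transistors = 1.4e9
node_old, node_new = 65, 55  # nm

# Linear scaling is a conservative floor; ideal quadratic density
# scaling is an optimistic ceiling that real shrinks rarely reach.
linear = gt200_transistors * (node_old / node_new)
ideal = gt200_transistors * (node_old / node_new) ** 2

print(f"linear: {linear/1e9:.2f}B, ideal: {ideal/1e9:.2f}B")
# roughly 1.65B to 1.96B, bracketing the 1.5-1.8B guess above
```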
     
    Last edited: Jul 20, 2008
    calvary1980 says thanks.
  18. Hayder_Master

    Hayder_Master

    Joined:
    Apr 21, 2008
    Messages:
    5,177 (2.18/day)
    Thanks Received:
    638
    Location:
    IRAQ-Baghdad
    NVIDIA must support DX11 in the new card, not make people buy cards and throw them away after a month of use
     
    calvary1980 says thanks.
  19. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,859 (2.68/day)
    Thanks Received:
    1,705
    Location:
    Missoula, MT, USA
    I really don't see DX11 being that big of a deal, especially right away... if only a few expensive cards take advantage of it, those who design games for it are better off primarily still supporting DX9, maybe adding some DX10/11 goodies to keep the new-tech crowd happy... but the money won't be made in the 10/11 arena by game manufacturers, at least not for a couple of years.
     
    calvary1980 says thanks.
  20. PCpraiser100 New Member

    Joined:
    Jul 17, 2008
    Messages:
    1,062 (0.46/day)
    Thanks Received:
    68
    They are absolutely crazy to put those kinds of specs into a card. You'll need a bigger boat to handle this card, like a Skulltrail platform. They are really robbing the $h*t out of us if this launches outside of the USA, like in Canada or the UK. This card is for bottleneck whores who do ridiculous overclocking and pay a hydro bill so large that after overclocking for one whole day you'll have to move to Mexico to start a new life lol. NVIDIA is more of a company whose products work in conjunction with the latest processors in their most powerful solutions. On the other hand, ATI is more of a company whose products have extra features, never-discontinued driver compatibility, and newer cores that can hold their own with AA at high res in the most demanding games.

    BTW, about DX11, check this out:

    http://www.neowin.net/forum/index.php?showtopic=628854&pid=589318056&st=30&#entry589318056

    Dunno if it's fake
     
    Last edited: Jul 20, 2008
    calvary1980 says thanks.
  21. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,158 (1.18/day)
    Thanks Received:
    339

    Wooooooow, QQ much?


    Anyways,

    That sounds like a pretty hefty card. What on Earth would we use it for?


    Doubtful. ATI is going to shoot themselves in the foot with the 4870X2, throwing all their remaining resources at taking the crown, which it might possibly do, yet it will be short-lived, and they'll be back at square one with very little to show for it.
     
    calvary1980 says thanks.
  22. Hayder_Master

    Hayder_Master

    Joined:
    Apr 21, 2008
    Messages:
    5,177 (2.18/day)
    Thanks Received:
    638
    Location:
    IRAQ-Baghdad

    Exactly what I was saying in the DX11 release thread, but you know, we need to see some perfect card, that's all
     
    calvary1980 says thanks.
  23. sethk New Member

    Joined:
    Apr 14, 2007
    Messages:
    63 (0.02/day)
    Thanks Received:
    5
    I'm calling shenanigans on this.
     
    calvary1980 says thanks.
  24. Megasty New Member

    Joined:
    Mar 18, 2008
    Messages:
    1,263 (0.52/day)
    Thanks Received:
    82
    Location:
    The Kingdom of Au
    I don't think AMD is shooting themselves in the foot with an already sure bet. That card is the only thing that made the GTX 280 come down to a point where many others are considering it. However, just as many people are holding out for the 4870X2 as well. AMD has already moved on to the next-gen card anyway; that's where they can shoot themselves in the foot. If this monster comes to fruition, it will truly be a chink in the armor AMD put on with this series. If AMD's next single GPU is only 20% faster than a 4870 while this boy is lurking around, then the dual-GPU model can only hope to match this thing at best.
     
    calvary1980 says thanks.
  25. jimmy246 New Member

    Joined:
    Feb 27, 2008
    Messages:
    3 (0.00/day)
    Thanks Received:
    2
    Actually, I am always excited to hear new specs for a next-generation video card, but I've finally realized the heat it makes is truly a problem. Take me, for example: I think I was crazy enough to buy a 9800 GX2 card, and yup, there's not much disappointment I've had with it so far, because the 9800 GX2 has excellent performance. BUT... recently I've noticed the temperature indicator keeps giving me a warning message reading "temp. unusual". So all I can do is remove the PC case's side cover or set the fan to full speed. Hmm... it's like taking care of a feverish patient. If one day the fan slows down or even stops running... oh god, it would be terrible to think what might happen
     
    calvary1980 says thanks.
