Saturday, July 19th 2008

Possible Specifications of the GeForce GTX 350 Emerge

Hardspell has released a list of possible specifications for the GeForce GTX 350 graphics processing unit (GPU):
  • NVIDIA GeForce GTX 350
  • GT300 core
  • 55nm technology
  • 576 sq.mm die area
  • 512bit GDDR5 memory controller
  • 2 GB of GDDR5 memory, double that of the GTX 280
  • 480 stream processors
  • 64 raster operation units (ROPs), the same as the GTX 280
  • 216 GB/s memory bandwidth
  • Default clock speeds: 830 MHz core, 2075 MHz shader, 3360 MHz memory (effective)
  • Pixel fill-rate: 36.3 Gpixels/s
  • Texture fill-rate: 84.4 Gtexels/s
  • DirectX 10, no DX 10.1 support yet.
Source: Hardspell
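As a sanity check on the rumored figures, the listed memory bandwidth follows directly from the bus width and effective memory clock given above. A quick back-of-the-envelope sketch in Python (using only numbers from the list; the formula is the standard one for graphics memory bandwidth):

```python
# Memory bandwidth (GB/s) = bus width in bytes * effective data rate.
# Bus width and effective clock are the rumored GTX 350 specs above.
bus_width_bits = 512
effective_clock_mhz = 3360  # effective GDDR5 data rate

bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_mhz / 1000
print(f"{bandwidth_gb_s:.2f} GB/s")  # ~215 GB/s, matching the listed ~216 GB/s
```

So at least the bandwidth figure is internally consistent with the quoted bus width and memory clock.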
Add your own comment

30 Comments on Possible Specifications of the GeForce GTX 350 Emerge

#3
Megasty
But now it is news.

That is one massive (fill-in-the-blank). It shows NV's mighty will to keep shelling out these uber GPUs while the dual monsters are still rearing in the drink.
That's just too respectable. Now if only they would stop charging so much for it - but I think they learned their lesson when it comes to that, hopefully :pimp:
Posted on Reply
#4
btarunr
Editor & Senior Moderator
CrackerJack
Wolf posted this yesterday:
forums.techpowerup.com/showthread.php?p=891126#post891126


thanks though
Wolf is not a news poster. It's fun when you post news in the forum, but even more fun when you submit it. Sure, there's been a delay, but there's very little chance of us missing something. Sooner or later, we post everything. We're not an incompetent news staff, so submit news instead of posting it in the forums. Like the forum guidelines state, leave news posting to the staff.
Posted on Reply
#5
Unregistered
If it is $600 or more, then this will not sell. So is this essentially a doubled GTX 280, except with only one GPU?

-Indybird
#6
From_Nowhere
This must be the rumored 55 nm GTX 280 GX2. It could be possible if they cut power consumption enough.
Posted on Reply
#7
Megasty
indybird
If it is $600 or more, then this will not sell. So is this essentially a doubled GTX 280, except with only one GPU?

-Indybird
Bah, they're just trying to make something that will play stock Crysis all the way through at high res (1920-2560) w/o dying. The GTX 280 still can't :cry:
From_Nowhere
This must be the rumored 55 nm GTX 280 GX2. It could be possible if they cut power consumption enough.
Don't rain on NV's single-GPU parade. It's a single chip.
Posted on Reply
#8
candle_86
omg, this thing is gonna pwn, just don't take it to the Arctic, too many of these will accelerate global warming more than any CO2 ever will
Posted on Reply
#9
farlex85
Another monolithic chip? This thing is epic if those are indeed the specs. The cost, and thus the price, would likely be epic as well. That would be an easy counter for ATI with the way things are now, I would think, and a bad move from NVIDIA, unless this is being prepared to trump the R700 soon after its release, in which case I guess it would be a win. Still a little silly though.
Posted on Reply
#10
jimmy246
Can I say that buying this dreamlike VGA card amounts to buying another high-wattage power supply to support it? That's not good. It runs contrary to the spirit of energy saving and carbon-dioxide reduction :D
Posted on Reply
#11
vojc
GT300 is a big lol, a 570 mm² chip? wtf... it is as big as 4x R870 chips (let's say that one will be quad-core, and it will be much faster ;) )
Posted on Reply
#12
Cold Storm
Battosai
It's going to be interesting to see what happens with this. The possible specs sound pretty dang good to me.
Posted on Reply
#13
vojc
candle_86
omg, this thing is gonna pwn, just don't take it to the Arctic, too many of these will accelerate global warming more than any CO2 ever will
I think this GPU is going to pwn about as much as the Radeon 2900 XT did :)
Posted on Reply
#15
candle_86
vojc
I think this GPU is going to pwn about as much as the Radeon 2900 XT did :)
Where do you get that impression?

This thing, if the specs are correct, has the shader power of SLI'd GTX 280s in one core; that alone makes it deadly.
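That claim roughly checks out. A hedged back-of-the-envelope sketch, assuming each stream processor retires 3 FLOPs per clock (dual-issue MADD + MUL, as commonly cited for G80/GT200-class parts) and using the known GTX 280 shader config (240 SPs at 1296 MHz) against the rumored GTX 350 numbers:

```python
# Peak single-precision shader throughput in GFLOPS,
# assuming 3 FLOPs per stream processor per clock (MADD + MUL).
def peak_gflops(stream_processors, shader_clock_mhz, flops_per_clock=3):
    return stream_processors * shader_clock_mhz * flops_per_clock / 1000

gtx280 = peak_gflops(240, 1296)   # known GTX 280 config, ~933 GFLOPS
gtx350 = peak_gflops(480, 2075)   # rumored GTX 350 config, ~2988 GFLOPS

print(f"{gtx350 / (2 * gtx280):.2f}x a GTX 280 SLI pair")
```

Under those assumptions the rumored chip would actually exceed a GTX 280 SLI pair on paper, thanks to the much higher shader clock.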
Posted on Reply
#16
Megasty
Uh, the die is the same size despite the smaller manufacturing process. That boy could have 1.5-1.8 billion transistors. HOLY... :eek:
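That estimate is in the right ballpark. A rough sketch, assuming the GT200's commonly cited ~1.4 billion transistors in ~576 mm² at 65 nm, and ideal areal scaling to 55 nm (which real processes never quite achieve, so the actual count would land somewhat lower):

```python
# Ideal-scaling transistor estimate for a 65 nm -> 55 nm shrink
# at constant die area. Real shrinks achieve less than ideal density.
gt200_transistors = 1.4e9  # commonly cited GT200 figure (assumption)
old_node_nm = 65
new_node_nm = 55

scale = (old_node_nm / new_node_nm) ** 2  # ideal areal density gain
estimate = gt200_transistors * scale
print(f"{estimate / 1e9:.2f} billion transistors")  # ~1.96 billion at ideal scaling
```

With realistic, less-than-ideal scaling, the 1.5-1.8 billion guess above is plausible.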
Posted on Reply
#17
Hayder_Master
NVIDIA must support DX11 in the new card, not let people buy a card and throw it away after a month of use.
Posted on Reply
#18
Kursah
hayder.master
NVIDIA must support DX11 in the new card, not let people buy a card and throw it away after a month of use.
I really don't see DX11 being that big of a deal, especially right away...if only a few expensive cards take advantage of it...those that design games for it are better off primarily still supporting DX9, maybe add some DX10/11 goodies to keep the new tech crowd happy...but the money won't be made in the 10/11 arena by game mfg's, at least not for a couple years.
Posted on Reply
#19
PCpraiser100
They are absolutely crazy, putting those kinds of specs into a card. You'll need a bigger boat to handle this card, like a Skulltrail platform. They'll really be robbing the $h*t out of us if this launches outside of the USA, like in Canada or the UK. This card is for bottleneck whores who do ridiculous overclocking and pay a hydro bill so large that after overclocking for one whole day you'll have to move to Mexico to start a new life lol. NVIDIA is more of a company whose products work in conjunction with the latest processors in their most powerful solutions. On the other hand, ATI is more of a company with products that have extra features, never-discontinued driver compatibility, and newer cores that can hold their own at AA and high res in the most demanding games.

BTW bout DX11 check this out

www.neowin.net/forum/index.php?showtopic=628854&pid=589318056&st=30&#entry589318056

Dunno if fake
Posted on Reply
#20
newconroer
btarunr
Wolf is not a news poster. It's fun when you post news in the forum, but even more fun when you submit it. Sure, there's been a delay, but there's very little chance of us missing something. Sooner or later, we post everything. We're not an incompetent news staff, so submit news instead of posting it in the forums. Like the forum guidelines state, leave news posting to the staff.
Wooooooow, QQ much?


Anyways,

That sounds like a pretty hefty card. What on Earth would we use it for?
farlex85
Another monolithic chip? This thing is epic if those are indeed the specs. The cost, and thus the price, would likely be epic as well. That would be an easy counter for ATI with the way things are now, I would think, and a bad move from NVIDIA, unless this is being prepared to trump the R700 soon after its release, in which case I guess it would be a win. Still a little silly though.
Doubtful. ATi is going to shoot themselves in the foot with the 4870X2, throwing all their remaining resources at taking the crown; which it might possibly do, yet it will be short-lived, and they'll be back at square one with very little to show for it.
Posted on Reply
#21
Hayder_Master
Kursah
I really don't see DX11 being that big of a deal, especially right away...if only a few expensive cards take advantage of it...those that design games for it are better off primarily still supporting DX9, maybe add some DX10/11 goodies to keep the new tech crowd happy...but the money won't be made in the 10/11 arena by game mfg's, at least not for a couple years.
Exactly what I was saying in the DX11 release thread, but you know, we need to see some perfect card, that's all.
Posted on Reply
#22
sethk
I'm calling shenanigans on this.
Posted on Reply
#23
Megasty
newconroer
Doubtful. ATi is going to shoot themselves in the foot with the 4870X2, throwing all their remaining resources at taking the crown; which it might possibly do, yet it will be short-lived, and they'll be back at square one with very little to show for it.
I don't think AMD is shooting themselves in the foot with an already sure bet. That card is the only thing that made the GTX 280 come down to a point where many others are considering it. However, just as many ppl are holding out for the 4870X2 as well. AMD has already moved on to the next-gen card anyway. That's where they can shoot themselves in the foot. If this monster comes to fruition then it will truly be a chink in the armor AMD put on with this series. If AMD's next single GPU is only 20% faster than a 4870 while this boy is lurking around, then the dual-GPU model can only hope to match this thing at best.
Posted on Reply
#24
jimmy246
Actually, I am always excited to hear new specs for a next-generation video card, but I've finally realized the heat it makes is truly a problem. Take me for example: I think I was crazy enough to buy a 9800 GX2 card. Yup, there is not much disappointment I've had with this card so far, because the 9800 GX2 has excellent performance. BUT... recently I noticed the temp indicator keeps giving me a warning message reading "temp. unusual". So all I can do is remove the PC case side cover or set the fan to full speed. Hmm... it's like taking care of a feverish patient. One day if the fan slows down or even stops running... oh god! It would be terrible to think what might happen.
Posted on Reply
#25
newconroer
Megasty
I don't think AMD is shooting themselves in the foot with an already sure bet. That card is the only thing that made the GTX 280 come down to a point where many others are considering it. However, just as many ppl are holding out for the 4870X2 as well. AMD has already moved on to the next-gen card anyway. That's where they can shoot themselves in the foot. If this monster comes to fruition then it will truly be a chink in the armor AMD put on with this series. If AMD's next single GPU is only 20% faster than a 4870 while this boy is lurking around, then the dual-GPU model can only hope to match this thing at best.
Exactly, 280 prices will come down, and people won't see a purpose in running this card, which we can already see has pretty nasty power requirements.

If ATi relies too heavily on the price point of this card against the 280, then it's going to hurt, especially with NVIDIA's possible 300 series on its way, and that doesn't even include any "X2" models or an 'extreme GT200b' that might be unearthed soon.

And you could be quite right about their next gen card.
Posted on Reply
Add your own comment