
Possible Specifications of the GeForce GTX 350 Emerge

btarunr

Editor & Senior Moderator
Hardspell has released a list of possible specifications for the GeForce GTX 350 graphics processor (GPU):


  • NVIDIA GeForce GTX 350
  • GT300 core
  • 55nm technology
  • 576 sq.mm die area
  • 512bit GDDR5 memory controller
  • 2 GB of GDDR5 memory, double that of the GTX 280
  • 480 stream processors
  • 64 raster operation units (ROPs), the same as the GTX 280
  • 216 GB/s memory bandwidth
  • Default clock speeds: core 830 MHz, shader 2075 MHz, memory 3360 MHz (effective)
  • Pixel fill-rate: 36.3 Gpixels/s
  • Texture fill-rate: 84.4 Gtexels/s
  • DirectX 10, no DX 10.1 support yet.
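The quoted bandwidth figure can be sanity-checked against the listed bus width and effective memory clock. A minimal Python sketch, assuming the usual formula (bandwidth = bus width in bytes × effective data rate); the variable names are illustrative, not from the source:

```python
# Back-of-the-envelope check of the rumored memory bandwidth.
# Assumption: bandwidth = (bus width in bytes) * (effective GDDR5 data rate).

bus_width_bits = 512        # rumored memory interface width
effective_clock_mhz = 3360  # rumored effective GDDR5 clock

# bytes * MHz = MB/s; divide by 1000 for GB/s
bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_mhz / 1000

print(f"{bandwidth_gb_s:.1f} GB/s")  # ~215.0 GB/s, close to the quoted 216 GB/s
```

So the 216 GB/s figure is consistent with a 512-bit bus at 3360 MHz effective, give or take rounding.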

View at TechPowerUp Main Site
 
But now it is news.

That is one massive (fill-in-the-blank). It shows the mighty will of NV to keep shelling out these uber GPUs while the dual monsters are still rearing in the drink.
That's just too respectable; now if only they would stop charging so much for it - but I think they learned their lesson when it comes to that, hopefully :pimp:
 

Wolf is not a news poster. It's fun when you post news in the forum, but even more fun when you submit it. Sure, there's been a delay, but there's a very low chance of us missing out on posting something; sooner or later, we post everything. So we're not an incompetent news staff. Submit news, don't post it in the forums. Like the forum guidelines state, leave news posting to the staff.
 
If it is $600 or more, then this will not sell. So is this essentially a doubled GTX 280, except with only one GPU?

-Indybird
 
This must be the rumor of the 55 nm GTX 280 GX2. It could be possible if they cut power consumption enough.
 
If it is $600 or more, then this will not sell. So is this essentially a doubled GTX 280, except with only one GPU?

-Indybird

Bah, they're just trying to make something that will play the stock Crysis all the way through at high res (1920-2560) w/o dying. The GTX280 still can't :cry:

This must be the rumor of the 55nm GTX 280GX2. It could be possible if they cut power consumption enough.

Don't rain on NV's single-GPU parade. It's a single chip.
 
omg, this thing is gonna pwn. Just don't take it to the Arctic; too many of these will accelerate global warming more than any CO2 ever will
 
Another monolithic? This thing is epic if those are indeed the specs. The cost, and thus the price, would likely be epic as well. That would be an easy counter for ATI the way things are now, I would think, and a bad move from NVIDIA, unless this were prepared to trump the R700 soon after its release, in which case I guess it would be a win. Still a little silly, though.
 
Can I say that buying this dreamlike VGA card means also buying another high-wattage power supply to support it? That's not good; it's contrary to the spirit of energy saving and carbon-dioxide reduction :D
 
GT300 is a big lol. A 570 mm² chip size? wtf....... it is as big as 4x R870 chips (let's say that one will be quad core and much faster ;) )
 
It's going to be interesting to see what happens with this. The possible specs sound pretty dang good to me.
 
i think this GPU is going to pwn as much as radeon 2900XT did :)

where do you get that impression?

This thing, if correct, has the shader power of SLI GTX 280s in one core; that alone makes it deadly.
 
Uh, the die is the same size even on the smaller manufacturing process. That boy could have 1.5-1.8 billion transistors. HOLY... :eek:
 
NVIDIA must support DX11 in the new card, not let people buy a card and throw it away after a month of use.
 
NVIDIA must support DX11 in the new card, not let people buy a card and throw it away after a month of use.

I really don't see DX11 being that big of a deal, especially right away...if only a few expensive cards take advantage of it...those that design games for it are better off primarily still supporting DX9, maybe add some DX10/11 goodies to keep the new tech crowd happy...but the money won't be made in the 10/11 arena by game mfg's, at least not for a couple years.
 
They are absolutely crazy to put that kind of specs into a card. You'll need a bigger boat to handle this card, like a Skulltrail platform. They are really robbing the $h*t out of us if this launches outside of the USA, like in Canada or the UK. This card is for bottleneck whores who do ridiculous overclocking and pay a hydro bill so large that after overclocking for one whole day you'll have to move to Mexico to start a new life lol. NVIDIA is more of a company whose products work in conjunction with the latest processors in their most powerful solutions. On the other hand, ATI is more of a company whose products have extra features, never-discontinued driver compatibility, and newer cores that can hold their own with AA at high res in the most demanding games.

BTW bout DX11 check this out

http://www.neowin.net/forum/index.php?showtopic=628854&pid=589318056&st=30&#entry589318056

Dunno if fake
 
Wolf is not a news poster. It's fun when you post news in the forum, but even more fun when you submit it. Sure, there's been a delay, but there's a very low chance of us missing out on posting something; sooner or later, we post everything. So we're not an incompetent news staff. Submit news, don't post it in the forums. Like the forum guidelines state, leave news posting to the staff.


Wooooooow, QQ much?


Anyways,

That sounds like a pretty hefty card. What on Earth would we use it for?

Another monolithic? This thing is epic if those are indeed the specs. The cost, and thus the price, would likely be epic as well. That would be an easy counter for ATI the way things are now, I would think, and a bad move from NVIDIA, unless this were prepared to trump the R700 soon after its release, in which case I guess it would be a win. Still a little silly, though.


Doubtful. ATi is going to shoot themselves in the foot with the 4870X2, throwing all their remaining resources at taking the crown; which it might possibly do, yet it will be shortlived, and they'll be back at square one with very little to show for it.
 
I really don't see DX11 being that big of a deal, especially right away...if only a few expensive cards take advantage of it...those that design games for it are better off primarily still supporting DX9, maybe add some DX10/11 goodies to keep the new tech crowd happy...but the money won't be made in the 10/11 arena by game mfg's, at least not for a couple years.


Exactly what I am saying in the DX11 release thread, but you know, we need to see some perfect card, that's all.
 
Doubtful. ATi is going to shoot themselves in the foot with the 4870X2, throwing all their remaining resources at taking the crown; which it might possibly do, yet it will be shortlived, and they'll be back at square one with very little to show for it.

I don't think AMD is shooting themselves in the foot with an already sure bet. That card is the only thing that made the GTX 280 come down to a point where many others are considering it. However, just as many ppl are holding out for the 4870 X2 as well. AMD has already moved on to the next-gen card anyway; that's where they can shoot themselves in the foot. If this monster comes to fruition, then it will truly be a chink in the armor AMD put on with this series. If AMD's single GPU is only 20% faster than a 4870 while this boy is lurking around, then the dual-GPU model can only hope to match this thing at best.
 
Actually, I am always excited to hear new specs for a next-generation video card, but I've finally realized the heat it makes is truly a problem. Take me for example: I was crazy enough to buy a 9800 GX2, and yup, there's not much disappointment I've had with this card so far, because the 9800 GX2 has excellent performance. BUT... recently I've noticed the temp indicator keeps giving me a warning message reading "temp. unusual". So all I can do is remove the PC case side cover or set the fan to full speed. Hmm... it's like taking care of a feverish patient. One day if the fan slows down or even stops running... oh god, it would be terrible to think what might happen.
 