Thursday, April 2nd 2009

NVIDIA Extends Performance Lead With New GeForce GTX 275 GPU

NVIDIA today announced the GeForce GTX 275 GPU, the latest addition to NVIDIA's performance GPU lineup, offering first-rate graphics, physics, and GPU computing performance, as well as the absolute best price-performance for today's value-conscious gamers.

Based on the GT200 architecture, the GeForce GTX 275 GPU features 240 processor cores operating at 1404 MHz, 80 texture processing units, a 448-bit memory interface, and an 896 MB framebuffer. In price and performance, it sits between the GeForce GTX 260 and the GeForce GTX 285.
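
Those figures can be read back programmatically; below is a minimal host-side sketch using the CUDA runtime API. It is illustrative only: device 0 is assumed to be the card in question, and the 8-scalar-processors-per-multiprocessor factor is specific to GT200-class GPUs.

```cuda
// Minimal, illustrative sketch: query the figures quoted above via the CUDA
// runtime API. Assumes device 0 and GT200-class hardware (8 SPs per SM).
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("No CUDA device found\n");
        return 1;
    }
    printf("GPU:           %s\n", prop.name);
    printf("Shader cores:  %d (%d SMs x 8)\n",
           prop.multiProcessorCount * 8, prop.multiProcessorCount);
    printf("Shader clock:  %.0f MHz\n", prop.clockRate / 1000.0);   // clockRate is reported in kHz
    printf("Framebuffer:   %zu MB\n", prop.totalGlobalMem / (1024 * 1024));
    return 0;
}
```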

Product Highlights:
  • Highest performing GPU in the $229-$249 price category
  • Incorporates NVIDIA PhysX technology for the most interactive gaming experience
  • NVIDIA CUDA for accelerated desktop applications
  • Supports NVIDIA 3D Vision for immersive stereoscopic gaming
  • Ready for Windows 7
  • Launches with new GeForce Release 185 software drivers for optimal performance and features, including a new Ambient Occlusion setting
The GeForce GTX 275 is being announced alongside the availability of the new GeForce Power Pack #3, which includes a host of free, new PhysX and CUDA-enabled content for GeForce owners. The Power Pack #3 includes:
  • Three new applications that take advantage of PhysX hardware acceleration, including a custom PhysX patch for Ascaron's Sacred 2: Fallen Angel; a never-before-seen demo and benchmark for Star Tales, a highly ambitious social networking game from the acclaimed Chinese developer QWD1; and the source code to the PhysX screensaver for community modding and distribution.
  • Two new CUDA-accelerated applications, including MotionDSP vReveal and SETI@home.
The GeForce GTX 275 will be available globally on or before April 14, 2009; it is already on sale in some regions and available for pre-order in others.

GeForce GTX 275 will be available from the world's leading add-in card partners, including: ASUS, BFG, Colorful, EVGA, Gainward, Galaxy, Gigabyte, Innovision, Leadtek, MSI, Palit, Sparkle, Zotac, PNY, Point of View, and XFX.

Additional Global Links:

NVIDIA Power Pack #3
nvidia.com/powerpacks

GeForce GTX 275

UK
novatech.co.uk

Norway
komplett.no

Sweden
komplett.se

Denmark
komplett.dk

UK
scan.co.uk

Holland
dunnet.nl

Canada
ncix.com

US
amazon.com
Source: NVIDIA

41 Comments on NVIDIA Extends Performance Lead With New GeForce GTX 275 GPU

#1
Tatty_Two
Gone Fishing
Thanks for this; of course, everyone is now waiting for the performance comparisons between this and the 4890. It looks to be similarly priced to the 4890 in the UK, although I have found a 4890 for £205.

In stock at Overclockers UK also for £239

www.overclockers.co.uk/showproduct.php?prodid=GX-116-OK
#3
mdm-adph
I guess the only difference is that you can actually buy a 4890, but not a GTX 275.
#4
wolf
Performance Enthusiast
This thing already kills a 4890, and anyone who has owned ANY GT200 knows they are not cards to be run at stock!

Now just wait for availability.
#5
phanbuey
mdm-adph: I guess the only difference is that you can actually buy a 4890, but not a GTX 275.
Sort of... this is nvidia over-reacting to ATI by releasing a card that kicks the crap out of everything single-gpu
#6
wolf
Performance Enthusiast
They could have called it the GTX260 Core 240, IMO; the only difference is clock speeds.
#7
Tatty_Two
Gone Fishing
mdm-adph: I guess the only difference is that you can actually buy a 4890, but not a GTX 275.
Lol, in the UK there are bucketloads available of both of them, and the cheapest 275 is cheaper than the cheapest 4890, surprisingly.

www.aria.co.uk/SuperSpecials/Other+products/nVidia+GeForce+GTX+275+896MB+PCI-E+2.0+Ret+?noVat=0&productId=35615

Versus............

www.aria.co.uk/SuperSpecials/Other+products/ATI+Radeon+HD4890+1GB+PCI-E+2.0+Ret+?productId=35432

That e-tailer has the cheapest in-stock prices of either card that I can see ATM.
#9
newtekie1
Semi-Retired Folder
phanbuey: Sort of... this is nvidia over-reacting to ATI by releasing a card that kicks the crap out of everything single-gpu (and beats the 285)...
It doesn't beat the 285.
#12
phanbuey
Ketxxx: :roll::roll::roll: All its doing is adding some static shadows.
Yeah, after an hour of playing with it, it's not that great :nutkick:... especially in L4D, where it adds static shadows even when you're shining the flashlight on an object. *SiGh*... got excited :slap:.

Not to mention, the 185.65's don't work with SLI and CSS on my system.
#13
h3llb3nd4
I'd rather wait for a new driver and a better card then...
#14
DarkMatter
Ketxxx: :roll::roll::roll: All its doing is adding some static shadows.
They are not static. It's not related to lights, to direct lighting I mean. AO is a way to simulate indirect light (the light that arrives after several reflections off walls, objects, etc.), so the shadows won't move if the light sources change. But they do move according to the geometry: if the geometry changes, the AO changes. Of course you won't see that looking at a wall. ;)

IMO AO adds a lot of realism to the scenes; after seeing one scene with it enabled, I can't stand looking at the one without. It looks like a 90's game after that. That being said, the last time I checked the Nvidia AO feature through the CP it was almost unusable because of the performance drop, so it's pretty much a gimmick at this point. Unless 182.50 has dramatically improved the performance with AO enabled.
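
To make that concrete, here is a deliberately naive screen-space AO sketch in CUDA. It is purely illustrative (NVIDIA has not published how the driver-level AO in Release 185 works); the point is that the kernel only reads the depth buffer, i.e. geometry, which is why the shadowing reacts to geometry but not to light sources.

```cuda
// Naive screen-space AO sketch, for illustration only. Occlusion is estimated
// purely from nearby depth samples (geometry), never from light sources.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void ssaoKernel(const float* depth, float* ao,
                           int width, int height, int radius)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float center = depth[y * width + x];
    int occluders = 0, samples = 0;

    // Neighbours noticeably closer to the camera than this pixel count as occluders.
    for (int dy = -radius; dy <= radius; ++dy)
        for (int dx = -radius; dx <= radius; ++dx) {
            int nx = x + dx, ny = y + dy;
            if (nx < 0 || nx >= width || ny < 0 || ny >= height) continue;
            if (center - depth[ny * width + nx] > 0.01f) ++occluders;  // tunable depth bias
            ++samples;
        }

    // 1.0 = fully open; lower values mean nearby geometry blocks ambient light.
    ao[y * width + x] = 1.0f - (float)occluders / (float)samples;
}

int main()
{
    const int W = 256, H = 256;
    std::vector<float> depth(W * H, 1.0f);            // flat wall at depth 1.0
    for (int y = 100; y < 156; ++y)                   // a box poking out of the wall
        for (int x = 100; x < 156; ++x) depth[y * W + x] = 0.5f;

    float *dDepth, *dAo;
    cudaMalloc(&dDepth, W * H * sizeof(float));
    cudaMalloc(&dAo,    W * H * sizeof(float));
    cudaMemcpy(dDepth, depth.data(), W * H * sizeof(float), cudaMemcpyHostToDevice);

    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    ssaoKernel<<<grid, block>>>(dDepth, dAo, W, H, 4);

    std::vector<float> ao(W * H);
    cudaMemcpy(ao.data(), dAo, W * H * sizeof(float), cudaMemcpyDeviceToHost);
    printf("AO on the wall just outside the box edge: %.2f (1.0 = unoccluded)\n",
           ao[128 * W + 158]);

    cudaFree(dDepth);
    cudaFree(dAo);
    return 0;
}
```
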
phanbuey: yeah after an hour of playing with it, its not that great :nutkick:... especially in L4D when it adds static shadows even when your'e shining the the flashlight on an object. *SiGh*... got excited :slap:.

Not to mention, the 185.65's dont work with SLI and CSS on my system.
Yeah, and in L4D your companions' flashlights don't illuminate anything; blame the drivers. lol.
#15
phanbuey
DarkMatter: Yeah and in L4D your companions flashlights don't iluminate, blame the drivers. lol.
That may be so, L4D is definitely no coding wonder... but that's really not the point. If the drivers break one game and then add shadows where there aren't supposed to be any in another, then it is the drivers' fault (they are beta drivers, so I'm not "blaming" anything, just pointing out that it might be a better idea to have the game developers take care of the lighting). Great idea, but it doesn't work for 2 of the 3 games I have tried.
#16
devguy
I think it was stupid of Nvidia to release this card. Sure, the GTX260 c216 is getting pounded by the 4890, but this GTX275 brings in waaaay too much competition for Nvidia's own GTX285. Why would anyone bother buying the 285 now? It is priced considerably higher than the GTX275, but only performs slightly better. And those who want the most performance will skip over the GTX285 and purchase the GTX295 anyway. From my perspective, Nvidia essentially killed off their GTX285 sales.

What I believe they should've done is drop the price on the GTX285 to around $275 and leave out the GTX275.
#17
DarkMatter
phanbuey: That may be so, l4d is definitely no coding wonder... but thats really not the point - if the drivers break one game and then adds shadows where there arent supposed to be any in another, then it is the drivers' fault. Great idea, but it doesn't work for 2 of the 3 games I have tried.
I tried it in many games when the beta came out some months ago, and visually it worked perfectly. The only drawback was the performance hit, which rendered the feature unusable. From the screenshots in the link you provided, it seems like it works even better now. You have to understand one thing though: AO is for indirect lighting. In a game or place where you have to use the flashlight (like L4D), there wouldn't naturally be any indirect lighting*, so there shouldn't be any AO. So AO shouldn't be enabled there, but this is an external feature; it's not built into the game engine, so it's either ON or OFF. They can't make it work only where it should, but IMO it's a great idea that greatly improves the visual quality in properly lit areas. The inconsistency in dark areas and with flashlights is a minor con IMO, considering the pros. Once again, I'm speaking about the visual impact without considering the performance one.

EDIT: I get your point, but I think that implementing it in the drivers is a good way of pushing the tech, encouraging developers to use it, something that Nvidia is doing a lot lately and we all should thank them for that. IMO

* Except for the indirect light created by the flashlight, but flashlights are not actual lights in most games, e.g. in Source games.
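
In code terms (purely illustrative; no real engine or driver API is assumed), the difference between engine-integrated AO and a global post-process toggle looks something like this:

```cuda
// Illustrative only: why a driver-level AO toggle behaves differently from AO
// built into the engine's lighting. No real engine or driver path is shown.
#include <cstdio>

// ao: 0 = fully occluded corner, 1 = fully open surface.
float engineShade(float direct, float indirect, float ao)
{
    // Engine-integrated AO only attenuates the indirect (ambient) term,
    // so a flashlight-only scene (indirect == 0) is unaffected.
    return direct + indirect * ao;
}

float driverShade(float direct, float indirect, float ao)
{
    // A post-process toggle has no access to the engine's light terms,
    // so it darkens the final colour regardless of where the light came from.
    return (direct + indirect) * ao;
}

int main()
{
    float ao = 0.5f;  // a half-occluded corner
    printf("Flashlight-only room: engine %.2f vs driver %.2f\n",
           engineShade(1.0f, 0.0f, ao), driverShade(1.0f, 0.0f, ao));
    printf("Well-lit room:        engine %.2f vs driver %.2f\n",
           engineShade(1.0f, 0.8f, ao), driverShade(1.0f, 0.8f, ao));
    return 0;
}
```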
#18
phanbuey
devguy: I think it was stupid of Nvidia to release this card. Sure, the GTX260 c216 is getting pounded by the 4890, but this GTX275 brings in waaaay too much competition for Nvidia's own GTX285. Why would anyone bother buying the 285 now? It is priced considerably more than the GTX275, but only performs slightly better. And those who want the most performance will skip over the GTX285 and purchase the GTX295 anyway. From my perspective, Nvidia essentially killed off their GTX285 sales.

What I believe they should've done is drop the price on the GTX285 to around $275, and left out the GTX275.
I think that dropping the price of the 285 is exactly what they will do...
#19
DarkMatter
devguy: I think it was stupid of Nvidia to release this card. Sure, the GTX260 c216 is getting pounded by the 4890, but this GTX275 brings in waaaay too much competition for Nvidia's own GTX285. Why would anyone bother buying the 285 now? It is priced considerably more than the GTX275, but only performs slightly better. And those who want the most performance will skip over the GTX285 and purchase the GTX295 anyway. From my perspective, Nvidia essentially killed off their GTX285 sales.

What I believe they should've done is drop the price on the GTX285 to around $275, and left out the GTX275.
Yeah, I said the same in another thread, but after thinking more about it, the point is that they would have to drop the price anyway because of the ATI cards. This way they can maintain a slightly higher price for the GTX285. Anyway, after seeing more reviews than the ones here on TPU, ones that test at higher resolutions: above 1920x1200 the GTX285 is quite a bit faster, and the 285 was always meant for that res anyway...
#20
devguy
I've seen the reviews of the GTX285 performing better above 1920x1200 too, but I figure if I had the money to afford a monitor that could go above that resolution, then I'd purchase the GTX295 or 4870x2 anyway. But that's just me.
#21
Edito
No matter how much we (or you) like ATI (I'm an NVIDIA fan, but I respect ATI), NVIDIA always comes out on top of this, and they did it again with the GTX275. The HD4890 is good, too good, and it's a nice upgrade, but the GTX275 blows it away in terms of performance and its price is good... In the end it's a matter of preference... because I think we just can't feel the difference between 60FPS with NVIDIA and 60FPS with ATI in game...
#22
MilkyWay
Talk about market dilution. This will just be a waste, because people will either (A) go for the GTX 260, (B) go for the 285, or (C) it will steal sales from those cards.

Fuck all these cards; I want a few cards and then maybe a revision of each.

Back when they had three different types of a card, each slightly better than the other, and then one with 128MB of RAM, one with 256MB and one with 512MB, that would be like nine different cards.

This is exactly like that. Market saturation is not a good thing!

I want proper resources pulled into making the next series of cards, not a rehash to gain performance or market dominance.
#23
Tatty_Two
Gone Fishing
devguy: I think it was stupid of Nvidia to release this card. Sure, the GTX260 c216 is getting pounded by the 4890, but this GTX275 brings in waaaay too much competition for Nvidia's own GTX285. Why would anyone bother buying the 285 now? It is priced considerably more than the GTX275, but only performs slightly better. And those who want the most performance will skip over the GTX285 and purchase the GTX295 anyway. From my perspective, Nvidia essentially killed off their GTX285 sales.

What I believe they should've done is drop the price on the GTX285 to around $275, and left out the GTX275.
And the stupid thing is, I don't think the 4890 will steal many sales from the 260. Over here you can get a 260 now for £137, while a cheap 4890 at the moment will cost you £205; there are not many who will fork out that large a difference for what is still a fairly minor(ish) performance hike. And of course you could say the same for ATi's own HD4870 1GB cards: what, 5-10% slower than the 4890, but over here you can get a 1GB 4870 for £165, which is still £40 cheaper. On top of that, in the UK the GTX275 is released cheaper than the HD4890... how often does that happen, that NVidia actually delivers a competing card cheaper than ATi?
#24
DarkMatter
Tatty_One: how often does that happen? that NVidia actually delivers a competing card cheaper than ATi?
Here it happens a lot.
#25
Tatty_Two
Gone Fishing
DarkMatter: Here it happens a lot.
Never here.