
Next-gen NVIDIA GeForce Specifications Unveiled

malware

After we already know what AMD/ATI are planning in their camp, it's NVIDIA's turn to show us what we should be prepared for. Verified by DailyTech, NVIDIA plans to refresh its GPU line-up on June 18th with two new video cards that will feature the first CUDA-enabled graphics core, codenamed D10U. Two models are expected to launch simultaneously: the flagship GeForce GTX 280 (D10U-30) and the GeForce GTX 260 (D10U-20). The first chip will utilize a 512-bit memory bus, 240 stream processors (versus 128 on the 9800 GTX) and support for up to 1 GB of memory. The GTX 260 will be a trimmed-down version with 192 stream processors, a 448-bit bus and up to 896 MB of graphics memory. Both cards will use the PCI-E 2.0 interface and will support NVIDIA's 3-way SLI technology. NVIDIA also promises that the unified shaders of both cards will perform 50% faster than those of previous-generation cards. Compared to the upcoming AMD Radeon 4000 series, the D10U GPU lacks DirectX 10.1 support and is also limited to GDDR3 memory only. NVIDIA's documentation does not list an estimated street price for the new cards.
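The odd-looking 896 MB figure falls straight out of the narrower bus. As a quick sanity check (assuming 32-bit-wide GDDR3 chips of 64 MB each, the common configuration at the time; the chip density is my assumption, not stated in the article):

```python
# Memory size implied by bus width, assuming 32-bit GDDR3 chips
# of 64 MB each (assumed; the article does not state chip density).
def memory_size_mb(bus_width_bits, chip_mb=64, chip_bus_bits=32):
    chips = bus_width_bits // chip_bus_bits  # one chip per 32-bit channel
    return chips * chip_mb

print(memory_size_mb(512))  # GTX 280: 16 chips -> 1024 MB (1 GB)
print(memory_size_mb(448))  # GTX 260: 14 chips -> 896 MB
```

So cutting the GTX 260's bus from 512-bit to 448-bit drops two memory chips, which is exactly why it tops out at 896 MB rather than a round 1 GB.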

View at TechPowerUp Main Site
 
Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200. I might be wrong, but it sounds like the nV card is just a revision of G92 and not a new GPU? I mean, GDDR3 still? They could at least go to GDDR4, preferably GDDR5 like ATI. And still no DX10.1?? No, I'm getting a 4870 this summer; sounds like I'll be getting it cheap as well. Maybe finally I can afford a CrossFire system?? :)
 
Malware, why did you ignore my links with similar information (release date) that I sent you half a month ago?
 
Would have thought DX10.1 would have been implemented; it's a bit like having DX9.0b instead of DX9.0c?
 
As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning; how many DX10.1 titles are there again?
 
Malware, why did you ignore my links with similar information (release date) that I sent you half a month ago?

The only recent PM I have from you is the one with the GIGABYTE Extreme motherboard?
 
Maybe they just don't see any performance improvement from GDDR4 over GDDR3 either. Look at the 8800 GTS G92: it has spectacular performance but still uses GDDR3. When the time comes they will make good use of it, I believe. I think ATI is using GDDR4, but they aren't using it well, because we just can't see any performance improvement... Don't get me wrong, it's just what I think...
 
Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200.

That's always the case. We're drenched in amazing numbers such as "512-bit", "1 GB GDDR4", "320 SPs".

No, I don't think the HD4870 can beat the GTX 280, in raw performance at least; maybe in price, power and other factors.
 
There would probably be more if nV was using DX10.1.


As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning; how many DX10.1 titles are there again?
 
GDDR4 could have been a smart move as it is much more power efficient than GDDR3. Those 16/14 chips of GDDR3 on GTX280/GTX260 are going to suck stupid amounts of power, something like freaking 60-80W for the GDDR3 alone...

65nm - instead of 55nm - is another problem and causes more unnecessary power consumption.

And yet again, nV fails in creating a practical PCB layout. The board used for GTX280/260 is pure horror.
 
As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
And DX10.1 is hardly worth mentioning; how many DX10.1 titles are there again?

Well, you KNOW why there's very little DX10.1 implementation. Look at Assassin's Creed: NV moaned, stamped their feet, and so on to get it removed. DX10.1 is "insignificant" because NV wants it to be. In reality, NV can't implement it, whereas DX10.1 on the 3800s shows massive performance gains when enabled.
 
That's always the case. We're drenched in amazing numbers such as "512-bit", "1 GB GDDR4", "320 SPs".

No, I don't think the HD4870 can beat the GTX 280, in raw performance at least; maybe in price, power and other factors.




I agree :toast:

GT200 rocks ! :rockout:
 
DX10.1 Upgrade?

Hi!

I have one question. If DX10.1 can be removed by a patch, does it mean it works the other way around too? Like upgrading Crysis to DX10.1? Or any other DX10 title.
That would be nice, and I presume not too hard to accomplish (technically).
 
Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200. I might be wrong, but it sounds like the nV card is just a revision of G92 and not a new GPU? I mean, GDDR3 still? They could at least go to GDDR4, preferably GDDR5 like ATI. And still no DX10.1?? No, I'm getting a 4870 this summer; sounds like I'll be getting it cheap as well. Maybe finally I can afford a CrossFire system?? :)

Completely wrong.


GT200 is a FULLY new GPU, and GDDR3 works just about as well as GDDR5. In the end you get the same results, but GDDR3 is easier to exploit fully.


The differences between DX10 and DX10.1 are minimal! Games have only just begun to use DX10, and there are few of them!!
 
Hi!

I have one question. If DX10.1 can be removed by a patch, does it mean it works the other way around too? Like upgrading Crysis to DX10.1? Or any other DX10 title.
That would be nice, and I presume not too hard to accomplish (technically).

NVIDIA owns Crytek now, so there will be no DX10.1 support (Crytek officially confirmed this).
 
It looks great on paper, but how will the wallet look when you buy one of these?

Anyone know the price, or have a hint?
 
JAKra,
Unlikely, but really, anything's possible; of course it's way easier to cut something than to add it.

(...) NV can't implement it whereas DX10.1 on the 3800s shows massive performance gains when enabled.
DX10.1 in AC allows a performance boost when AA is used. Sure.
But then again, it also causes incompatibility with nV GPUs that only support DX10.

Choose now, which would you fix?
NVIDIA owns Crytek now, so there will be no DX10.1 support (Crytek officially confirmed this).
Link please.
 
Completely wrong.


GT200 is a FULLY new GPU, and GDDR3 works just about as well as GDDR5. In the end you get the same results, but GDDR3 is easier to exploit fully.


The differences between DX10 and DX10.1 are minimal! Games have only just begun to use DX10, and there are few of them!!

The GT200 is not new; it's just an improved G80. The memory controller in G80 is not flexible, so they have to use GDDR3 in GT200 too.
 
JAKra,
Unlikely, but really, anything's possible; of course it's way easier to cut something than to add it.


DX10.1 in AC allows a performance boost when AA is used. Sure.
But then again, it also causes incompatibility with nV GPUs that only support DX10.

Choose now, which would you fix?
Link please.

It's not incompatible; it's just that with Vista SP1 installed, nV GPUs don't use the DX10.1 features, but there is no incompatibility.
 
The GT200 is not new; it's just an improved G80. The memory controller in G80 is not flexible, so they have to use GDDR3 in GT200 too.


Sure? Then when will a new GPU come out? They had everybody convinced that it was new!!

DAMN :shadedshu:mad::banghead:
 
Sure? Then when will a new GPU come out? They had everybody convinced that it was new!!

DAMN :shadedshu:mad::banghead:

Don't be sad; the GT200 will be the fastest GPU ever released. The 9900 GTX will be a brutal card, much faster than the 8800 Ultra/9800 GTX :)
 
I very much doubt it's a new G80, as the most recent cards are G92.

I would also discourage the fanboy attitudes already emerging in this thread... get whichever is best; they're both unreleased yet.

also, this comes out on my birthday
mega lol
 
Exavier,
G200 is an evolved G92, which is an evolved G80. So it's more like a "new G80", as it's targeted at the ultra high-end rather than the performance sector like G92.
It's not incompatible; it's just that with Vista SP1 installed, nV GPUs don't use the DX10.1 features, but there is no incompatibility.
Well, obviously nV chips are incompatible with Ubisoft's DX10.1 implementation, as removing it removes the problems with nV GPUs.
 
I very much doubt it's a new G80, as the most recent cards are G92.

G92 is just a revised G80, as RV670 is a revised R600, and RV770 is an improved RV670.
 