
Next-gen NVIDIA GeForce Specifications Unveiled

Joined
May 6, 2005
Messages
2,786
Likes
435
Location
Tre, Suomi Finland
Processor i7 4770K Haswell, watercooled
Motherboard Asus Z87-C2 Maximus VI Formula
Cooling Fuzion V1, MCW60/R2, DDC1/DDCT-01s top, PA120.3, EK200, 3× D12SL-12, liquid metal TIM
Memory 2× 8GB Crucial Ballistix Tactical LP DDR3-1600
Video Card(s) between GPUs
Storage Samsung 840Pro 256@178GB + 4× WD Red 2TB in RAID10 + LaCie Blade Runner 4TB
Display(s) HP ZR30w 30" 2560×1600 (WQXGA) H2-IPS
Case Lian Li PC-A16B
Audio Device(s) Onboard
Power Supply Corsair AX860i
Mouse Logitech PM MX / Contour RollerMouse Red+
Keyboard Logitech diNovo Edge / Logitech Elite Keyboard from 2006
Software W10 x64
Benchmark Scores yes
#51
Joined
Sep 2, 2005
Messages
294
Likes
25
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
#52
Memory manufacturers prefer money. That's all they want. They don't care who buys their products as long as they pay and as long as they can sell ALL their STOCK. Since they have low stock and production isn't high right now, ANY company could buy that amount, so they would sell it to whoever paid more. Could Nvidia pay more than AMD? Maybe (well, sure), but why would they want to? It would make their cards more expensive, but what's worse for them is the REALLY LOW AVAILABILITY. Let's face it, Nvidia has about 66% market share. That's twice what ATI has. If availability is low for ATI, it's much worse for Nvidia. Contrary to what people think, I don't believe Nvidia cares that much about ATI; they care a lot more about their market audience. GDDR5 would make their product a lot more expensive and scarce. They don't want that. Plain and simple.

And the HD4850 WON'T have a GDDR5 version from AMD. They gave partners the choice to use it, so partners can decide whether they want to pay the price premium or not. GDDR5's price is so high that AMD has decided it's not cost-effective for the HD4850. Now, knowing that it's only an underclocked HD4870, think about GDDR5 and tell me in all honesty that it's not just a marketing strategy.
Maybe you're right. But Samsung and Hynix are going to produce GDDR5 too, not only Qimonda. I would bet we will not see an NV card with GDDR5 later :) I don't think GDDR5 is just a marketing strategy (GDDR4 was), but we will see when there are benchmarks on the net :)

Meanwhile I edited my previous post :)
 
Joined
Oct 5, 2007
Messages
1,714
Likes
182
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
#53
Maybe you're right. But Samsung and Hynix are going to produce GDDR5 too, not only Qimonda. I would bet we will not see an NV card with GDDR5 later :) I don't think GDDR5 is just a marketing strategy (GDDR4 was), but we will see when there are benchmarks on the net :)

Meanwhile I edited my previous post :)
But availability NOW is low, and they have to release NOW. In the near future, I don't know. They have already said there's going to be a 55nm refresh soon, and they could add GDDR5 support then, once availability is better.
I know that's something that's going to piss off some people, as if a 55nm version coming out would make their cards worse, but it's going to happen. People will buy >> people will enjoy >> the new 55nm card will launch >> people will complain "why didn't they release 55nm in the first place? ****ng nvidiots". Even though they already knew before launch that it would happen...
 
Joined
Feb 26, 2007
Messages
850
Likes
23
Location
USA
#54
Nvidia sounds like they are getting comfortable where they are as far as designs go. I don't know about the availability of GDDR5, but I do remember the performance increase of GDDR4 wasn't that much better than GDDR3, so Nvidia may not even see it as worth it until GDDR5 is standard/common.

Has anyone ever thought that Nvidia may never go to DX10.1? There are a lot of companies these days that don't like/want to work with MS. Just my 2¢, but I think some of the industry is trying to get away from MS-controlled graphics.
 
Joined
Feb 18, 2005
Messages
1,257
Likes
603
Location
South Africa
System Name Firelance
Processor i5-3570K @ 4.6GHz / 1.19V
Motherboard Gigabyte Z77X-UD5H @ F16h mod BIOS
Cooling Corsair H105 + 4x Gentle Typhoon 1850
Memory 2x 8GB Crucial Ballistix Sport DDR3-1600 CL9 @ CL7
Video Card(s) MSI GTX 1070 Armor OC @ 2000 core / 2300 mem
Storage 2x 256GB Samsung 840 Pro (RAID-0) + Hitachi Deskstar 7K3000 (3TB)
Display(s) Dell U2713HM (25x14) + Acer P243W (19x12)
Case Fractal Design ARC XL
Audio Device(s) Logitech G930
Power Supply Seasonic M12-II Bronze Evo Edition 750W
Mouse Logitech G400
Keyboard Logitech G19
Software Windows 7 Professional x64 Service Pack 1
#55
IMO, by the time we see games that require DirectX 10.1, the D10 GPU will be ancient history.
 
Joined
May 16, 2007
Messages
2,355
Likes
237
Location
Nova Scotia, Canada
System Name Main Rig
Processor Intel Core i7 4770K De-lidded 4.4GHz 1.24v
Motherboard MSI Z87-GD65 Gaming
Cooling NZXT Kraken X61
Memory 8GB(2x4GB) DDR3-1600 CL8 Crucial Ballistx Tracer Tactical
Video Card(s) XFX AMD Radeon Fury X
Storage 120GB Samsung 840 - 250GB 840 EVO - 640GB WD Black - 2TB Seagate
Display(s) LG 27" 4K Freesync IPS - LG 29" Ultrawide 21:9
Case Fractal Design Arc Midi R2
Audio Device(s) Onboard - Focusrite Scarlett 2i2
Power Supply Corsair 650TX
Software Windows 10 64Bit
#56
So, I don't get why people are saying the R770 is just a beefed-up R670...

The R7xx was in development before the R600 was even released; AMD said they were putting all their focus on the R7xx. The R770 is all new... AMD has confirmed the above.

And the GT200 is also all-new; both cards look amazing on paper, just like the G80 and R600 did. Remember how many people thought the R600 was gonna lay down the law :p when they saw the specs? This is no different: the specs look much better, just as the R600's looked better on paper... but that doesn't always translate to real-world performance. All we can do is wait and see.

PS: "R770/GT200 rulezzz!!"... is just 97% fanboy crap...
 
Joined
Apr 7, 2008
Messages
632
Likes
64
Location
Australia
System Name _Speedforce_ (Successor to Strike-X, 4LI3NBR33D-H, Core-iH7 & Nemesis-H)
Processor Intel Core i9 7900X (Lapped) @ 4.9Ghz With XSPC Raystorm (Lapped)
Motherboard Asus Prime X299 Deluxe (XSPC Watercooled) - Custom Heatsinks
Cooling XSPC Custom Water Cooling + Custom Air Cooling (From Delta 120's TFB1212GHE to Spal 30101504&5)
Memory 8x 8Gb Corsair Dominator Platinum 3400MHz @ 3667Mhz (CMU32GX4M4C3466C16)
Video Card(s) 3x Asus GTX1080 Ti (Lapped) With Customised EK Waterblock (Lapped) + Custom heatsinks (Lapped)
Storage 5x Samsung 960 Pro 1Tb M.2 2280 (Hyper M.2 x16 Card), 1x Samsung 850 Pro 1Tb, 6x Samsung EVO 850 4Tb
Display(s) 6x Asus ROG Swift PG27AQ
Case Aerocool Strike X (Modified)
Audio Device(s) Creative Sound BlasterX AE-5 & Aurvana XFi Headphones
Power Supply 2x Corsair AX1500i With Custom Sheilding, Custom Switching Unit. Braided Cables.
Mouse Razer Copperhead + R.A.T 9
Keyboard Ideazon Zboard + Optimus Maximus. Logitech G13.
Software w10 Pro x64.
Benchmark Scores pppft, gotta see it to believe it. . .
#57
Latency vs. bandwidth? It's a wait-and-see story.
I have to purchase 2x 4870X2s because I decided that a single 3870X2 would do the job in the ATi system I have. That won't stop me from upgrading my NV system. I wouldn't mind playing with CUDA on the EN8800GTX before I throw the card away.

I look forward to the GDDR5 bandwidth being utilized efficiently by AMD/ATi because it's the way of the future! And I suspect the reasons Nvidia haven't moved onto GDDR5 are:
* Cheaper RAM modules for a well-aged technology with better latency, hoping to keep the price competitive with AMD/ATi's cards.
* To allow themselves to make as much money as possible off GDDR3 technology now that they've got CUDA working, before the public designs crazy C-based software for the rest of us, which might give them a greater sales advantage in the next round of releases.

Either way, I'm looking at big bucks; we all are...
 
Joined
Mar 1, 2008
Messages
242
Likes
46
Location
Antwerp, Belgium
Processor Intel Xeon X5650 @ 3.6Ghz - 1.2v
Motherboard Gigabyte G1.Assassin
Cooling Thermalright True Spirit 120
Memory 12GB DDR3 @ PC1600
Video Card(s) nVidia GeForce GTX 780 3GB
Storage 256GB Samsung 840 Pro + 3TB + 3TB + 2TB
Display(s) HP ZR22w
Case Antec P280
Audio Device(s) Asus Xonar DS
Power Supply Antec HCG-620M
Software Windows 7 x64
#58
Are the GT200 and R700 new GPUs or not?
Well, the basic designs aren't. The actual GPUs are new, of course.

History:
R300 => R420 => R520
ATI used the basic R300 design from August 2002 until the R600 was released (May 2007, though it should have been on the market six months earlier without the delay).

NV40 => G70
nVidia used NV40 technology from April 2004 until November 2006.

So it's quite common to use a certain technology for a couple of generations. This will be even more pronounced with the current generation of GPUs because of the increased complexity of unified shaders.
It takes 2 to 3 years to design a GPU like the R300, NV40, R600 or G80. After that you get the usual updates. Even a process shrink, say 65nm to 45nm, takes almost a year without even touching the design. These companies manage to hide this time because they have multiple design teams working in parallel.
The same thing happens with CPUs. Look at K8 and K10. Look at Conroe and Penryn.
Expect really new architectures from ATI and nVidia somewhere in 2009, maybe even later, and they will be DX11.
 

warhammer

New Member
Joined
Jan 10, 2008
Messages
204
Likes
25
Processor Q6600@3.6
Motherboard Evga 680i
Cooling H20
Memory 2GB DDR2
Video Card(s) 8800GTS 512 SLI
Storage 4x320gig
Display(s) 21CRT
Case ARMOR
Audio Device(s) SB
Power Supply 750W
Software VISTA ULTIMATE
#59
Whichever new card, ATI or Nvidia, can crack 100+ FPS in Crysis maxed out will be the winner.
The price is going to be the KILLER. :cry:
 
Joined
May 17, 2008
Messages
490
Likes
29
#60
Are the GT200 and R700 new GPUs or not?
Well, the basic designs aren't. The actual GPUs are new, of course.

History:
R300 => R420 => R520
ATI used the basic R300 design from August 2002 until the R600 was released (May 2007, though it should have been on the market six months earlier without the delay).

NV40 => G70
nVidia used NV40 technology from April 2004 until November 2006.

So it's quite common to use a certain technology for a couple of generations. This will be even more pronounced with the current generation of GPUs because of the increased complexity of unified shaders.
It takes 2 to 3 years to design a GPU like the R300, NV40, R600 or G80. After that you get the usual updates. Even a process shrink, say 65nm to 45nm, takes almost a year without even touching the design. These companies manage to hide this time because they have multiple design teams working in parallel.
The same thing happens with CPUs. Look at K8 and K10. Look at Conroe and Penryn.
Expect really new architectures from ATI and nVidia somewhere in 2009, maybe even later, and they will be DX11.
Actually, Conroe was based on Core, which was based on the Pentium M, which was based on the P3. If you check some CPU ID apps, the Core 2 chips come up as Pentium 3 multi-core chips. Imagine if they had stuck with the P3 instead of moving to the Pentium 4...
 

sethk

New Member
Joined
Apr 14, 2007
Messages
63
Likes
5
#61
The combination of GDDR5 and 512-bit would have been too much for the consumer to bear, cost-wise. There's plenty of GDDR3, and with a bus twice as wide there's no need to clock the memory particularly high. Think about it: who's going to be supply constrained?

Once (if?) GDDR5 is plentiful, Nvidia will come out with a lower-cost redesign that's GDDR5 and 256-bit, or some odd bus width like 320 or 384. Just like G92 was able to take the much more expensive G80 design and get equivalent performance at 256-bit, we will see something similar for the GT200. In the meanwhile, make no mistake: this is the true successor to the G80, going for the high-end crown.
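
To put rough numbers on that trade-off, here's a quick back-of-the-envelope sketch in Python. The clocks are assumptions taken from the figures being rumored for these cards, not confirmed specs; only the formula (transfers per second times bytes per transfer) is certain.

# Peak memory bandwidth: a sketch with assumed, not confirmed, memory clocks.
def mem_bandwidth_gbs(effective_mts, bus_width_bits):
    """GB/s = (millions of transfers per second) * (bytes moved per transfer)."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

# GDDR3 is double data rate: 2 transfers per clock. ~1100 MHz assumed.
gddr3_512bit = mem_bandwidth_gbs(2 * 1100, 512)

# GDDR5 moves 4 bits per pin per command clock. ~900 MHz assumed.
gddr5_256bit = mem_bandwidth_gbs(4 * 900, 256)

print(f"512-bit GDDR3: {gddr3_512bit:.1f} GB/s")  # ~140.8 GB/s
print(f"256-bit GDDR5: {gddr5_256bit:.1f} GB/s")  # ~115.2 GB/s

On those assumed clocks the wide GDDR3 bus actually comes out ahead, which is the point being made above: the 512-bit bus buys the bandwidth without exotic memory.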

I'm sure we'll also see the GX2 version of this before year's end.
 

bill_d

New Member
Joined
Mar 9, 2008
Messages
35
Likes
0
Processor 3930k
Motherboard Asus Rampage lV Extreme
Cooling water
Memory 16gb
Video Card(s) xfx 6970 cf water
Display(s) hp lp3065
Software windows 7 64
#62
How are you going to get 500+ watts to a 280 GX2?
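
As context for that question, here's a minimal sketch of the board power budget the PCI Express specs allow: 75 W through the x16 slot, 75 W per 6-pin connector and 150 W per 8-pin connector. The connector limits are from the spec; the 500+ W figure is the hypothetical dual-GPU card from the post above.

# Board power budget under the PCI-E spec limits (slot + auxiliary connectors).
PCIE_SLOT_W = 75    # power available through the x16 slot itself
SIX_PIN_W   = 75    # per 6-pin auxiliary connector
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector

def board_budget_w(six_pins=0, eight_pins=0):
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget_w(six_pins=1, eight_pins=1))  # 300 W, the usual high-end ceiling
print(board_budget_w(eight_pins=2))              # 375 W, still well short of 500+ W

Even with two 8-pin connectors a card tops out around 375 W in-spec, which is why a 500+ W dual-GPU board would need either a much leaner chip or, as joked below, its own PSU.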
 
Joined
May 19, 2007
Messages
7,662
Likes
536
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
#63
Its own built-in PSU, maybe?
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
34,312
Likes
17,423
Location
Hyderabad, India
System Name Long shelf-life potato
Processor Intel Core i7-4770K
Motherboard ASUS Z97-A
Cooling Xigmatek Aegir CPU Cooler
Memory 16GB Kingston HyperX Beast DDR3-1866
Video Card(s) 2x GeForce GTX 970 SLI
Storage ADATA SU800 512GB
Display(s) Samsung U28D590D 28-inch 4K
Case Cooler Master CM690 Window
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Corsair HX850W
Mouse Razer Abyssus 2014
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro Creators Update
#64
How are you going to get 500+ watts to a 280 GX2?
Nvidia never made a dual-GPU card using the G80, and ATI never made one using the R600 either. I think there will be a toned-down GPU derived from the GT200 that will make it into the next dual-GPU card from NV. By 'toned-down' I'm referring to what the G92 and RV670 were to the G80 and R600.
 

bill_d

New Member
Joined
Mar 9, 2008
Messages
35
Likes
0
Processor 3930k
Motherboard Asus Rampage lV Extreme
Cooling water
Memory 16gb
Video Card(s) xfx 6970 cf water
Display(s) hp lp3065
Software windows 7 64
#65
Maybe, but I think you won't see a new GX2 until they move to 55nm and, if they're lucky, get power savings like ATI did going from the 2900 to the 3870.
 
Joined
Aug 30, 2006
Messages
6,374
Likes
983
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
#66
Joined
May 17, 2008
Messages
490
Likes
29
#67
Damn you lemonadesoda, you stole my bit. I've been using that 2nd link for weeks/months now, well, versions of it......... :)
 
Joined
Oct 5, 2007
Messages
1,714
Likes
182
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
#68
Damn you lemonadesoda, you stole my bit. I've been using that 2nd link for weeks/months now, well, versions of it......... :)
and damn you Rebo, I felt the urge to click on that second link after your post, even though I didn't want to do so. :D

It's been removed BTW.

The combination of GDDR5 and 512-bit would have been too much for the consumer to bear, cost-wise. There's plenty of GDDR3, and with a bus twice as wide there's no need to clock the memory particularly high. Think about it: who's going to be supply constrained?

Once (if?) GDDR5 is plentiful, Nvidia will come out with a lower-cost redesign that's GDDR5 and 256-bit, or some odd bus width like 320 or 384. Just like G92 was able to take the much more expensive G80 design and get equivalent performance at 256-bit, we will see something similar for the GT200. In the meanwhile, make no mistake: this is the true successor to the G80, going for the high-end crown.

I'm sure we'll also see the GX2 version of this before year's end.
Exactly what I was saying. For Nvidia, supply is a very important thing. The 8800 GT was an exception in a long history of delivering plenty of cards at launch. Paper launches are ATI's business, not Nvidia's; don't forget that, guys.
 
Joined
Jan 11, 2005
Messages
1,086
Likes
357
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor Phenom II x4 BE 970-3.6 Ghz
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) MSI HD6850 Cyclone;860/1100 Mhz
Storage SSD-840 pro 128 Gb;Seagate 500Gb;WD-1Tb
Display(s) Samsung 2032BW
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Cooleer Master RP M520
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win7 64bit sp1
Benchmark Scores irrelevant for me
#71
It's normal that AMD/ATI will use GDDR5 and Nvidia won't, mainly because ATI has promoted and invested in the research and production, so all the RAM manufacturers will serve ATI first with the new tech. And I think this two-step lead will remain between them.
 

InnocentCriminal

Resident Grammar Amender
Joined
Feb 21, 2005
Messages
6,477
Likes
844
System Name BeeR 6
Processor Intel Core i7 3770K*
Motherboard ASUS Maximus V Gene (1155/Z77)
Cooling Corsair H100i
Memory 16GB Samsung Green 1600MHz DDR3**
Video Card(s) 4GB MSI Gaming X RX480
Storage 256GB Samsung 840 Pro SSD
Display(s) 27" Samsung C27F591FDU
Case Fractal Design Arc Mini
Power Supply Corsair HX750W
Software 64bit Microsoft Windows 10 Pro
Benchmark Scores *@ 4.6GHz **@ 2133MHz
#72
It's interesting to see that the 280 will be using a 512-bit memory bus; that alone should help performance. ATi should have implemented it in the 4870 (X2).
 
Joined
Mar 1, 2008
Messages
242
Likes
46
Location
Antwerp, Belgium
Processor Intel Xeon X5650 @ 3.6Ghz - 1.2v
Motherboard Gigabyte G1.Assassin
Cooling Thermalright True Spirit 120
Memory 12GB DDR3 @ PC1600
Video Card(s) nVidia GeForce GTX 780 3GB
Storage 256GB Samsung 840 Pro + 3TB + 3TB + 2TB
Display(s) HP ZR22w
Case Antec P280
Audio Device(s) Asus Xonar DS
Power Supply Antec HCG-620M
Software Windows 7 x64
#73
Actually, Conroe was based on Core, which was based on the Pentium M, which was based on the P3. If you check some CPU ID apps, the Core 2 chips come up as Pentium 3 multi-core chips. Imagine if they had stuck with the P3 instead of moving to the Pentium 4...
Well, I didn't want to go into the details of CPUs. I was just making a reference.
 
Joined
Oct 5, 2007
Messages
1,714
Likes
182
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
#74
Imagine if they had stuck with the P3 instead of moving to the Pentium 4...
That's something that has been debated a lot. IMO the P4 was a good decision at the time, but it stayed around too long. The P3 had hit a wall and the P4 was the only way they saw to get past it. It's like a jam on the highway: sometimes your lane doesn't move and the next one does, so you change lanes. A bit later your lane stops and you see your previous lane moving faster, but you can't change back right away. In the end you always come back, but the question remains whether you would have advanced more by staying put in the first place. Usually, if you're smart and lucky enough, you advance more by changing lanes. It didn't work for Intel. Or it did; actually, there's no way of knowing.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
34,312
Likes
17,423
Location
Hyderabad, India
System Name Long shelf-life potato
Processor Intel Core i7-4770K
Motherboard ASUS Z97-A
Cooling Xigmatek Aegir CPU Cooler
Memory 16GB Kingston HyperX Beast DDR3-1866
Video Card(s) 2x GeForce GTX 970 SLI
Storage ADATA SU800 512GB
Display(s) Samsung U28D590D 28-inch 4K
Case Cooler Master CM690 Window
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Corsair HX850W
Mouse Razer Abyssus 2014
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro Creators Update
#75
Right, it's back to discussing two companies that are pregnant and whose baby is better. Like with babies, things look better after the birth; scans and graphs are always blurry.