
Next-gen NVIDIA GeForce Specifications Unveiled

Joined
May 6, 2005
Messages
2,792 (0.40/day)
Location
Tre, Suomi Finland
System Name Ladpot ◦◦◦ Desktop
Processor R7 5800H ◦◦◦ i7 4770K, watercooled
Motherboard HP 88D2 ◦◦◦ Asus Z87-C2 Maximus VI Formula
Cooling Mixed gases ◦◦◦ Fuzion V1, MCW60/R2, DDC1/DDCT-01s top, PA120.3, EK200, D12SL-12, liq.metal TIM
Memory 2× 8GB DDR4-3200 ◦◦◦ 2× 8GB Crucial Ballistix Tactical LP DDR3-1600
Video Card(s) RTX 3070 ◦◦◦ heaps of dead GPUs in the garage
Storage Samsung 980 PRO 2TB ◦◦◦ Samsung 840Pro 256@178GB + 4× WD Red 2TB in RAID10 + LaCie Blade Runner 4TB
Display(s) HP ZR30w 30" 2560×1600 (WQXGA) H2-IPS
Case Lian Li PC-A16B
Audio Device(s) Onboard
Power Supply Corsair AX860i
Mouse Logitech MX Master 2S / Contour RollerMouse Red+
Keyboard Logitech Elite Keyboard from 2006 / Contour Balance Keyboard / Logitech diNovo Edge
Software W11 x64 ◦◦◦ W10 x64
Benchmark Scores It does boot up? I think.
I'm talking about the DX10.1 implementation in the game, not in SP1. DX10.1 code in AC causes problems with nV GPUs. That's why it was removed by Ubisoft.
 

HTC

Joined
Apr 1, 2008
Messages
4,604 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
IMHO, not having DX10.1 is a shot in the foot in the long run.

Sure, right now there aren't many games for it, but that will change, and when it does, ATI will be prepared and nVidia won't.
 

PVTCaboose1337

Graphical Hacker
Joined
Feb 1, 2006
Messages
9,501 (1.43/day)
Location
Texas
System Name Whim
Processor Intel Core i5 2500k @ 4.4ghz
Motherboard Asus P8Z77-V LX
Cooling Cooler Master Hyper 212+
Memory 2 x 4GB G.Skill Ripjaws @ 1600mhz
Video Card(s) Gigabyte GTX 670 2gb
Storage Samsung 840 Pro 256gb, WD 2TB Black
Display(s) Shimian QH270 (1440p), Asus VE228 (1080p)
Case Cooler Master 430 Elite
Audio Device(s) Onboard > PA2V2 Amp > Senn 595's
Power Supply Corsair 750w
Software Windows 8.1 (Tweaked)
Caboose senses some fail. No DX 10.1... fail. GDDR3? Fail...
 
Joined
Mar 29, 2007
Messages
4,838 (0.78/day)
System Name Aquarium
Processor Ryzen 9 7950x
Motherboard ROG Strix X670-E
Cooling Lian Li Galahead 360 AIO
Memory 2x16gb Flare X5 Series 32GB (2 x 16GB) DDR5-6000 PC5-48000
Video Card(s) Asus RTX 3060
Storage 2TB WD SN850X Black NVMe, 500GB Samsung 970 NVMe
Display(s) Gigabyte 32" IPS 144Hz
Case Hyte Y60
Power Supply Corsair RMx 850
Software Win 11 Pro/ PopOS!
I wouldn't say fail. Nvidia is getting a 512-bit bus, Ati is going for GDDR5 memory. Both are expensive ways to increase bandwidth, and both will be great. What's strange to me is that Nvidia keeps making strange memory amounts. 896 MB of memory? Odd, though I'm sure the math works out.

And who cares about 10.1? We still don't have a native DX10 game, and the improvements from 10.1 will, I'm sure, be minor. I seem to remember a thread here where everyone seemed to think there wasn't much difference between DX9 and DX10, and now everyone's complaining about 10.1. Methinks some would rather find the bad and complain than find the good and rejoice. :shadedshu
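For anyone wondering how the math works out, here is a minimal sketch. The chip density (512 Mbit per GDDR3 device, one device per 32-bit channel) and the data rates are assumptions based on parts typical at the time, not confirmed specs for these cards:

```python
# Rough sketch: how "odd" frame-buffer sizes and bandwidth follow from bus width.
# Assumed: one 32-bit GDDR3 device per channel, 512 Mbit per device (illustrative).

def memory_size_mb(bus_width_bits, chip_density_mbit=512):
    """Each 32-bit device hangs off its own slice of the memory bus."""
    chips = bus_width_bits // 32
    return chips * chip_density_mbit // 8  # Mbit -> MB

def bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Peak bandwidth = bus width in bytes * effective transfer rate."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

print(memory_size_mb(448))       # 896 -> a 448-bit card naturally ends up with 896 MB
print(memory_size_mb(512))       # 1024
print(bandwidth_gbs(512, 2200))  # ~140.8 GB/s: wide bus, GDDR3-class data rate
print(bandwidth_gbs(256, 3600))  # ~115.2 GB/s: narrower bus, GDDR5-class data rate
```

So an 896 MB figure is just what falls out of a 448-bit bus; the "odd" number comes from the bus width, not from any unusual chips.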
 
Last edited:
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
I'm talking about the DX10.1 implementation in the game, not in SP1. DX10.1 code in AC causes problems with nV GPUs. That's why it was removed by Ubisoft.

I never heard about that. I'm sure it would have been mentioned in the HardOCP article, but it's not, so I don't think it's true.
 
Joined
Feb 8, 2008
Messages
2,665 (0.45/day)
Location
Switzerland
Processor i9 9900KS ( 5 Ghz all the time )
Motherboard Asus Maximus XI Hero Z390
Cooling EK Velocity + EK D5 pump + Alphacool full copper silver 360mm radiator
Memory 16GB Corsair Dominator GT ROG Edition 3333 Mhz
Video Card(s) ASUS TUF RTX 3080 Ti 12GB OC
Storage M.2 Samsung NVMe 970 Evo Plus 250 GB + 1TB 970 Evo Plus
Display(s) Asus PG279 IPS 1440p 165Hz G-sync
Case Cooler Master H500
Power Supply Asus ROG Thor 850W
Mouse Razer Deathadder Chroma
Keyboard Rapoo
Software Win 10 64 Bit
Guys, I found new pics. This is how the GTX 280 looks:
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
The GT200 is not new, it's just an improved G80. The memory controller in G80 is not flexible, so they have to use GDDR3 in GT200 too.

It's because the GDDR5 supply won't be enough for both companies. It's not even enough for Ati; indeed, they dropped it from the HD4850 AND reduced the HD4870's frame buffer to 512 MB because of this same thing. If Nvidia tried to fight for GDDR5 too, prices would go up >>> worse for consumers.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
GT200 is as much a "new chip" as RV770 is. There's nothing new in RV770 that isn't in RV670 besides GDDR5 support. And that means nothing; it's just e-penis and marketing.

Indeed, if what has been said about the Shader Processors is true, GT200 is more "new" or "advanced/improved" relative to G92 than RV770 is relative to RV670. Making the SPs 50% more efficient and faster IS what I call an IMPROVED architecture, not adding GDDR5 memory support that is not going to be used anyway. Though I could say the same about the 512-bit memory interface.

What is it that has improved so much otherwise? SPs running faster than the core? 50% more of them? Double the TMUs?

No, time for a reality check, guys. There's no innovation in any of the new chips.
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
It's because the GDDR5 supply won't be enough for both companies. It's not even enough for Ati; indeed, they dropped it from the HD4850 AND reduced the HD4870's frame buffer to 512 MB because of this same thing. If Nvidia tried to fight for GDDR5 too, prices would go up >>> worse for consumers.

I don't think so. If the G80/G92/GT200 memory controller could use GDDR5, it could obviously use GDDR4 too. But we never saw a G80 or G92 with GDDR4. (I know GDDR4 isn't faster than GDDR3, but if G80 could use GDDR4 we would have seen it already, just like the 2 GB 9600 GT: it doesn't make any sense, but many people don't know that. It's just marketing.)

(I know my English is a bit crap, but I hope you'll understand what I wrote.)
 
Joined
May 6, 2005
Messages
2,792 (0.40/day)
Location
Tre, Suomi Finland
System Name Ladpot ◦◦◦ Desktop
Processor R7 5800H ◦◦◦ i7 4770K, watercooled
Motherboard HP 88D2 ◦◦◦ Asus Z87-C2 Maximus VI Formula
Cooling Mixed gases ◦◦◦ Fuzion V1, MCW60/R2, DDC1/DDCT-01s top, PA120.3, EK200, D12SL-12, liq.metal TIM
Memory 2× 8GB DDR4-3200 ◦◦◦ 2× 8GB Crucial Ballistix Tactical LP DDR3-1600
Video Card(s) RTX 3070 ◦◦◦ heaps of dead GPUs in the garage
Storage Samsung 980 PRO 2TB ◦◦◦ Samsung 840Pro 256@178GB + 4× WD Red 2TB in RAID10 + LaCie Blade Runner 4TB
Display(s) HP ZR30w 30" 2560×1600 (WQXGA) H2-IPS
Case Lian Li PC-A16B
Audio Device(s) Onboard
Power Supply Corsair AX860i
Mouse Logitech MX Master 2S / Contour RollerMouse Red+
Keyboard Logitech Elite Keyboard from 2006 / Contour Balance Keyboard / Logitech diNovo Edge
Software W11 x64 ◦◦◦ W10 x64
Benchmark Scores It does boot up? I think.
I never heard about that. I'm sure it would have been mentioned in the HardOCP article, but it's not, so I don't think it's true.
In the beginning, everything looked perfect. The DX10.1 API included in Assassin's Creed enabled anti-aliasing in a single pass, which allowed ATI Radeon HD 3000 hardware (which supports DX10.1) to flaunt a competitive advantage over Nvidia (which supports only DX10.0). But Assassin's Creed had problems. We noticed various reports citing stability issues such as widescreen scaling, camera loops and crashes - mostly on Nvidia hardware.

(...)

So, what is it that convinced Ubisoft to drop the DirectX 10.1 code path? Here is the official explanation:

“We’re planning to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.”
http://www.tgdaily.com/content/view/37326/98/
 

CDdude55

Crazy 4 TPU!!!
Joined
Jul 12, 2007
Messages
8,178 (1.33/day)
Location
Virginia
System Name CDdude's Rig!
Processor AMD Athlon II X4 620
Motherboard Gigabyte GA-990FXA-UD3
Cooling Corsair H70
Memory 8GB Corsair Vengence @1600mhz
Video Card(s) XFX HD 6970 2GB
Storage OCZ Agility 3 60GB SSD/WD Velociraptor 300GB
Display(s) ASUS VH232H 23" 1920x1080
Case Cooler Master CM690 (w/ side window)
Audio Device(s) Onboard (It sounds fine)
Power Supply Corsair 850TX
Software Windows 7 Home Premium 64bit SP1
Might have to pick me up one of those new Nvidia cards. First I need my stimulus check to come. I also don't care if my CPU bottlenecks it; I'll still get the raw performance. ;)
 

DrPepper

The Doctor is in the house
Joined
Jan 16, 2008
Messages
7,482 (1.26/day)
Location
Scotland (It rains alot)
System Name Rusky
Processor Intel Core i7 D0 3.8Ghz
Motherboard Asus P6T
Cooling Thermaltake Dark Knight
Memory 12GB Patriot Viper's 1866mhz 9-9-9-24
Video Card(s) GTX470 1280MB
Storage OCZ Summit 60GB + Samsung 1TB + Samsung 2TB
Display(s) Sharp Aquos L32X20E 1920 x 1080
Case Silverstone Raven RV01
Power Supply Corsair 650 Watt
Software Windows 7 x64
Benchmark Scores 3DMark06 - 18064 http://img.techpowerup.org/090720/Capture002.jpg
Completely wrong.


GT200 is a FULLY new GPU, and GDDR3 works about as well as GDDR5. In the end you get the same results, but GDDR3 is more exploitable.


The differences between DX10 and DX10.1 are minimal! Games have only just begun to use DX10, and there are few of them!!

He said it SOUNDS better on paper. How do you know GDDR3 works better than GDDR5 when GDDR5 hasn't been implemented on a card yet? Also, if GDDR4 is not better than GDDR3, would it not be just as right to assume that GDDR3 isn't better than GDDR2? And just because all games don't use 10.1 doesn't mean Nvidia shouldn't be innovative and implement it, because soon all games will adopt it, just like DirectX 9.0c.
 
Last edited:

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I don't think so. If g80/g92/gt200 memcontroller could use ddr5 it's obviously it could use gddr4. But we didn't see g80 or g92 with gddr4. (i know gddr4 isn't faster than gddr3, but if g80 could use ddr4 we would've seen that already, just like the 2gb 9600gt, it doesn't make any sense, but many people don't know that. It's just marketing.)

(i know my English a bit crap, but i hope you'll understand what i wrote)

I can say the same thing I said about GDDR5, plus GDDR4 has proved not to be better than GDDR3. So why use it, if not for marketing? GDDR3 is just as good and it's cheaper, so that's what you use. There's no incompatibility issue; they could use it if they wanted, but I'm sure they would have to pay royalties for a performance gain that doesn't exist. Same for GDDR5 and DX10.1.

People like to mention "conspiracy" theories about TWIMTBP, so I'm going to offer one that I have been thinking about for some time regarding DX10.1 and why Nvidia doesn't want to implement it. There are many "hints" out there suggesting to me that MS and Ati developed the DX10.1 (even DX10) specifications together. And it is very likely that Ati filed many patents on its hardware implementation long before Nvidia even knew what DX10 was going to look like. As some have suggested, DX10.1 is what DX10 was going to be before Nvidia made their suggestions, i.e. what Ati wanted it to be. So now Nvidia has to pay if they want to implement it. Don't ask me for proof, since I have exactly as much as those who say Nvidia pays developers to make Nvidia hardware faster. That is: NONE.
 
Joined
May 9, 2006
Messages
2,116 (0.32/day)
System Name Not named
Processor Intel 8700k @ 5Ghz
Motherboard Asus ROG STRIX Z370-E Gaming
Cooling DeepCool Assassin II
Memory 16GB DDR4 Corsair LPX 3000mhz CL15
Video Card(s) Zotac 1080 Ti AMP EXTREME
Storage Samsung 960 PRO 512GB
Display(s) 24" Dell IPS 1920x1200
Case Fractal Design R5
Power Supply Corsair AX760 Watt Fully Modular
What's with all this fighting? This should be a time of celebration, when we have another fancy/expensive card coming out that we can buy. If Nvidia thought they needed faster RAM, they would have used it; engineers are not stupid, after all. As for the whole DX10.1 thing, that sounds like a discussion for another thread, perhaps even in General Nonsense given the huge amount of flaming and fanboyism.
 

DrPepper

The Doctor is in the house
Joined
Jan 16, 2008
Messages
7,482 (1.26/day)
Location
Scotland (It rains alot)
System Name Rusky
Processor Intel Core i7 D0 3.8Ghz
Motherboard Asus P6T
Cooling Thermaltake Dark Knight
Memory 12GB Patriot Viper's 1866mhz 9-9-9-24
Video Card(s) GTX470 1280MB
Storage OCZ Summit 60GB + Samsung 1TB + Samsung 2TB
Display(s) Sharp Aquos L32X20E 1920 x 1080
Case Silverstone Raven RV01
Power Supply Corsair 650 Watt
Software Windows 7 x64
Benchmark Scores 3DMark06 - 18064 http://img.techpowerup.org/090720/Capture002.jpg
:toast: Good idea, man. I hate getting wrapped up reading these posts and feeling like I need to say something.
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
Joined
May 6, 2005
Messages
2,792 (0.40/day)
Location
Tre, Suomi Finland
System Name Ladpot ◦◦◦ Desktop
Processor R7 5800H ◦◦◦ i7 4770K, watercooled
Motherboard HP 88D2 ◦◦◦ Asus Z87-C2 Maximus VI Formula
Cooling Mixed gases ◦◦◦ Fuzion V1, MCW60/R2, DDC1/DDCT-01s top, PA120.3, EK200, D12SL-12, liq.metal TIM
Memory 2× 8GB DDR4-3200 ◦◦◦ 2× 8GB Crucial Ballistix Tactical LP DDR3-1600
Video Card(s) RTX 3070 ◦◦◦ heaps of dead GPUs in the garage
Storage Samsung 980 PRO 2TB ◦◦◦ Samsung 840Pro 256@178GB + 4× WD Red 2TB in RAID10 + LaCie Blade Runner 4TB
Display(s) HP ZR30w 30" 2560×1600 (WQXGA) H2-IPS
Case Lian Li PC-A16B
Audio Device(s) Onboard
Power Supply Corsair AX860i
Mouse Logitech MX Master 2S / Contour RollerMouse Red+
Keyboard Logitech Elite Keyboard from 2006 / Contour Balance Keyboard / Logitech diNovo Edge
Software W11 x64 ◦◦◦ W10 x64
Benchmark Scores It does boot up? I think.
Making SPs 50% more efficient and faster IS what I call IMPROVED architecture (...)
There's no reason to believe the SPs are more efficient. And in fact, the exact quote is:
NVIDIA also promises that the unified shaders of both cards are to perform 50% faster than previous generation cards.
"Faster" would more than likely just mean they run at 1.5× the frequency of previous-generation shaders.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
There's no reason to believe the SPs are more efficient. And in fact, the exact quote is:
"Faster" would more than likely just mean they run at 1.5× the frequency of previous-generation shaders.

I don't know where I read it, but they said "efficient".
Also, DailyTech, at the OP link, says:

NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

You would just say "run 50% faster", not "second-generation" and "perform 50% better", if that were the case. I'm not taking it as fact, but IMO Nvidia and DailyTech are in the end saying "more efficient". The other site I mentioned (I can't remember which; I read 20+ tech sites each day) used the word "efficient". If that ends up being true, that's another story.

EDIT: Also, I think it's a lot more probable that the shaders are more "efficient" (e.g. by adding another ALU, I don't know) than that they run at 2400+ MHz. The card is still 65 nm, correct me if I'm wrong, and 2400 MHz is not going to happen at 65 nm on a reference design.
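As a quick sanity check on that, here is a minimal sketch; the shader clocks are from shipping 65 nm G92 cards, and the 1.5× multiplier is only the interpretation being tested, not a confirmed spec:

```python
# Sanity check: what "1.5x the shader clock" would imply for a 65 nm part.
# Shader clocks below are from shipping G92 cards (8800 GT, 8800 GTS 512, 9800 GTX).
g92_shader_clocks_mhz = [1500, 1625, 1688]

for clk in g92_shader_clocks_mhz:
    print(f"{clk} MHz x 1.5 = {clk * 1.5:.0f} MHz")

# Results land around 2250-2530 MHz, well beyond anything shipped on 65 nm,
# which is why "50% better" reads more plausibly as a per-clock improvement.
```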
 
Last edited:
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
I can say the same thing I said about GDDR5, plus GDDR4 has proved not to be better than GDDR3. So why use it, if not for marketing? GDDR3 is just as good and it's cheaper, so that's what you use. There's no incompatibility issue; they could use it if they wanted, but I'm sure they would have to pay royalties for a performance gain that doesn't exist. Same for GDDR5 and DX10.1.

People like to mention "conspiracy" theories about TWIMTBP, so I'm going to offer one that I have been thinking about for some time regarding DX10.1 and why Nvidia doesn't want to implement it. There are many "hints" out there suggesting to me that MS and Ati developed the DX10.1 (even DX10) specifications together. And it is very likely that Ati filed many patents on its hardware implementation long before Nvidia even knew what DX10 was going to look like. As some have suggested, DX10.1 is what DX10 was going to be before Nvidia made their suggestions, i.e. what Ati wanted it to be. So now Nvidia has to pay if they want to implement it. Don't ask me for proof, since I have exactly as much as those who say Nvidia pays developers to make Nvidia hardware faster. That is: NONE.


They would use GDDR4 if they could! Just for marketing! (Not on reference boards.) Just like the 2 GB 9600 GT: it doesn't make any sense, but it sounds good -> people would buy it. An 8800 GT with 512 MB of GDDR4 sounds good -> people would buy it (a lot of people think higher is better).
But they can't use GDDR4, because G80 doesn't support GDDR4 (or GDDR5). I can't explain myself any better.

The DX10(.1) specs were available to every manufacturer early on; I don't think they were kept secret from Nvidia. Even S3 has a DX10.1 card.
 
Last edited:

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
They would use GDDR4 if they could! Just for marketing! (Not on reference boards.) Just like the 2 GB 9600 GT: it doesn't make any sense, but it sounds good -> people would buy it. An 8800 GT with 512 MB of GDDR4 sounds good -> people would buy it (a lot of people think higher is better).
But they can't use GDDR4, because G80 doesn't support GDDR4 (or GDDR5). I can't explain myself any better.

The DX10(.1) specs were available to every manufacturer early on; I don't think they were kept secret from Nvidia. Even S3 has a DX10.1 card.

You didn't understand me. G80/G92 can't use GDDR4, so they can't use GDDR4 on those cards. But Nvidia CAN!!! There's nothing special about implementing a new memory type in the controller; they would do it if it were good for them, or necessary, if you prefer to look at it that way.

About DX10.1: what exactly is "early"? I mean, how early in the scheme of things? Even two months would be too much. There are even hints that MS didn't give Nvidia everything necessary to make their DX10 drivers run well, because they were pissed off about what happened with the Xbox GPU.
 
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
You didn't understand me. G80/G92 can't use GDDR4, so they can't use GDDR4 on those cards. But Nvidia CAN!!! There's nothing special about implementing a new memory type in the controller; they would do it if it were good for them, or necessary, if you prefer to look at it that way.

So then why no GDDR5 on the new cards?
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
So then why no GDDR5 on the new cards?

I have said it already: availability and price. The price it would have if both companies had to fight over the few GDDR5 chips that are available.

If you are not convinced already, think about this: why is Ati's HD4850 going to have GDDR3 memory? Why not even GDDR4? Answers above.
 
Last edited:
Joined
Sep 2, 2005
Messages
294 (0.04/day)
Location
Szekszárd, Hungary
Processor AMD Phenom II X4 955BE
Motherboard Asus M4A785TD-V Evo
Cooling Xigmatek HDT S1283
Memory 4GB Kingston Hyperx DDR3
Video Card(s) GigaByte Radeon HD3870 512MB GDDR4
Storage WD Caviar Black 640GB, Hitachi Deskstar T7K250 250GB
Display(s) Samsung SyncMaster F2380M
Audio Device(s) Creative Audigy ES 5.1
Power Supply Corsair VX550
Software Microsoft Windows 7 Professional x64
I have said it already: availability and price. The price it would have if both companies had to fight over the few GDDR5 chips that are available.

If you are not convinced already, think about this: why is Ati's HD4850 going to have GDDR3 memory? Why not even GDDR4? Answers above.

It is unlikely that the memory manufacturers would prefer the smaller company over the market-leading company. There are two logical answers for that: Nvidia doesn't want GDDR5 because their product doesn't support it (perhaps it is a bit harder to redesign the G80 memory controller than you think), or it is cheaper for Nvidia to produce a 512-bit card than to use much faster memory. I don't know. :)

Rumour says there will be a GDDR5 version of the HD4850. 0.8 ns GDDR4 doesn't make much sense in light of 0.8 ns GDDR3, apart from the lower power usage; GDDR3 has better latencies at the same clock.
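For reference, a minimal sketch of how a DRAM "ns" speed grade maps to clock and effective data rate; it assumes the grade refers to the I/O clock period and that both GDDR3 and GDDR4 transfer twice per clock, which is why the same grade implies the same peak bandwidth:

```python
# Sketch: converting a memory "ns" speed grade to clock and effective data rate.
# Assumes the grade is the I/O clock period and a double-data-rate interface.

def ns_grade_to_rates(ns):
    clock_mhz = 1000 / ns            # 0.8 ns cycle time -> 1250 MHz clock
    data_rate_mtps = 2 * clock_mhz   # DDR: two transfers per clock
    return clock_mhz, data_rate_mtps

print(ns_grade_to_rates(0.8))  # (1250.0, 2500.0) -> same peak rate for 0.8 ns GDDR3 or GDDR4
```

So at the same speed grade the bandwidth story is a wash, and the comparison comes down to latency and power, as noted above.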
 
Last edited:

Morgoth

Fueled by Sapphire
Joined
Aug 4, 2007
Messages
4,226 (0.69/day)
Location
Netherlands
System Name Wopr "War Operation Plan Response"
Processor 5900x ryzen 9 12 cores 24 threads
Motherboard aorus x570 pro
Cooling air (GPU Liquid graphene) rad outside case mounted 120mm 68mm thick
Memory kingston 32gb ddr4 3200mhz ecc 2x16gb
Video Card(s) sapphire RX 6950 xt Nitro+ 16gb
Storage 300gb hdd OS backup. Crucial 500gb ssd OS. 6tb raid 1 hdd. 1.8tb pci-e nytro warp drive LSI
Display(s) AOC display 1080p
Case SilverStone SST-CS380 V2
Audio Device(s) Onboard
Power Supply Corsair 850MX watt
Mouse corsair gaming mouse
Keyboard Microsoft brand
Software Windows 10 pro 64bit, Luxion Keyshot 7, fusion 360, steam
Benchmark Scores timespy 19 104
sounds like hd2900xt...
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
It is unlikely that the memory manufacturers would prefer the smaller company over the market-leading company. There is one logical answer for that: Nvidia doesn't want GDDR5 because their product doesn't support it; perhaps it is a bit harder to redesign the G80 memory controller than you think.

Rumour says there will be a GDDR5 version of the HD4850. 0.8 ns GDDR4 doesn't make much sense in light of 0.8 ns GDDR3, apart from the lower power usage; GDDR3 has better latencies at the same clock.

Memory manufacturers prefer money. That's all they want. They don't care who is buying their products as long as they pay and as long as they can sell ALL their STOCK. Since stock is low and production isn't high right now, either company could buy the whole amount, so they would sell it to the one that paid more. Could Nvidia pay more than AMD? Maybe (well, sure), but why would they want to? It would make their cards more expensive, and what's worse for them is the REALLY LOW AVAILABILITY. Let's face it, Nvidia has 66% of the market share. That's twice what Ati has. If availability is low for Ati, it would be even lower for Nvidia. Contrary to what people think, I don't believe Nvidia cares that much about Ati; they care a lot more about their own market audience. GDDR5 would make their product a lot more expensive and scarce. They don't want that. Plain and simple.

And the HD4850 WON'T have a GDDR5 version from AMD. They gave partners the choice to use it, so partners can decide whether they want to pay the price premium or not. The GDDR5 price is so high that AMD has decided it's not cost-effective for the HD4850. Now, knowing that it's basically an underclocked HD4870, think about GDDR5 and tell me in all honesty that it's not just a marketing strategy.
 