
Next-gen NVIDIA GeForce Specs Unveiled, Part 2

PVTCaboose1337

Graphical Hacker
Joined
Feb 1, 2006
Messages
9,501 (1.42/day)
Location
Texas
System Name Whim
Processor Intel Core i5 2500k @ 4.4ghz
Motherboard Asus P8Z77-V LX
Cooling Cooler Master Hyper 212+
Memory 2 x 4GB G.Skill Ripjaws @ 1600mhz
Video Card(s) Gigabyte GTX 670 2gb
Storage Samsung 840 Pro 256gb, WD 2TB Black
Display(s) Shimian QH270 (1440p), Asus VE228 (1080p)
Case Cooler Master 430 Elite
Audio Device(s) Onboard > PA2V2 Amp > Senn 595's
Power Supply Corsair 750w
Software Windows 8.1 (Tweaked)
That is too much for me to pay! Why would you spend that much money?
 
Joined
May 9, 2006
Messages
2,116 (0.32/day)
System Name Not named
Processor Intel 8700k @ 5Ghz
Motherboard Asus ROG STRIX Z370-E Gaming
Cooling DeepCool Assassin II
Memory 16GB DDR4 Corsair LPX 3000mhz CL15
Video Card(s) Zotac 1080 Ti AMP EXTREME
Storage Samsung 960 PRO 512GB
Display(s) 24" Dell IPS 1920x1200
Case Fractal Design R5
Power Supply Corsair AX760 Watt Fully Modular
I'm almost afraid to say it, but it is strange that the ATI cards do so much better in benchmarks (once again, I may regret saying this). It does seem like a lot of games are specifically designed with NVIDIA cards in mind.
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
The problem with this opinion is that back then the performance increase WAS NEEDED. As it stands now, the added performance is NOT NEEDED at such a high premium. A lot of high-end card owners aren't overclocking like they did in days past. That's a clear indicator that we have plateaued above average frame rates in most games. Sure, e-peen dictates that higher FPS is better than what you get now. However, someone with a card that can already play games at above-average frame rates without even overclocking (in most cases) would be hard pressed to justify a card that costs $600+.


Exactly; no matter the GPU brand, over the last year the midrange cards ($150-250) have proved to be enough to play all titles at a decent FPS. When VIA, S3, or Intel has a GPU capable of achieving a decent FPS, people will buy it because of the price/performance ratio, so the best-buy card will be the one with the best $/FPS ratio, a tendency that already rules the market.

I don't really understand why people must be fanboys of NVIDIA or ATI. The point is to buy a card that suits you best and use it for a few years without upgrading. I don't like it when mud is thrown from both sides just to prove that "I have a card from the first and best GPU manufacturer in the world" (be it NV or ATI). Who cares about that?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.21/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I don't know how accurate this information is, especially since we haven't seen any other reports of it from any reputable sources. I think I'll wait to believe the specs until the cards are actually out, but the shader speeds on these cards seem a little low to me. I know there are more of them, but dropping the speeds that much seems insane to me.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I'm almost afraid to say it, but it is strange that the ATI cards do so much better in benchmarks (once again, I may regret saying this). It does seem like a lot of games are specifically designed with NVIDIA cards in mind.

Of course, because it's much easier to get ALMOST ALL developers to make the games run faster on your cards than to make 2-3 benchmarks run faster. Funny how corrupt all game developers are, but benchmark developers are so incorruptible...
 
Joined
Jun 12, 2007
Messages
4,815 (0.78/day)
Location
Wangas, New Zealand
System Name Darth Obsidious
Processor Intel i5 2500K
Motherboard ASUS P8Z68-V/Gen3
Cooling Cooler Master Hyper 212+ in Push Pull
Memory 2X4GB Corsair Vengeance DDR3 1600
Video Card(s) ASUS R9 270x TOP
Storage 128GB Samsung 830 SSD, 1TB WD Black, 2TB WD Green
Display(s) LG IPS234V-PN
Case Corsair Obsidian 650D
Audio Device(s) Infrasonic Quartet
Power Supply Corsair HX650w
Software Windows 7 64bit and Windows XP Home
Benchmark Scores 2cm mark on bench with a razor blade.
No doubt nVidia will end up releasing a GT 280 that ends up being the next-gen 8800GT.

So most probably it's just a wait for the price-vs-performance-minded people after the GTX 280 is released.
 
Joined
May 6, 2005
Messages
2,792 (0.40/day)
Location
Tre, Suomi Finland
System Name Ladpot ◦◦◦ Desktop
Processor R7 5800H ◦◦◦ i7 4770K, watercooled
Motherboard HP 88D2 ◦◦◦ Asus Z87-C2 Maximus VI Formula
Cooling Mixed gases ◦◦◦ Fuzion V1, MCW60/R2, DDC1/DDCT-01s top, PA120.3, EK200, D12SL-12, liq.metal TIM
Memory 2× 8GB DDR4-3200 ◦◦◦ 2× 8GB Crucial Ballistix Tactical LP DDR3-1600
Video Card(s) RTX 3070 ◦◦◦ heaps of dead GPUs in the garage
Storage Samsung 980 PRO 2TB ◦◦◦ Samsung 840Pro 256@178GB + 4× WD Red 2TB in RAID10 + LaCie Blade Runner 4TB
Display(s) HP ZR30w 30" 2560×1600 (WQXGA) H2-IPS
Case Lian Li PC-A16B
Audio Device(s) Onboard
Power Supply Corsair AX860i
Mouse Logitech MX Master 2S / Contour RollerMouse Red+
Keyboard Logitech Elite Keyboard from 2006 / Contour Balance Keyboard / Logitech diNovo Edge
Software W11 x64 ◦◦◦ W10 x64
Benchmark Scores It does boot up? I think.
I'm almost afraid to say it, but it is strange that the ATI cards do so much better in benchmarks (once again, I may regret saying this). It does seem like a lot of games are specifically designed with NVIDIA cards in mind.
Or that benchmarks are optimized for ATi.
*shrug*

The reason is in the architecture. For a game to run well on ATi R6-generation GPUs, some heavy optimizations are required, as the architecture of the R6 GPUs is so different from earlier ones. And then, of course, there are the obvious flaws in R600/RV670, like the absolutely horrible texture filtering capabilities and the innate inefficiency of the superscalar shaders. RV770 will partially fix the texture filtering shortcomings, but unfortunately the TMUs are only doubled - thus RV770's texturing muscle will still clearly trail that of even G92.
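As a rough illustration of that "innate inefficiency of the superscalar shaders" point (the packing rates below are made-up assumptions for illustration, not measured shader data), here is how the occupancy of a 5-wide VLIW shader unit like R600's depends on how many independent operations the compiler can schedule per instruction:

Code:
# Illustrative sketch only - hypothetical packing rates, not real shader traces.
VLIW_WIDTH = 5  # R600/RV670 can issue up to 5 scalar ops per shader instruction

def slot_utilization(independent_ops_per_issue):
    """Fraction of the 5 ALU slots doing useful work per issued instruction."""
    return min(independent_ops_per_issue, VLIW_WIDTH) / VLIW_WIDTH

# Uniform, benchmark-style shader code: the compiler can pack the slots well.
print(f"benchmark-like code: {slot_utilization(4.5):.0%} of slots busy")
# Varied, branchy game shaders: fewer independent ops available to pack.
print(f"game-like code:      {slot_utilization(2.5):.0%} of slots busy")

The point being that the same hardware looks far more efficient on steady, predictable workloads than on the mixed code a real game throws at it.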
 
Joined
May 12, 2006
Messages
11,119 (1.69/day)
System Name Apple Bite
Processor Intel I5
Motherboard Apple
Memory 40gb of DDR 4 2700
Video Card(s) ATI Radeon 500
Storage Fusion Drive 1 TB
Display(s) 27 Inch IMac late 2017
Six hundred dollars!!! :eek: awww..

Specs look good, but man.. NVidia is hunting for fat wallets again :twitch:

They are on crack at that price, get real.
 
Joined
Sep 11, 2007
Messages
305 (0.05/day)
Location
Ambugaton
Processor Intel i5 12600KF
Motherboard MSI PRO Z690-P DDR4, Socket 1700
Cooling MSI Ventus AIO
Memory Corsair Vengeance LPX Black 32GB, DDR4, 3200MHz
Video Card(s) MSI VentusRTX 3060 12Gb
Storage XPS 1TB | 2x Kingston 2TB Sata | Sinology 4TB (Raid1) |
Display(s) 24" Dell U2417H
Case Msi Mpg Odin
Audio Device(s) Realtek ALC887 + Microlab Solo 6C
Power Supply Seasonic PRIME TX-750, 80
Mouse Razer
Keyboard Razer
Software Windows 10 x64
Of course, because it's much easier to get ALMOST ALL developers to make the games run faster on your cards than to make 2-3 benchmarks run faster. Funny how corrupt all game developers are, but benchmark developers are so incorruptible...

This is not about corrupt people; it's about the fact that NVIDIA and ATi cards are very different.
When you make a new game, for instance, you have to optimize the code so that it takes advantage of the hardware the GPU has. This process takes time; you don't generally have time to cover all aspects and explore all the resources of two cards that work in different ways. So designers have to choose. NVIDIA gives them all the help they need in understanding and using the nooks and crannies of its cards. Then games run better on NVIDIA cards.

I'm trying to stay neutral here; the only reason I like ATi cards right now is because the competition is overcharging.
 

DrPepper

The Doctor is in the house
Joined
Jan 16, 2008
Messages
7,482 (1.25/day)
Location
Scotland (It rains alot)
System Name Rusky
Processor Intel Core i7 D0 3.8Ghz
Motherboard Asus P6T
Cooling Thermaltake Dark Knight
Memory 12GB Patriot Viper's 1866mhz 9-9-9-24
Video Card(s) GTX470 1280MB
Storage OCZ Summit 60GB + Samsung 1TB + Samsung 2TB
Display(s) Sharp Aquos L32X20E 1920 x 1080
Case Silverstone Raven RV01
Power Supply Corsair 650 Watt
Software Windows 7 x64
Benchmark Scores 3DMark06 - 18064 http://img.techpowerup.org/090720/Capture002.jpg
When the Ultra came out and the price was a lifetime of slavery, some people would still try to buy 3 of them.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
This is not about corrupt people; it's about the fact that NVIDIA and ATi cards are very different.
When you make a new game, for instance, you have to optimize the code so that it takes advantage of the hardware the GPU has. This process takes time; you don't generally have time to cover all aspects and explore all the resources of two cards that work in different ways. So designers have to choose. NVIDIA gives them all the help they need in understanding and using the nooks and crannies of its cards. Then games run better on NVIDIA cards.

I'm trying to stay neutral here; the only reason I like ATi cards right now is because the competition is overcharging.

Have you ever read any game developer's blog? They write specific code not only for each brand, but for almost every card architecture. They may then optimize the Nvidia-specific code better under TWIMTBP, because Nvidia gives them extensive support. In that respect the code for Nvidia hardware may be better optimized, but each card has its own code.

Long before TWIMTBP, many first-tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, so it's the same with 3DMark 06 too. Saying that the Ati architecture is any better based on these kinds of benchmarks implies that those benchmarks are doing things right, which they aren't.

This is something that has always bugged me. People say that game developers are making code "Nvidia's way", but they never stop and think that MAYBE Nvidia is designing its hardware the "game developers' way", while Ati may not be. TWIMTBP is a two-way relationship. Over time Ati has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, Ati's superscalar, VLIW and SIMD shader processors are A LOT better suited to the HOMOGENEOUS CODE involved in a static benchmark than to the ever-changing code involved in a game. Also, in the case of R600 and one of its biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about the textures that have already been used. In a game you don't know where the camera will head next, so you don't know which textures you can discard. In reality neither architecture can "guess" the next texture, but G80/G92, with its greater texture power, can react better to texture changes. In benchmarks R600/670 can mitigate this effect with streaming.
 
Joined
Sep 15, 2007
Messages
3,946 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Well, with 1 billion transistors does it even matter how many ROPs, shaders, etc. it has? LOL *sarcasm*

It has way more switching power, which means ownage (as long as they don't screw it up like the FX series). Clocks are irrelevant (overall).
 

BigBruser13

New Member
Joined
Apr 30, 2008
Messages
36 (0.01/day)
Location
Portland OR
System Name V64PC
Processor Q9550
Motherboard Rampage Extreme
Cooling water
Memory 2x2gb XP3 mushkin
Video Card(s) Ati 4870 x2
Storage 2x 150gb raptor
Display(s) acer 1920x1200
Case cooler master 830
Audio Device(s) n/a
Power Supply x3 ultra
Software Vista x64 ultimate
Benchmark Scores 3dmark 06 19899
latest upcoming nvidia offerings

All I want to know is: can it play Crysis on very high at 1920x1200?
 

HTC

Joined
Apr 1, 2008
Messages
4,610 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
wow, $600 is a bit steep, wonder how that'll translate into pounds over here. I reckon £400 :(

Here's an interesting comparison chart:


If the 4870 is nearly as fast as the GTX280, it'll be a much better buy I think.

Even if the nVidia options turn out to be better (by say ... 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an nVidia, but NOT for that price!
 
Joined
Sep 15, 2007
Messages
3,946 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Even if the nVidia options turn out to be better (by say ... 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an nVidia, but NOT for that price!

Well, if it's real, then I can say that ATI is missing transistors b/c they're the ones always sucking down the juice :p
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Even if the nVidia options turn out to be better (by say ... 10% to 15%), if this power consumption is real, I would buy an ATI every time.

Only if the difference turns out to be more substantial would I consider buying an nVidia, but NOT for that price!

The thing is that, from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. This means a GTX 280 40-50% faster than the HD4870, and the GTX 260 could end up being 25% faster.
It's early to say anything, but most probably each card will have its market segment and ALL of them will have similar price/performance, as has almost always been the case. Also, the GTX cards won't lag far behind in performance-per-watt, always based on these specs, which I don't know how trustworthy they are. But then again, we are always talking about these specs, so, simple math:

157 W + 50% = 157 + 79 = 236 W. Isn't it funny?
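A quick sanity check of that arithmetic (purely illustrative: the wattages and the 40-50% performance gap are the rumored figures being discussed in this thread, not confirmed specs):

Code:
# Hypothetical figures taken from the rumors quoted above - not confirmed specs.
hd4870_tdp = 157.0   # rumored HD 4870 board power, in watts
gtx280_tdp = 236.0   # 157 W + 50%, as in the post above
perf_gap   = 1.45    # assume the GTX 280 is ~45% faster (midpoint of 40-50%)

perf_per_watt_ratio = perf_gap / (gtx280_tdp / hd4870_tdp)
print(f"GTX 280 perf-per-watt relative to HD 4870: {perf_per_watt_ratio:.2f}x")
# ~0.96x: with these guesses the two cards end up within a few percent of each
# other in performance per watt, which is the argument being made here.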
 

HTC

Joined
Apr 1, 2008
Messages
4,610 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
The thing is that, from the looks of the specs, RV770 will have 2x the power of RV670 and GT200 will have 2x that of G92. This means a GTX 280 40-50% faster than the HD4870, and the GTX 260 could end up being 25% faster.
It's early to say anything, but most probably each card will have its market segment and ALL of them will have similar price/performance, as has almost always been the case. Also, the GTX cards won't lag far behind in performance-per-watt, always based on these specs, which I don't know how trustworthy they are. But then again, we are always talking about these specs, so, simple math:

160 W + 50% = 160 + 80 = 240 W

If you're correct, I would buy an nVidia card, but only when the price dropped to a more realistic value.

nVidia's GTX280 could be 10 times faster than ATI's 4870x2, but I wouldn't buy it for this amount of money: NO WAY!!!!
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
If you're correct, I would buy an nVidia card, but only when the price dropped to a more realistic value.

nVidia's GTX280 could be 10 times faster than ATI's 4870x2, but I wouldn't buy it for this amount of money: NO WAY!!!!

I will never buy any card above $300 either, but the VALUE of the cards is undeniable.

Also, I've been looking around, and other sources say $400 and $500 for the GTX 260 and 280 respectively, and Nvidia may still have an ace up its sleeve called the GTX 260 448 MB, which would pwn as the GTS 320 did in the past, so who knows...
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.21/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
If you're correct, I would buy an nVidia card, but only when the price dropped to a more realistic value.

nVidia's GTX280 could be 10 times faster than ATI's 4870x2, but I wouldn't buy it for this amount of money: NO WAY!!!!

That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.
 

HTC

Joined
Apr 1, 2008
Messages
4,610 (0.78/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.

Dunno how you figure that, but it doesn't matter!

To me, there are only 2 important aspects for any card: power usage and price (in that order).
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
That's the full gray zone. On paper the 280 & 260 will be 25-50% faster than the 4870. That alone warrants the higher prices to a point. Then you drop the 4870x2 on them & it creates a simple paradox. On paper the 4870x2 is twice as fast as the 4870. Plus at $500 the winner is clearer than glass.

That logic has its flaws:

1- The HD3870 X2 is not twice as fast as the HD3870; we can guess the HD4 X2 won't be either.

2- The HD3870 X2 price is more than twice that of the HD3870. Will this be different? That puts the HD4870 X2 well above $600. Probably it will be cheaper, but it won't launch until August, and we don't know how prices are going to be then...
 

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.14/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
Have you ever read any game developer's blog? They write specific code not only for each brand, but for almost every card architecture. They may then optimize the Nvidia-specific code better under TWIMTBP, because Nvidia gives them extensive support. In that respect the code for Nvidia hardware may be better optimized, but each card has its own code.

Long before TWIMTBP, many first-tier game developers said the way 3DMark did things WAS NOT how they were going to do things in the future. AFAIK this never changed, so it's the same with 3DMark 06 too. Saying that the Ati architecture is any better based on these kinds of benchmarks implies that those benchmarks are doing things right, which they aren't.

This is something that has always bugged me. People say that game developers are making code "Nvidia's way", but they never stop and think that MAYBE Nvidia is designing its hardware the "game developers' way", while Ati may not be. TWIMTBP is a two-way relationship. Over time Ati has become better (comparatively) in benchmarks and worse in games. Everybody blames TWIMTBP for this, without taking into account each company's design decisions. For a simple example, Ati's superscalar, VLIW and SIMD shader processors are A LOT better suited to the HOMOGENEOUS CODE involved in a static benchmark than to the ever-changing code involved in a game. Also, in the case of R600 and one of its biggest flaws, its TMUs, benchmarks are a lot more favorable than games, since you can "guess" which texture comes next and you don't have to care about the textures that have already been used. In a game you don't know where the camera will head next, so you don't know which textures you can discard. In reality neither architecture can "guess" the next texture, but G80/G92, with its greater texture power, can react better to texture changes. In benchmarks R600/670 can mitigate this effect with streaming.



I agree to an extent.

Although I think there is a major difference in GPU architecture that has been holding ATI back across the board, the few games where ATI worked closely with the developers show a fairly level playing field, or better ATI performance, upon release.

Sadly, the only two games I can think of where I know for certain that ATI worked closely with the developers are Call of Juarez - where we see ATI cards tending to outperform nVidia's - and FEAR - where we see ATI cards continuing to keep pace with nVidia's.

I'm sure a certain amount of collaboration does tend to help nVidia overall, but yes, GPU architecture does come into play as well; and ATI's GPUs just haven't been suited to the more complex games we've seen over the last 2 years or so.

But, all in all, the new ATI R700 series is a brand-new design, not a re-hash of an older GPU like the R600 was of the R500. It might be wishful thinking, but . . . I think we might be in for a surprise with the new ATI GPUs. Probably just wishful thinking on my part, though :ohwell:

Anyhow, someone correct me if I'm wrong, but I thought I remembered hearing that nVidia's new G200 is another re-hash of G92/G80? :confused:
 
Joined
Aug 12, 2006
Messages
3,278 (0.50/day)
Location
UK-small Village in a Valley Near Newcastle
Processor I9 9900KS @ 5.3Ghz
Motherboard Gagabyte z390 Aorus Ultra
Cooling Nexxxos Nova 1080 + 360 rad
Memory 32Gb Crucial Balliastix RGB 4.4GHz
Video Card(s) MSI Gaming X Trio RTX 3090 (Bios and Shunt Modded) 2.17GHz @ 38C
Storage NVME / SSD RAID arrays
Display(s) 38" LG 38GN950-B, 27" BENQ XL2730Z 144hz 1440p, Samsung 27" 3D 1440p
Case Thermaltake Core series
Power Supply 1.6Kw Silverstone
Mouse Roccat Kone EMP
Keyboard Corsair Viper Mechanical
Software Windows 10 Pro
lol, you know what, for that price you could buy 2 of ATI's single offerings and beat out the NV counterpart ^^

Or just go X2 and win that way, and the X2 will no doubt be far cheaper than NV's GX2 alternative, so again, price-wise and performance-wise it's win-win for ATI atm.

Think about it: NV may have one killer card, but if you can almost buy 2 of ATI's own killer cards (it doesn't matter if they are less powerful than NV's offering) for around or just over the price of one of NV's cards, do the math - ATI and the customer win every time.
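Spelling that "do the math" out with purely hypothetical numbers (the prices and frame rates below are placeholders for illustration, not figures from any review):

Code:
# All figures hypothetical - just to show how a $/FPS comparison would work.
cards = {
    "NV single-GPU flagship": {"price": 600, "avg_fps": 100},
    "ATI single-GPU card":    {"price": 300, "avg_fps":  75},
    "2x ATI in CrossFire":    {"price": 600, "avg_fps": 120},  # assumes ~80% scaling
}

for name, card in cards.items():
    dollars_per_fps = card["price"] / card["avg_fps"]
    print(f"{name:24s} ${card['price']} / {card['avg_fps']} fps = {dollars_per_fps:.2f} $/fps")

On those made-up numbers the single ATI card and the CrossFire pair both come out ahead on $/FPS, which is the argument being made; swap in real prices and benchmark averages once they exist and the same comparison applies.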
 

warhammer

New Member
Joined
Jan 10, 2008
Messages
204 (0.03/day)
Processor Q6600@3.6
Motherboard Evga 680i
Cooling H20
Memory 2GB DDR2
Video Card(s) 8800GTS 512 SLI
Storage 4x320gig
Display(s) 21CRT
Case ARMOR
Audio Device(s) SB
Power Supply 750W
Software VISTA ULTIMATE
Save your money for the DX11 cards:)
 