
Jen-Hsun Huang (NVIDIA): "We Underestimated RV770"

X1REME

New Member
Joined
Jan 3, 2008
Messages
84 (0.01/day)
To everybody saying they're glad Nvidia has learned, or that ATI has woken up Nvidia the beast:

All Nvidia has woken up to is a PowerPoint slide that isn't even complete yet; they won't have an answer for another 8/9 months. Look, you can't just make new GPUs from the 8-series architecture in a few months, which is exactly what they've been doing, plus name changes, for the past two years (nobody can, OK?).

The funny thing is that when Nvidia does come back, after 8 to 9 months minimum, they will get smacked right back down again by the R800 "little dragon". Nvidia is going to be fourth for at least 2+ years. AMD has finally learned there is no rest for the wicked, as you may find yourself bankrupt if you don't hold the crown, e.g. CPUs.

Nvidia fans make me laugh with the things they come out with, even when they're on the losing side.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.81/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
And 8-9 months between lead changes is normal. It's what has happened in the ATI/NV battle for years. You can't say that R800 will beat NV's next offerings at all. You have no idea what either NV's next design, or even R800, has to offer. For all we know, R800 could be a flop, as could the next NV design.

To sit here and claim that ATI will retain the lead is just silly. There is absolutely no way to predict that.
 
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
IMO NV underestimated ATI, and that was the starting point for their current situation. They will come back with an answer because they have the funds and the tech resources, and that is good for us: if there is only one company in any kind of market, the customer gets gouged over and over again because of the lack of competition.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,793 (3.88/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
What's this "ATi will retain the lead" malarkey? Do they have the lead? I thought NVidia had the fastest SINGLE GPU... am I missing something here? OK, if you bolt two GPUs together then that's a different story, but for all of ATi's excellent marketing strategy this time around, coupled with their "futuristic & innovative" architecture, the fact is it's still a slower GPU... or have I missed the release of the HD4880? :D
 

Wile E


It doesn't matter if it's a slower gpu. We don't buy gpus, we buy gfx cards. ATI has the fastest video card on the planet.
 

Tatty_Two


Very true, but as I said, bang two together and you have double (well, ish) the performance. I can only think that NVidia haven't done it because they have an even better single-GPU solution up their sleeve to bash the R700... I can't believe they will not at least try to :nutkick: ATi very soon, it's just not like them. Reports of the R700 were being leaked months ago, so it's not as if they had no warning.
 

mixa

New Member
Joined
Jun 29, 2005
Messages
117 (0.02/day)
Location
Aden Castle Town
Processor AMD Athlon64 2800+ @ ~2.3GHz
Motherboard Asus K8N4-E Deluxe
Cooling Zalman CNPS7000B Al-Cu
Memory 2 x 512MB Vitesta (TCCDs) @ 255MHz , 2-3-3-9
Video Card(s) Sapphiretech X700 Pro @ XT
Storage 2 x 80Gb Hitachi DeskStar 7k250 , RAID 0
Display(s) LG L1710B
Case CoolerMaster ATC-210C-AX2
Audio Device(s) Sound Blaster Audigy 1
Power Supply FSP Group Fortron 400W w/silencer
Software Windows XP SP2 / fbsd 5.xx
It's always been like that, and in the end it's the end user who benefits the most.
Only the fanboys lose by staying on the same coast. The situation now is the same as it was with the 5900 Ultra and the 9800 Pro/XT. NVIDIA will take a year or so to recover as usual, then I guess they will release something better than ATi (read: AMD), because ATi tends to launch a real monster once in a while that leaves NV down in the bushes, but then ATi starts to lag behind, coasting on the old architecture. And then boom, NV comes back with something better, because they were working their asses off to catch ATi's beast.

It's a great show, go watch it :D
 
Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
It doesn't matter if it's a slower gpu. We don't buy gpus, we buy gfx cards. ATI has the fastest video card on the planet.

Well sure, if you eliminate the 'price/performance' badge that so many people seem to wear, especially when discussing ATi.

But the X2 is as much of a 'flop' in that category as the 280 is/was, and this is where the variable of TWO GPUs DOES matter.

Two GPUs,
GDDR5,
How many shaders again? I can't count that high!
etc.

It offers no real-world advantage to the average consumer, or even some not-so-average consumers. It's a piece of hardware that only 'shines' (and not by that much...) in very acute situations that most people won't encounter.

It also draws 100 more watts, runs natively hotter (and puts out twice the heat at that), and costs $100 more (which should be a moot point, but SOMEHOW price always gets involved, whether it's a TOP-end product or not).

So...

Let's reverse the comparison.

280, single-GPU solution:
Less power, heat and price.
Neck and neck with the X2 in average comparisons, at times slightly better or slightly worse. It falls short of the X2 by 10-25% (is that fair, on average?) in acute or synthetic situations.


We could keep going, saying the 4870 is close to the 280 at times, and costs less, etc. etc.

The key difference is that a 280 has more real-world purpose than an X2. And from a tech standpoint, the performance of the X2, considering its horsepower, is far from impressive. Tack on the cost, heat, power etc., and it's even less impressive, and therefore just as much of a 'dog' as the 280.


In some ways, I think both sides failed.

Nvidia should have released the 280 at 55nm with better shaders.
ATI shouldn't have bothered with the X2, chasing some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency etc.


In the end, if a 280 isn't enough for you, then an X2 won't be either. The only real-world application that demands either of these cards is Crysis, more or less, and it's sad how everyone is using THAT as a benchmark when five minutes earlier they were complaining about how 'poorly' Crysis is coded. Yet even in Crysis the X2 will not give you that elusive 60 fps, or even a constant 40-45, unless you turn things down or off, but then that defeats the purpose. And if you run a tuned custom config for Crysis, you can get your 45+ FPS with all the eye candy with EITHER card.

Back to square one we go.



This graph pretty much sums up my understanding and perception of GPUs these days: many of them run the majority of 3D applications without fault.

The top two games are popular, modern and have typical requirements in terms of the power needed to run them. They are average. All cards perform exceptionally well, easily achieving (or nearing) the elusive '60 fps' requirement. The bottom two games are examples of programs that can heavily tax the same GPUs used in the previous games, but are also popular and modern, just not average, hence 'acute.'

Crysis seems self-explanatory. Good choice using Arma, I was hoping someone would. Older engine, but the rules of GPU physics (not physics as in PhysX) still apply: lots of objects, shaders, long viewing distances and high resolutions can result in very crippled frame rates. It's interesting how well the 4870 does, but more importantly how well the X2 doesn't.
 

Attachments

  • graph.gif (44.3 KB · Views: 462)
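The 'average vs. acute' framing above can be sketched as a quick script. All FPS numbers here are invented for illustration; they are not the data in the attached graph, and the card list is only an assumption about what it compares.

```python
# Hypothetical sketch of the comparison the graph makes: mean FPS per card in
# "average" vs "acute" games. Every number below is made up for illustration.
fps = {
    "GTX 280":    {"average": [95, 88],   "acute": [31, 42]},
    "HD 4870":    {"average": [90, 85],   "acute": [28, 47]},
    "HD 4870 X2": {"average": [110, 102], "acute": [36, 40]},
}

def mean(xs):
    return sum(xs) / len(xs)

for card, games in fps.items():
    avg = mean(games["average"])
    acute = mean(games["acute"])
    # The "elusive 60 fps" test only separates the cards in the acute titles.
    print(f"{card}: average-game mean {avg:.0f} fps "
          f"({'clears' if avg >= 60 else 'misses'} 60), "
          f"acute-game mean {acute:.0f} fps "
          f"({'clears' if acute >= 60 else 'misses'} 60)")
```

With these invented numbers, every card clears 60 fps in the average games and every card misses it in the acute ones, which is the post's point.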
Joined
Jun 16, 2008
Messages
3,175 (0.55/day)
Location
Brockport, NY
System Name Is rly gud
Processor Intel Core i5 11600kf
Motherboard Asus Prime Z590-V ATX
Memory (48GB total) 16GB (2x8GB) Crucial Ballistix Sport 3000MHZ and G. Skill Ripjaws 32GB 3200MHZ (2x16GB)
Video Card(s) GIGABYTE RTX 3060 12GB
Storage 1TB MSI Spatium M370 NVMe M.2 SSD
Display(s) 32" Viewsonic 4k, 34" Samsung 3440x1440, XP Pen Creative Pro 13.3
Power Supply EVGA 600 80+ Gold
VR HMD Meta Quest Pro, Tundra Trackers
Software Windows 10
Nvidia should have released the 280 at 55nm with better shaders.
ATI shouldn't have bothered with the X2, chasing some pointless 'crown,' and instead tried to keep the performance of their 4870/4850 without giving the finger to heat/power/efficiency etc.

Not necessarily. The X2 attracts attention. If an (uneducated) consumer is told that this card is the best in the world, they'll think, "Oh, I can't afford that, but they made this, and it has to be good too!"

I think flagship cards are for grabbing our attention. They serve a purpose, or else companies wouldn't compete for the strongest product.
 
Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x


Well yes, of course they get attention, but I'm not trying to discuss the fickleness of the average consumer's mentality or ignorance; I'm trying to discuss their needs. If they don't understand their needs, then that's again about ignorance and perception, not fact.


The unfortunate thing about flagship cards is that they attract people in two ways. There's the:

WTFBBQSAUCE pwnzerz - bragging rights and I want the best!
and then the
SynthetiX4Life benchers

And this IS unfortunate, because the first type should be pointless and irrelevant. The second type, benchers, pit themselves against technological odds in order to achieve some 'goal.' They are using GPUs (primarily made for games) in order to benchmark.

If benchmarking were done with programs that use lots of vertices (things like CAD, cinematics, design tools etc.), then they would have to use Quadro-type GPUs, which I would much prefer, as that has less to do with gaming and more to do with raw horsepower (of a different type), accuracy, and things of an acute and statistical nature.
 
Joined
Feb 18, 2006
Messages
5,147 (0.78/day)
Location
AZ
System Name Thought I'd be done with this by now
Processor i7 11700k 8/16
Motherboard MSI Z590 Pro Wifi
Cooling Be Quiet Dark Rock Pro 4, 9x aigo AR12
Memory 32GB GSkill TridentZ Neo DDR4-4000 CL18-22-22-42
Video Card(s) MSI Ventus 2x Geforce RTX 3070
Storage 1TB MX300 M.2 OS + Games, + cloud mostly
Display(s) Samsung 40" 4k (TV)
Case Lian Li PC-011 Dynamic EVO Black
Audio Device(s) onboard HD -> Yamaha 5.1
Power Supply EVGA 850 GQ
Mouse Logitech wireless
Keyboard same
VR HMD nah
Software Windows 10
Benchmark Scores no one cares anymore lols

True. Sometimes I think ATI and Nvidia fawn over the flagship and forget where the money is (well, it's evident ATI did for a long time, as they got bought out while pumping out impressive flagship cards).

Right now, if I think about it, neither the GTX 280 nor the 4870 X2 is practical at all, and the GTX 260 and 4870 are even a stretch. The 9800 GTX+ and the 4850 seem to be much better buys, as they can play everything out there at nice detail settings and can be doubled up, and sometimes tripled, for less than the next card up. The flagships may become more useful in a year or so, when games can tap into their power, but right now I'm cruising on a 9600 GT and have yet to find a complaint.
 

Wile E

All the charts I have seen point to the X2 winning by a fair percentage, and winning more often than it loses to a GTX.

With 280s dipping as low as $420 on Newegg, it probably does take the price/perf crown now, but that wasn't the discussion here. The discussion turned into merely who has the fastest card, nothing more.

The fact remains the fastest card is the 4870X2.

Practical or not, I wish I could have two of them for my Xfire board. lol.

I also wouldn't mind having two 280s for my AMD rig (now that is truly overkill with its 1440x900 monitor. lol.)
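The price/perf point above is simple arithmetic. As a hedged back-of-the-envelope sketch: the $420 GTX 280 price comes from this post, but the X2 price and the relative-performance figures below are illustrative assumptions, not measured data.

```python
# Back-of-the-envelope price/performance comparison. Only the $420 GTX 280
# price is from the thread; the other numbers are illustrative assumptions.
cards = {
    "GTX 280":    {"price_usd": 420, "perf": 100},  # perf normalized to GTX 280 = 100
    "HD 4870 X2": {"price_usd": 549, "perf": 115},  # assumed ~15% faster on average
}

for name, c in cards.items():
    ratio = c["perf"] / c["price_usd"] * 100  # performance points per $100
    print(f"{name}: {ratio:.1f} perf per $100")
```

Under these assumed numbers the single-GPU card wins on perf per dollar even while losing outright, which is exactly why the two "crowns" keep getting argued past each other.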
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional

Really? How exactly did Nvidia ever forget the midrange?

TNT2
Geforce2 MX400
Geforce3 Ti 200
Geforce4 Ti 4200
Geforce FX5600
Geforce FX5700
Geforce FX5900XT
Geforce 6600GT
Geforce 6800GS
Geforce 7600GT
Geforce 7900GS
Geforce 8600GTS
Geforce 8800GS
Geforce 9600GT

It seems to me that since 1999 Nvidia has been covering the midrange. You could argue the FX cards lose to the Radeon 9600, but does anyone actually remember those days? It was the time of DX8, when DX9 wasn't really being used to its potential. The FX cards kept up, and the Radeon 9600 struggles just as much in Far Cry or HL2 as the FX midrange does. The 8600GTS, while not faster than the old high end, doesn't seem like a real issue: it offered 7950GT performance plus DX10 support, so where is the problem? Now let's look at ATI's midrange, and tell me who tends to have the best midrange:

Radeon 7500
Radeon 8500Le
Radeon 9500
Radeon 9600
Radeon 9800SE
Radeon x600
Radeon x700
Radeon x800GT
Radeon x800GTO
Radeon x1600
Radeon x1650
Radeon x1800GTO
Radeon HD2600
Radeon HD36x0
Radeon HD3850

So in the sub-$200 market, who had the best cards at launch? Let me remind you of a few things again. The x600 went up against the 6600GT at first, which it couldn't compete with, and later the x700 Pro couldn't keep up either. They made the x800GT and GTO to compete with the 6800GS, but the 6800GS was once again faster. The x1600 was a joke; the x1650 was also a joke, save for the x1650XT, but by the time it came out the 7900GS was the same price, and the x1800GTO lost to the 7600GT most of the time. The HD2600 cards couldn't keep up with the 8600s, and the HD36x0 didn't help. The HD3850 was a good midrange card until the 8800GS showed up, followed by the 9600GT.

In truth, the good ATI midrange cards look like this:

Radeon 9500
Radeon 9600
Radeon HD3850

Nvidia had the faster midrange at launch every other time.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,853 (3.08/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
Good choice using Arma, I was hoping someone would. [...] It's interesting how well the 4870 does, but more importantly how well the X2 doesn't.

Any chance you know the program used to get the FPS for Arma? I didn't know the community had actually made one yet. Well, in fact there is one, but the way Arma works makes benchmark programs pointless.

You could load a part of the game five times and find that each time different textures had not loaded, therefore giving off false FPS.

I was trying to get W1z to benchmark Arma, only to find it was pretty much pointless. However, he said he might do it for Arma 2 if things improve.

Here's a message I got from someone who makes a benchmark program for Arma and explains what the issues are.
Hi! Still a little bit surprised here, but I'll try to answer your questions thoroughly.

The biggest problem with the ArmA Mark was & still is the fact that (no matter what you do) you'll always get varying results - that's due to ArmA's memory management &/or LOD handling.
Another stumbling block is the thousands of different performance settings people use - very few can be arsed to set up their ArmA the way someone else told them to - maybe not so important, though, for an isolated benchmark.

As for updates: sadly there aren't any - but despite some wrong text (it says OFP instead of ArmA) in the mission header or briefing, there should be no major flaws or show stoppers.

Just be advised that sometimes ArmA behaves weirdly -
Back in OFP it was strongly advised to let the benchmark run through first, then restart it and let it run again to get a more comparable score (precaching objects helped a lot) - but in ArmA that no longer helps. So sometimes the result will be influenced in a negative or positive way because some textures didn't show up right from the start, or in another case some AI would decide not to cross a bridge at first (bringing the results down due to the longer time needed for a part of the test) and so on. But since all of those are ArmA 'features', nothing can be done about it from our side.

Coming to an end, I'm still surprised, but I absolutely love your initiative!
It would be great to see ArmA being used for serious hardware tests - because we all know it's the most demanding game ever. So yeah, please go ahead!


Cheers
burns
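One minimal, hypothetical way to tame the run-to-run variance burns describes: record several benchmark passes, discard the warm-up pass(es), and report the median of the rest. The run scores below are made-up numbers, not real ArmA results.

```python
# Sketch of aggregating noisy benchmark runs: drop warm-up passes (textures
# still streaming in), then take the median so a single outlier pass (e.g.
# AI refusing to cross a bridge) doesn't skew the score.
from statistics import median

def aggregate_runs(scores, discard_first=1):
    """Drop warm-up passes, then return the median of the remaining scores."""
    kept = scores[discard_first:]
    if not kept:
        raise ValueError("need at least one run after discarding warm-ups")
    return median(kept)

# Five hypothetical passes: the first is slow (warm-up), one later pass dips.
runs = [24.1, 31.5, 30.8, 26.9, 31.2]
print(aggregate_runs(runs))  # → 31.0
```

The median, unlike the mean, shrugs off the 26.9 outlier pass; that is the whole design choice here.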
 

Tatty_Two

The fact remains the fastest card is the 4870X2.


Actually, you said the fastest card... I said the fastest GPU :p
 


ktr

Joined
Apr 7, 2006
Messages
7,404 (1.13/day)
Meh, it's just a cycle. Nvidia has a few years of the best GPUs, then ATI has a few years of the best GPUs, and so forth. The same goes for AMD and Intel.
 
