Tuesday, April 8th 2008

NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?

This information from Expreview, if true, may disappoint many GeForce 9800 GX2 owners. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards: the single-GPU GeForce 9900 GTX and the dual-GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost to the GeForce 9800 GX2, which will force the company to discontinue the "older" card. There will be no rebranding for the 9800 GX2 (unlike the GeForce 8800 GS, which will become the 9600 GSO), just a sudden death. Meanwhile, details of the new GT200 graphics cards are still unknown.
Source: Expreview.com

122 Comments on NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?

#76
newtekie1
Semi-Retired Folder
174.74 works just fine with the FX cards; I have it installed on my FX5200 right now.
Posted on Reply
#77
candle_86
I can't get it to install on my FX5600 Ultra FC card.
Posted on Reply
#79
imperialreign
DarkMatterSometimes I feel like the only one with some memory, although memory isn't really needed when you have Wikipedia at hand:

Nvidia 6 series:

6200, 6200 TC2, 6500, 6600 LE, 6600, 6600 GT, 6600 XL, 6800 LE, 6800 XT, 6800, 6800 GTO, 6800 GS, 6800 GT, 6800 Ultra.

Total: 14 cards.

Ati 10 series:

X300 SE, X300, X550 SE, X550, X600 Pro, AIW X600 Pro, X600 XT, X700, X700 Pro, X700 XT, X800 SE, X800 GT128, X800 GT 256, X800 GTO, X800, X800 GTO2, X800 GTO-16, X800 Pro, X800 Pro VIVO, X800 XL, AIW X800 XL, X800 XT, X800 XT VIVO, AIW X800 XT, X800 XT PE, X850 Pro, X850 XT, X850 XT PE.

Total: 28 cards.

Let's see the next generation.

Nvidia 7 series:

7100 GS, 7200 GS, 7300 SE, 7300 LE, 7300 GS, 7300 GT, 7600 GS, 7600 GT, 7600 GT Rev 2, 7800 GS, 7800 GT, 7800 GTX, 7800 GTX 512, 7900 GS, 7900 GT, 7900 GTO, 7900 GTX, 7950 GT, 7950 GX2.

Total: 19 cards.

Ati 11 series:

X1300, X1300 Pro, X1300 XT, X1550 SE, X1550, X1600 Pro, X1600 XT, X1650, X1650 Pro, X1650 GT, X1650 XT, X1800 GTO, X1800 GTO Rev. 2, X1800 XL, AIW X1800 XL, X1800 XT, X1900 GT, X1900 GT Rev. 2, AIW X1900, X1900 CrossFire, X1900 XT, X1900 XTX, X1950 GT, X1950 Pro, X1950 XT, X1950 XTX.

Total: 26 cards.

Nvidia 8 series:

8400 GS, 8500 GT, 8600 GT, 8600 GTS, 8800 GS, 8800 GTS G80, 8800 GT, 8800 GTS G92, 8800 GTX, 8800 Ultra.

Total: 10 cards.

But let's add the OEM and 9-series cards, since they're based on the same chip (though I could do the same in the lists above and add quite a few more).

8600 GS, 9500 GT, 9600 GT, 9800 GT (GTS? Is this one even going to be launched?), 9800 GTX, 9800 GX2

Total: 16 cards.
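If you'd rather sanity-check that last total than trust my counting, here's a quick tally: a throwaway Python sketch (my own illustration, with the two 8-series lists above typed in verbatim; the other generations work the same way).

```python
# Count the base 8-series lineup, then add the OEM and 9-series cards
# that are built on the same chip, as listed above.
g80_series = ["8400 GS", "8500 GT", "8600 GT", "8600 GTS", "8800 GS",
              "8800 GTS G80", "8800 GT", "8800 GTS G92", "8800 GTX", "8800 Ultra"]
same_chip_extras = ["8600 GS", "9500 GT", "9600 GT", "9800 GT", "9800 GTX", "9800 GX2"]

print(len(g80_series))                     # 10 -- the bare 8-series count
print(len(g80_series + same_chip_extras))  # 16 -- with the OEM/9-series cards added
```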

I could go with Ati's 9 series vs. Nvidia's 5 series too, but I think I have proven my point with this... (Huh? I didn't make any point? Guess what it is. lol)

That's what happens when you are in the lead with a strong architecture that can scale well.
And TBH I don't think that's bad; actually, I think it's good for the consumer, because you have many cards at different price points with small differences in performance. You can spend as much as you want and you'll get performance accordingly; you don't have to settle for a card slower than what you want, or spend big $ on a card that is more than what you need. Cough* HD series *cough.
I'm sorry if I wasn't all that clear in that earlier post about the series generations - yeah, you can bunch all of ATI's cards into the X100 and X1000 series; but what I was trying to get across is that for many, ATI's card naming schemes are a little easier to interpret. The higher the number, the better they think the card is, and there was typically a noticeable difference between two "sub-series" (e.g. a 1000 vs 1300, 1300 vs 1500, 1500 vs 1600, etc.). But you knew that they considered a 1950 to be better than a 1900, the 1900 better than the 1800, the 1650 better than the 1600, etc. I guess that was the biggest difference between nVidia's and ATI's naming schemes a few years back: ATI had "sub-series" lineups. It's only been since the HD2000 series that they've gone to a more relaxed naming scheme similar to what nVidia has used (2400, 2600, 2900, etc.), even going so far as to drop the suffixes in favor of the ~50 and ~70 tags.

Now, I'll definitely give you the fact that ATI went hog-wild-and-a-half with the 1900 lineup: there were two revisions of the GT, two revisions of the Pro, two revisions of the XT, then the XTX, the CrossFire edition, then the 1950 lineup. ATI beat that sub-series to a dead horse, buried it, brought it back and beat it some more . . . and they loved the crap out of their name suffixes, too.
Posted on Reply
#80
candle_86
Well, let's show the Radeon 9 series compared to the Nvidia FX, just for kicks.

FX

5200, 5200 Ultra, 5300, 5500, 5600 XT, 5600, 5600 Ultra, 5600 Ultra FC, 5700 LE, 5700, 5700 Ultra, 5700 Ultra DDR3, 5750, 5800, 5800 Ultra, 5900 XT, 5900, 5900 Ultra, 5950 Ultra

Radeon 9

9000 SE, 9000, 9000 Pro, 9200 SE, 9200, 9200 Pro, 9500, 9500 Pro, 9550, 9600 SE, 9600, 9600 Pro, 9600 XT, 9700 TX, 9700, 9700 Pro, 9800 SE, 9800, 9800 Pro, 9800 Pro 256, 9800 XT

Last time around, it was about even.
Posted on Reply
#81
newtekie1
Semi-Retired Folder
candle_86I can't get it to install on my FX5600 Ultra FC card.
Are you using the WHQL version or the Beta version? The WHQL version only supports a few cards, but the beta version supports pretty much every card ever released. The FX 5600 Ultra is listed under the supported cards for the beta version.
Posted on Reply
#82
btarunr
Editor & Senior Moderator
newtekie1Wait, since when has EOL meant support stopped? Even in Windows, EOL didn't mean support stopped; it just meant production and sales stopped.

Just look at Windows 98: Microsoft EOL'd it in 2004, but continued support into 2006. EOL does not mean support for the product ends; it just means the product isn't being produced anymore.

The 7 series cards have long been EOL'd, and they still get driver updates and support.
Sure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel, the driver has nothing new for the FX 5200, for example, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.
Posted on Reply
#83
newtekie1
Semi-Retired Folder
btarunrSure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel, the driver has nothing new for the FX 5200, for example, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.
The products still get the general enhancements that come with the updated drivers, but you are correct: there are generally no real enhancements geared directly at the product. But that happens with pretty much every old piece of hardware, and it doesn't happen just because it is EOL'd. It happens because new products need more work. The FX series was as good as it was going to get; it wasn't getting any better via driver updates. The same goes for the 7950 GX2. When hardware gets old, companies don't waste time reworking drivers to improve them.
Posted on Reply
#84
btarunr
Editor & Senior Moderator
newtekie1The same goes for the 7950 GX2. When hardware gets old, companies don't waste time reworking drivers to improve them.
Give me a driver that lets me run that card under Vista64, or give me a driver that allows me to use one of its marketing features, SLI (of two 7950 GX2 cards), under any OS; you choose. It was pretty much abandonment.
Posted on Reply
#85
DarkMatter
imperialreignI'm sorry if I wasn't all that clear in that earlier post about the series generations - yeah, you can bunch all of ATI's cards into the X100 and X1000 series; but what I was trying to get across is that for many, ATI's card naming schemes are a little easier to interpret. The higher the number, the better they think the card is, and there was typically a noticeable difference between two "sub-series" (e.g. a 1000 vs 1300, 1300 vs 1500, 1500 vs 1600, etc.). But you knew that they considered a 1950 to be better than a 1900, the 1900 better than the 1800, the 1650 better than the 1600, etc. I guess that was the biggest difference between nVidia's and ATI's naming schemes a few years back: ATI had "sub-series" lineups. It's only been since the HD2000 series that they've gone to a more relaxed naming scheme similar to what nVidia has used (2400, 2600, 2900, etc.), even going so far as to drop the suffixes in favor of the ~50 and ~70 tags.

Now, I'll definitely give you the fact that ATI went hog-wild-and-a-half with the 1900 lineup: there were two revisions of the GT, two revisions of the Pro, two revisions of the XT, then the XTX, the CrossFire edition, then the 1950 lineup. ATI beat that sub-series to a dead horse, buried it, brought it back and beat it some more . . . and they loved the crap out of their name suffixes, too.
You say that for many, Ati's card naming scheme is easier to interpret. I can assure you that for many, Nvidia's is easier. It's that simple. I don't see any difficulty in either of the two. I can admit that I like the scheme Ati is using with its HD3000 series, and that xx30, xx50, xx70 is better than suffixes. We can also say that at the same time there's been all this naming confusion on Nvidia's part, but remember that Ati and its jump to the HD3000 is the guilty party for all this confusion. Well, maybe not with the 8800 GTS, but yes with the jump to the 9 series. Before these series, both Ati and Nvidia followed pretty much the same scheme. And frankly, I don't know what you are trying to say with "some years back". It has always been the same naming scheme: the first number for the series, the second one for the model (usually using 3 numbers, the 8 for the high-end, and using the others in a way that best explains the performance difference between them; if there's a big revision, +1 to all the sub-series numbers), and the last two for a small revision. The suffixes were usually for differences that came out of the same chip during production to improve yields (such as disabled pipelines or simply different clocks). This was used by both companies up until now; I can't see how you can argue one was clearer than the other, because the two had the same scheme. With Nvidia cards you also know that a 7600 is better than a 7500, a 7900 better than a 7800, and so on. And then you have the suffix, just as on Ati before the HD3000.
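To make that decoding rule concrete, here's a minimal sketch of it in Python (my own illustration, not anything official from either company; the field comments just restate the description above):

```python
# Minimal sketch: split a card name like "7950 GT" into the parts
# described above. Works for the Nvidia-style names discussed here.
def decode(name: str):
    number, _, suffix = name.partition(" ")
    return {
        "series": int(number[0]),     # first digit: the generation (7xxx)
        "model": int(number[1]),      # second digit: sub-series (x9xx); 8-9 = high end
        "revision": int(number[2:]),  # last two digits: small revision (xx50 revamps xx00)
        "suffix": suffix or None,     # yield/clock variant from the same chip (GT, GTX...)
    }

print(decode("7950 GT"))  # {'series': 7, 'model': 9, 'revision': 50, 'suffix': 'GT'}
print(decode("6600"))     # {'series': 6, 'model': 6, 'revision': 0, 'suffix': None}
```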

And just so you know that a higher number wasn't always a sign of a better card, you have the X1900 GT vs. the X1950 GT. I couldn't find an Nvidia card with the same problem.

What I'm saying is that you can't say what you said and relate it to a common Nvidia trait, when both companies have done the same for years. Indeed, it's Ati who liked flooding the market with many different models more, up until R600. And if they are not doing it right now, it's because their architecture is not flexible enough to permit it.
Posted on Reply
#86
Wile E
Power User
The 7950 GX2 is still supported by the latest drivers, but hasn't seen any fixes or optimizations targeted at it. Just because it's on the supported card list doesn't mean they've done anything meaningful for it in terms of performance, bug fixes, or features. REAL support for that card has been abysmal.
Posted on Reply
#87
btarunr
Editor & Senior Moderator
Looking at NVidia's track record of dealing with products that hit EOL way too soon (e.g. the 7950 GX2), I would stay light-years away from buying a 9800 GX2 now.
Posted on Reply
#88
BumbRush
btarunrSure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel, the driver has nothing new for the FX 5200, for example, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.
First time I have seen you say anything that wasn't "Nvidia rocks, ATI sucks", and you actually made sense this time :)

Nvidia really raped the people who bought the GX2. I know people who had them; every single person got pissed and dumped it as soon as it was clear driver support was NOT gonna happen (after the 8800 came out, it was never gonna get fixed....)

Nvidia really raped people on the GX2 back then......
Posted on Reply
#89
BumbRush
DarkMatterYou say that for many, Ati's card naming scheme is easier to interpret. I can assure you that for many, Nvidia's is easier. It's that simple. I don't see any difficulty in either of the two. I can admit that I like the scheme Ati is using with its HD3000 series, and that xx30, xx50, xx70 is better than suffixes. We can also say that at the same time there's been all this naming confusion on Nvidia's part, but remember that Ati and its jump to the HD3000 is the guilty party for all this confusion. Well, maybe not with the 8800 GTS, but yes with the jump to the 9 series. Before these series, both Ati and Nvidia followed pretty much the same scheme. And frankly, I don't know what you are trying to say with "some years back". It has always been the same naming scheme: the first number for the series, the second one for the model (usually using 3 numbers, the 8 for the high-end, and using the others in a way that best explains the performance difference between them; if there's a big revision, +1 to all the sub-series numbers), and the last two for a small revision. The suffixes were usually for differences that came out of the same chip during production to improve yields (such as disabled pipelines or simply different clocks). This was used by both companies up until now; I can't see how you can argue one was clearer than the other, because the two had the same scheme. With Nvidia cards you also know that a 7600 is better than a 7500, a 7900 better than a 7800, and so on. And then you have the suffix, just as on Ati before the HD3000.

And just so you know that a higher number wasn't always a sign of a better card, you have the X1900 GT vs. the X1950 GT. I couldn't find an Nvidia card with the same problem.

What I'm saying is that you can't say what you said and relate it to a common Nvidia trait, when both companies have done the same for years. Indeed, it's Ati who liked flooding the market with many different models more, up until R600. And if they are not doing it right now, it's because their architecture is not flexible enough to permit it.
5700 Ultra vs. 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs. the 5950 XT (XT on Nvidia = worst of that particular number line: poor clockers that run hot).

I could go on and on. 5500 vs. 5500 XT: in this case they used the suffix, BUT it's the same problem; they copied ATI's suffixes but reversed them, and instead of XT being the best, they made it the worst (dirty trick!!!).

Or the 5700 Ultra vs. the 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 perf). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases. Horrible cards; the 5750 XT was close, in my experience, to the higher-end 5200s.......... horrible cards....... horrible :nutkick: :banghead: :banghead: :shadedshu :cry:
Posted on Reply
#90
calvary1980
You can add the 8800 GTS to the EOL list, according to Fudzilla and NordicHardware as of today. What's interesting is that in another thread, about three days ago, I posted a link from HOCP about the 9800 GTX and 9800 GX2 going EOL in three months; however, this news bit from Expreview only mentions the 9800 GX2.

- Christine
Posted on Reply
#91
candle_86
BumbRush5700 Ultra vs. 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs. the 5950 XT (XT on Nvidia = worst of that particular number line: poor clockers that run hot).

I could go on and on. 5500 vs. 5500 XT: in this case they used the suffix, BUT it's the same problem; they copied ATI's suffixes but reversed them, and instead of XT being the best, they made it the worst (dirty trick!!!).

Or the 5700 Ultra vs. the 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 perf). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases. Horrible cards; the 5750 XT was close, in my experience, to the higher-end 5200s.......... horrible cards....... horrible :nutkick: :banghead: :banghead: :shadedshu :cry:
Umm, the 5600 XT was out before the 9800 XT and 9600 XT; it dates to the launch of the 9800 Pro. Nvidia has never stolen ATI's naming crap.

As for the cards themselves, they did what they were supposed to do: run DX8 games. Not even an ATI card from that time can run a DX9 game at full tilt. Back in 2003, who cared if it handled SM2 right? The first game to use it was FarCry, and the 6800 Ultra beat it out the door. As for poor clockers, is that why my Prolink PixelView 5900 XT Limited Golden Edition clocked from 350/700 to 475/1000? Because it was a bad overclocker? And saying the 5750 is a crappier card than the normal one is total BS as well; the 5750 was the PCX 5700. Go read, BumbRush, and stop spouting your stupid crap.
Posted on Reply
#92
BumbRush
The 5600 XT was not out sooner; the 5600 was. They added the XT later, after 9800 XT sales were so high. I was a huge Nvidia fan back then and was watching the 5600s, because I fully expected Nvidia to fix the DX9 performance with drivers. They didn't at all.......

And the 9500 Pro/9700 could run the DX9 games from back then JUST FINE; I know, I had a 9800 SE 256-bit (modded to a true Pro). And no, FarCry wasn't the first to use SM2; there were others. I don't have a list, but I was in a bunch of betas and many of them came out before FarCry did.

As to the 6800 killing the X800: the X800 was faster and had better IQ. When FarCry was updated with an SM3 patch, it brought the GF6 line to THE SAME QUALITY as ATI's cards already had, including the 9500/9600/9700/9800 cards.

The 5750 XT was worse than the 5700 normal, period.

Now compare the 5800 Ultra to the 59*0 cards, for example: in many benches it's a couple of fps faster. It's a shit card, as are all the FX line cards, but still: it's 5800 vs. 5900/5950, and you see that the 5800 is slower in some cases and faster in others. By name, it should NEVER be faster........EVER!!!!!
Posted on Reply
#93
newtekie1
Semi-Retired Folder
btarunrGive me a driver that lets me run that card under Vista64, or give me a driver that allows me to use one of its marketing features, SLI (of two 7950 GX2 cards), under any OS; you choose. It was pretty much abandonment.
174.74 allows both. It supports the 7950 GX2 under Vista64, and supports Quad-SLI; most of the drivers released have been like this. Have you actually tried it? I have a customer that comes into my shop regularly who bought two 7950 GX2's through me. He still uses them in Quad-SLI and runs Vista64; 174.74 has been working wonders for him, and so have several previous driver releases.
Wile EThe 7950 GX2 is still supported by the latest drivers, but hasn't seen any fixes or optimizations targeted at it. Just because it's on the supported card list doesn't mean they've done anything meaningful for it in terms of performance, bug fixes, or features. REAL support for that card has been abysmal.
Real support for any of the 7 series cards, even the ones that are not EOL, has been abysmal. Just like real support for the X1k series has been non-existent also. Once a new series comes out, both graphics camps pretty much drop real support for their older cards. Usually, it isn't a problem, since most of the cards have had more than enough time to mature before the new series was released. However, in the case of cards released at the very end of a series' lifespan, support is usually dropped rather quickly, but the cards still work and still get the general benefits of the new drivers. ATi did the same thing with their dual X1950 Pro: there haven't been driver improvements directly for the card since the day it was released.
Posted on Reply
#94
newtekie1
Semi-Retired Folder
candle_86Go read, BumbRush, and stop spouting your stupid crap.
He is an ATi fanboy, and generally doesn't have a clue what he is talking about. Just add him to your ignore list and move on.
Posted on Reply
#95
DarkMatter
BumbRush5700 Ultra vs. 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs. the 5950 XT (XT on Nvidia = worst of that particular number line: poor clockers that run hot).

I could go on and on. 5500 vs. 5500 XT: in this case they used the suffix, BUT it's the same problem; they copied ATI's suffixes but reversed them, and instead of XT being the best, they made it the worst (dirty trick!!!).

Or the 5700 Ultra vs. the 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 perf). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases. Horrible cards; the 5750 XT was close, in my experience, to the higher-end 5200s.......... horrible cards....... horrible :nutkick: :banghead: :banghead: :shadedshu :cry:
So in the end you can't understand the naming scheme? :shadedshu It's easy...

xx50 is the slightly revamped xx00 and is always faster (or as fast) IF THE SUFFIX IS THE SAME, because the suffix is (and was on Ati) the greatest indicator of performance after the second number (x7xx), the model number or sub-series number, call it what you will. For example, an xx50 GT is supposed to be faster than an xx00 GT but NEVER faster than an xx00 GTX. It's easy.
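As a sketch of that ordering in Python (again my own illustration; the suffix ladder GS < GT < GTX < Ultra is my assumption of the usual ranking, not an official list):

```python
# Order cards by the rule above: sub-series digit first, then suffix,
# and only then the xx00/xx50 revision.
SUFFIX_RANK = {"SE": 0, "LE": 1, "GS": 2, "GT": 3, "GTO": 4, "GTX": 5, "Ultra": 6}

def rank(name: str):
    number, _, suffix = name.partition(" ")
    return (int(number[1]), SUFFIX_RANK.get(suffix, 3), int(number[2:]))

cards = ["7900 GTX", "7950 GT", "7900 GT", "7800 GTX"]
print(sorted(cards, key=rank))
# ['7800 GTX', '7900 GT', '7950 GT', '7900 GTX']
# The xx50 GT beats the xx00 GT, but never the xx00 GTX, exactly as described.
```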

To my knowledge, the 5950 XT and 5750 XT never existed as Nvidia cards. Maybe they were some partner-made special cards. I know of many others like that: when a card was already out of production, Nvidia would let some of its partners make some crippled cards, using lower memory and such, to sell off the stock. Maybe that's the case.

The 5800 was NEVER and nowhere faster than the 5900 or 5950. In some fantasy world of yours, maybe. Or in a very specific game and level, at a specific setting and resolution, maybe; otherwise the 5900 and up were a hell of a lot faster. LOL :shadedshu

And AFAIK the 5600 XT launched some months before the 9800 XT, so I have to agree with candle. And BTW, only an Ati lover would think that XT must be an indicator of a faster card outside of Ati's lineup.

On the other hand, I have to agree with you that Nvidia's FX series was crap.
Posted on Reply
#96
Tatty_Two
Gone Fishing
BumbRushAs to the 6800 killing the X800: the X800 was faster and had better IQ. When FarCry was updated with an SM3 patch, it brought the GF6 line to THE SAME QUALITY as ATI's cards already had, including the 9500/9600/9700/9800 cards.
I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800 XT, that the IQ on the 800 was better. The 800, if I remember rightly, didn't support SM3, whereas the 6800 did, if my memory serves me correctly, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 was better at than the 6800, but IQ was not one of them..... but as I said, that's just my opinion.
Posted on Reply
#97
btarunr
Editor & Senior Moderator
Yes, the Radeon X series lacked Shader Model 3.0 support.
Posted on Reply
#98
newtekie1
Semi-Retired Folder
Tatty_OneI find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800 XT, that the IQ on the 800 was better. The 800, if I remember rightly, didn't support SM3, whereas the 6800 did, if my memory serves me correctly, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 was better at than the 6800, but IQ was not one of them..... but as I said, that's just my opinion.
Agreed. The x800 series were great cards, but they were definitely not better in the IQ department; that was one of their few weak points. I still regret selling my two x800GTO2's.:banghead: But the x800XL is doing its job amazingly; I even play Crysis on it.:rockout:
Posted on Reply
#99
BumbRush
Tatty_OneI find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800 XT, that the IQ on the 800 was better. The 800, if I remember rightly, didn't support SM3, whereas the 6800 did, if my memory serves me correctly, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 was better at than the 6800, but IQ was not one of them..... but as I said, that's just my opinion.
www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0
We did notice shader quality improvements from Patch 1.1 to Patch 1.3, which now make the image quality on the GeForce 6800GT comparable to the shader image quality as seen on Radeon X800 video cards. The shader quality with Shader Model 3.0 is not better than Shader Model 2.0, it is now equal, where it wasn’t before in this game with Patch 1.1.
www.anandtech.com/video/showdoc.aspx?i=2102&p=11
Image quality of both SM2.0 paths are on par with eachother, and the SM3.0 path on NVIDIA hardware shows negligable differences. The very slight variations are most likely just small fluctuations between the mathematical output of a single pass and a multipass lighting shader. The difference is honestly so tiny that you can't call either rendering lower quality from a visual standpoint. We will still try to learn what exactly causes the differences we noticed from CryTek.
So yeah, basically the X800's IQ was better; the PS3/SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but did not make it look better.

I had both cards; the X800's AA and AF looked far better, and until games optimized for Nvidia's PS3 support, IQ and perf were FAR worse on the 6800 GT @ Ultra I had than on the X800 Pro VIVO @ XT PE (flashed). And the X800 Pro VIVO cost me less, yet was faster.....lol

I'm on an 8800 GT now; it was the best deal I could get at the time, and after a lot of tweaking the drivers are OK. Still not as good IQ-wise PER SETTING as my X1900 XTX was, but at least it works. I'm just wondering if they will abandon updates for the 8800 GTs once they move beyond the G92 core, as they did with the 7 series. That's something I was always impressed by since I moved from Nvidia to ATI back in the FX line days (though I have owned Nvidia cards from each gen in those times): ATI updates even their older cards' drivers to fix issues. Somebody told me recently that ATI's 8.x drivers fixed a problem with a game on his X800 GTO @ XT PE (flash mod). That's far better than my experience with Nvidia has been over the years. Even back when I was a huge Nvidia fan, I knew that my older Nvidia cards wouldn't be getting bug fixes; after the GF2 came out, the TNT cards didn't even get bug fixes for common games that had serious issues, and yet they were still selling the TNT/TNT2-based cards as budget-series cards to OEMs (the GF MX was mid-range, the full GF cards were high-end, and the TNT cards were the value line).

Sorry, that last rant was a bit long; hope people can deal with more than 2 lines of text in a row. If not, I will go back to double-spacing my posts......

Neither ATI nor Nvidia are golden when it comes to remarking older parts or supporting some older parts, though really nobody supports DX8 and older cards anymore. But I can say this: Nvidia cut driver updates/fixes for their DX8 cards sooner than ATI did (the 8500/9100/9200 and such all got driver support up until they cut support for all the sub-9500 cards).

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit. I know because at the time I had a Ti4400 (it had a better cooler than the 4600's did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype Nvidia put out about it. Jesus, that card sucked, though........ It drove me to try ATI again after years of HATING them due to their shitty Rage Pro/Rage2/Rage128 drivers sucking ass.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate Nvidia :P I like them both in ways, but both also piss me off at times, stupid bastages....... Oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really).

This is BS though; think about it. They put out the 7950 GX2 and NEVER gave it proper support, and now the 9800 GX2 gets EOL'd just after it comes out. I'm SURE they won't give it proper support now either. I would also bet they are regretting their dual-PCB design, as it's far more costly than AMD/ATI's 3870 X2 cards are to make.

Now, before any of you try and say I'm full of it, use logic here.

You have the 3870 X2: it's one card, and it can use a modded version of the cooler they use on the normal cards, OR most 3rd-party coolers will fit.

Then you have the 9800 GX2, which you have to design and order special coolers for, as well as having to pay more to assemble, because it's dual-PCB with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with; hell, look at the price compared to the X2 card.......nasty!!!

If I had bought one of these, I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quick you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card.......

At least ATI's first X2 card got support despite being very poorly known; but then again, it doesn't really need special drivers, since it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update :)

Blah, let's not fight about it. Can't we all agree that we would be pissed if we owned one of these?
Posted on Reply
#100
BumbRush
Tatty_OneI find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800 XT, that the IQ on the 800 was better. The 800, if I remember rightly, didn't support SM3, whereas the 6800 did, if my memory serves me correctly, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 was better at than the 6800, but IQ was not one of them..... but as I said, that's just my opinion.
Oh, forgot to say: the only real difference you will see in most games between the X800 and 6800 is HDR support, and even then the 6/7 range cards have to choose between AA and HDR, because they can't do both at the same time if it's SM3 HDR. The 8800 and X1k cards can (in fact, on the X1k cards there's no perf penalty for having both enabled in games like FarCry and Oblivion).

HDR could be done under SM2.0; it just required different coding that took more time and skill. Check out HL2 Lost Coast and the HL2 expansions: IMHO, with current patches it looks just as good as any other HDR use, even though it's SM2.0-based, not SM3 :)

Blah, I did it again; I ranted more than I intended :P
Posted on Reply