Thursday, June 19th 2008

NVIDIA Gently Intros GeForce 9800 GTX+

AMD today scored a major point for the red team by positioning its brand-new ATI Radeon HD 4850 cards between NVIDIA's GeForce 9 series and GTX 200 series cards. The all-new HD 4850 beats NVIDIA's GeForce 9800 GTX while maintaining a very reasonable MSRP of $199. NVIDIA currently has no card that can compete in this category, but that is going to change in mid-July, when the company will announce a new mid-range video card dubbed GeForce 9800 GTX+. The card will be identical to the GeForce 9800 GTX from the outside, but on the "inside" it will use a smaller and more efficient 55 nanometer GPU with increased default clock/shader speeds: from 675 MHz to 738 MHz and from 1688 MHz to 1836 MHz respectively. Memory speed for this card will be dropped slightly to 1 GHz (1100 MHz on the GeForce 9800 GTX). Other than that the card is virtually the same as the GeForce 9800 GTX; three-way SLI support also remains untouched. NVIDIA expects to start offering the GeForce 9800 GTX+ at an MSRP of $229. The company also plans to drop the price of the 65 nm GeForce 9800 GTX to $199.
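For perspective, both clock bumps work out to a little under ten percent; a quick back-of-the-envelope sketch (plain Python, using only the figures quoted above):

```python
# Derive the percentage clock increases NVIDIA is quoting for the 9800 GTX+.
# The MHz numbers come from the article; the percentages are just arithmetic.

def pct_increase(old_mhz: float, new_mhz: float) -> float:
    """Return the percentage increase from old_mhz to new_mhz."""
    return (new_mhz - old_mhz) / old_mhz * 100

core = pct_increase(675, 738)      # core clock: 675 MHz -> 738 MHz
shader = pct_increase(1688, 1836)  # shader clock: 1688 MHz -> 1836 MHz

print(f"core:   +{core:.1f}%")     # roughly +9.3%
print(f"shader: +{shader:.1f}%")   # roughly +8.8%
```

Modest gains on paper, though the 55 nm process could also leave extra headroom for overclocked retail cards.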

The first card is a Leadtek 9800 GTX; the second one is the GeForce 9800 GTX+.

Source: bit-tech.net

137 Comments on NVIDIA Gently Intros GeForce 9800 GTX+

#1
farlex85
Mussels said:
because i LIKE my board... if i was selling the board, i'd grab an SLI one and get another passive, vmodded 8800GT...
I gotcha, I was just busting your balls, sorry. :laugh: Although, there are many other boards out there you would like also. SLI is worse at oc generally, and gt's don't scale very well to my knowledge.
#2
SheetCake
wiz needs to rebench with the new driver hotfix, i wanna see the results :P
#3
Mussels
Moderprator
farlex85 said:
I gotcha, I was just busting your balls, sorry. :laugh: Although, there are many other boards out there you would like also. SLI is worse at oc generally, and gt's don't scale very well to my knowledge.
yeah but i'm a silence nut. i like the power these things can give passively.
#4
newtekie1
Semi-Retired Folder
SheetCake said:
this crap's what makes people say you're a biased nvidia fanboy. first, the 3800 was a die shrink but it wasn't just a die shrink, check the details out. they also improved the avivo support, yeah that's right, the 2900's avivo sucked/sucks, it doesn't even really work, the 3800 cards work like a charm (better than purevideo).

they also changed other things. if i remember correctly there are actually LESS transistors in the 3800's than the 2900's, yet the perf is the same or better in many cases... how can it be a pure die shrink if the number of transistors changes and they effectively add a new unit for video decoding (avivo)?

click the image for source, note 3800 = 666m vs 2900 @ 700m, so if it's a pure die shrink, where did the 34 million transistors go?

and why was video decoding performance boosted so drastically?
It also added DX10.1 support, and the AVIVO is controlled by a different chip entirely IIRC, which has nothing to do with the GPU itself. The memory interface changed also (most likely where the transistors disappeared to; a narrower memory interface means a less complex GPU). Granted, I should have been more specific: it isn't just a die shrink, but it is a die shrink, and the end result is essentially identical to the end user (minus the shrunk memory bus).

If you want an exact example from ATi's camp, look at the R535 and R530.
#5
SheetCake
newtekie1 said:
It also added DX10.1 support, and the AVIVO is controlled by a different chip entirely IIRC, which has nothing to do with the GPU itself. The memory interface changed also(most likely where the transistors disappeared to, a lower memory interface means less complex GPU). Granted, I should have been more specific, it isn't just a die shrink, but it is a die shrink and the end result is essentially identical to the end user(minus the shrunk memory bus).

If you want some exact example from ATi's camp, look at the the R535 and R530.
nope, it's on-die, it is not a separate chip; you're thinking of the G80 and new GTX 280 cards, where PureVideo uses a separate chip.

ATi cards did use to have a Theater chip onboard, but that's for MPEG-1/2 (DVD playback) support, nothing more.
#6
newtekie1
Semi-Retired Folder
HAL7000 said:
You missed what I was saying... I have read other posts from you; if you look back at my prior post, you will see that it ended on a positive note. I did not miss what you said, but as you stated... it is all positive for the 9800GTX+ and negative for the 4850.
Share the links that you have concerning the poor overclocking of the 4850. I would like to read them as well.
You see, it took Nvidia 2 releases to make a 9800GTX+... to me they are losing ground, esp. when the 280 $$$$$$ runs like shit as well... getting beat in some benches by the low-end 4850 and in others by the 9800GX2... as we read in the TPU reviews... FACT.
So is it wise to buy the 9800GTX+... nope... not worth the money.
I am far from being any fanboy... just read my signature... they all piss me off...
Firstly, your links, I didn't exactly have to go very far:
http://www.techpowerup.com/reviews/MSI/HD_4850/24.html
http://www.techpowerup.com/reviews/Powercolor/HD_4850/24.html

Two HD4850's in the hands of a person that I would consider a master overclocker. Both overclock like complete ass. Less than 10% overclocks is overclocking like ass, IMO, and 1% is just a joke. I could go through and find some of the other articles and forum posts I have read, but I think that is enough, and that is really all the time I want to spend on the issue. If you want to dispute it, show me some better results.

http://www.techpowerup.com/reviews/Point_Of_View/GeForce_GTX_280/26.html

You can look at another new product, and it overclocks better, and according to Wiz, it is actually harder to overclock.

http://www.techpowerup.com/reviews/Zotac/GeForce_9800_GTX/24.html

There you can see the overclocks on the 9800GTX, better IMO. You can speculate that the HD4850 overclocking might get better with better software and that the HD4870 might be a better overclocker, but there is no factual information backing that up, and you are all about the facts.

If you read around you will see me saying some positive things about the HD4850, you will even find some thread of me thinking about switching to HD4850's.

I definitely agree with you that ATi is catching up to nVidia, and if you read most of my posts on the subject you will see I think that is a wonderful thing. Competition is a good thing; it forces lower prices from both companies, which is only good for us, the consumers.

Don't get me started on the GTX 280, I think nVidia was insane on that front, but that is an entirely different discussion there.

Again, the value of the 9800GTX+ and whether it is worth buying depends on the type of person you are. Personally, I think the 9800GTX+ would be worth buying over the HD4850 if both were priced the same (and it seems likely they will be very close). But even more, the 9800GTX, at the price it is at and overclocking the way it does, is worth it if you plan to overclock. BUT, most importantly, I think the 8800GTS (G92) is the best value of the pack for overclockers; it overclocks almost as well as the 9800GTX, but is cheaper than the HD4850. And the added value of PhysX support on the G92 cards helps their appeal (granted, not by much, probably about as much as DX10.1 helps ATi's appeal).
#7
Hayder_Master
$200 is a good price. i see the old 9800 GTX as just like the 8800 GTS 512, but maybe the new one is better.
#8
wolf
Performance Enthusiast
most 9800GTX's get to 775-800 core and 1800-2000+ shaders; a 55nm part should do even better, maybe not hugely, but enough.

we will probably see some dirt-cheap, pre-OC'd cards running 800/2000/2400, and that's just sweet.
#9
btarunr
Editor & Senior Moderator
hayder.master said:
$200 is a good price. i see the old 9800 GTX as just like the 8800 GTS 512, but maybe the new one is better.
Haider, even at $199, the old 9800 GTX is not worth it. It draws more power and requires an OC to perform on par with an HD4850, increasing power consumption even further.
#10
wolf
Performance Enthusiast
dude, 177.39 drivers make up the gap, and will make it exceed the 4850 under some circumstances.

im just hanging for a review that finally uses the new drivers so i can stop saying this and just post a link, finally.
#11
btarunr
Editor & Senior Moderator
You can't say that until you come across a comparison between a 9800GTX (with 177.39) and an HD4850 (with the latest Catalyst hotfix).
#12
wolf
Performance Enthusiast
im saying that my 9800GTX gets ~20% more FPS across the board on the 177.xx drivers, and all the reviews are on the 174/175s.

so yes, what we really need is a comparison on the 177s and ATi with the hotfix.