Discussion in 'Reviews' started by Darksaber, Jul 24, 2006.
To read this review go to: http://www.techpowerup.com/reviews/NVIDIA/7900vs7950/
MUAHAHAHAHAH, I beat the 7900GTX and I'm almost as fast as the 7950GX2. OMG, I can't believe my X1900XT is that fast. WOW, I'm just shocked... well, not really. It IS ATI, and I AM comparing it to NVidia.
Funny, but your signature says:
"x800 pro OC", literally...
Typically (& many here have said as much in their replies over the 2-3 months I've been on these forums, on & off), NVidia cards are faster, & historically decidedly so in OpenGL, whereas ATI has typically done better speeds in DirectX rendering than NVidia has over time!
However, ATI folks say that their boards produce better imagery: I don't know about that!
First - Beauty is in the eye of the beholder, imo, on that account (especially a biased beholder).
Second - I've owned both types/families of boards over time & went NVidia in the end, & I never noted any diff. in quality between the two.
ON THE SUBJECT OF THE REVIEW HERE (overall great review, but my take on it, from a personal standpoint & how I use my rig & why):
I currently own an "amped up" model of the GeForce 7900 GTX, in the BFG "O/C'd" model... & I further O/C'd it over THEIR rates (see my signature below for exactly how far over) & it runs stable while gaming.
I play Doom III &/or Quake 4 SMP almost SOLELY the last 2-3 months, when I have time to game or choose to!
I noted that they are played above for the comparison test, so I concentrated on them!
(Personally, I don't put TOO much stock into synthetic benchmarks for gaming video cards. I have the actual games most of the time, & to me that makes the MOST SENSE to pay attention to, since I own & play them. Synthetics show the POTENTIAL of a video card, but not real-world actual results (as iirc, the "demos" they play aren't real games).)
However, I noted the resolutions used above!
They are FAR ABOVE what I play @, & they are also "off" ones: note that the typical 1600x1200 isn't there, & an off-the-wall 1680x1050 is used instead. IIRC, I don't think I have personally ever used that geometry, or even seen it offered while configuring the 2 games I mention above. But assuming it is available, note the frame-rate hits above - not worth it to me, @ least, to play that "slowly"...
Myself, I play @ 800x600 (Doom III) & 1600x1200 (Quake 4 SMP) + ranges in between, because any higher than that?
Even the 7950 GX2 can't pull off better than 80fps, not even close to it... & from what I understand, anything over 80fps is wasted; the human eye can't detect it.
Where greater than 80fps would help is in a pitched battle with blood & particles flying, & explosions + smoke-filled areas, imo @ least (no "professional game critic" here).
So, what would be the point of playing a game that high up, if it is slow frames-per-second-wise??
(Especially slow during online & multiplayer sessions??)
Certainly not with an X800 Pro you didn't, so you need to update your specs then!
I agree. I get a near-constant 60FPS at 1280x1024 (the max res my crappy CRT supports) in almost every game out there (except for Oblivion, where I get 25~30 FPS in the woods) with my current setup. So why get these uber-fast cards using "future-proofing" as an excuse, if upcoming games will use DX10 anyway?
Well, if you have a big monitor that supports resolutions like 1920x1200, then maybe you should get one of these cards to play at stable framerates. But certainly, 25~40FPS at such resolutions doesn't sound so great anyway.
I uncap the games I mention above; IDSoftware allows for it via each game's individual config files... so I don't get "stuck" @ the 60fps limit set by the oem of the game.
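For reference, a sketch of how this was usually done, going by period tweak guides (the cvar name & behavior are sourced from those guides, not something verified here), is a one-line addition to the game's config file:

```
// DoomConfig.cfg (Doom 3) or Quake4Config.cfg (Quake 4), in the game's
// base / q4base folder. com_fixedTic -1 is the commonly cited tweak:
// it unties rendering from the fixed 60Hz game tic, removing the
// built-in 60fps cap. Guide-sourced assumption - test stability after.
seta com_fixedTic "-1"
```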
To be able to run games @ the max possible was my reasoning!
(W/out "breaking-the-bank" so-to-speak on a personal finances level (that, & I don't want to go SLI really, I have plans for the other PCI-x 16 slot here))
To wit - a DDRDrive x1 capable PCI-e ramdisk solution to replace the solidstate disk I use now since it is "slow" by today's std.'s in that area (e.g.-> DDR based Gigabyte IRAM & HyperOS ssd's out there now).
I have to leave that slot open, & iirc, according to my mobo manual, that 2nd SLI PCI-e slot can function as a stand-alone PCI-e x4 slot - perfect for my needs/desires really (mostly disk-bound I/O work here).
It doesn't, but I think the point of Darksaber's review was to show the difference & where it was going to give you gains!
(I would LOVE to see a game @ 1920x1200 (or more) w/ full AA + anisotropic filtering turned ALL THE WAY UP/maxed out, etc., but I don't have enough oomph monitor-wise OR gaming-card-wise to enjoy it)
I agree totally. And just to add a comment for the guy who has the X1900XT but shows the X800 Pro in his specs: all the reviews I have read show that the 7900GTX and the X1900XT are pretty even. I would probably go as far as to say that in sheer speed terms (not high-res, high-AA/AF), the 7900GTX is actually the faster card. So when he says he can hammer it and almost equal the 7950GX2, he must have got lucky and picked a card with rocket propellant and a nuclear power station plugged into it!
I'm sorry if I didn't explain myself there. I meant I get about 60FPS average in current games in general (COD2, F.E.A.R., Q4, AOE3, D3, Prey, NFS:MW, HL2:E1, FarCry, etc.). I know Id caps its games at 60FPS; I uncap them too...
Yes, exactly. But anyway, you would be investing a lot of money in a card that runs current games at max settings, while people who invested the same amount some months ago on a 7900GTX or an X1900XT won't need a new $550+ card that can't use DX10 features, when they could wait a bit for the G80 or R600 cards that will probably have about the same price tag and better performance.
Yes, I understand you need to raise the resolution that high in order to show results that aren't affected by a CPU bottleneck. In fact, I didn't criticize the use of high resolutions for the review per se, but the fact that you need to play at such high resolutions to show any differences when comparing this card to a 7900GTX or even an X1900XT. Playing at such high resolutions will produce very low framerates no matter how much you invested in your video card.
Cool, & believe you me: I am CONSTANTLY editing my posts for details I overlook!
(If I was others reading my posts, lol, I wouldn't reply for a good 5 minutes or so until I finish them up... POGE & I were killing one another because of that in threads we spoke it... "the baud rates NOT fast enough" lol!)
I hear ya... to folks w/ large amounts of money, performance isn't an issue vs. the money outlaid. But admittedly, I am NOT one of those folks - I bought w/ money specifically saved for a new rig, as my older one is now the 2nd lab rig I needed for pursuing @-home study of .NET. It was getting "long-in-the-tooth" vs. today's games with certainty, even though it is a GOOD rig for all else & decent in gaming too!
For me to come up with & save say, $5,000? Not that easy ontop of other bills & normal things like food, gas, etc. (takes me 1/2 year about).
So, I spent what I could, when I could... tossing around that much? NOT an "everyday thing" for me by ANY means, I can assure you!
Yeah, I DO have an X1900XT. Just because I haven't updated my specs because I was busy playing doesn't mean it's not true. And no shit I can't beat a 7900GTX w/ my X800 Pro.
Ah, it's cool man - we were just busting your balls is all!
* I believe you, no sarcasm either!
There is quite a quality difference between ATi and nVidia; I've done a head-to-head. However, nVidia cards can generally be clocked further than ATi cards. Equally though, ATi has far superior drivers. End result: 50/50. Whichever way you go, it'll be a good card.
Beauty is in the eye of the beholder, is all this is... imo @ least! Is it nice to see the details in a game? Certainly... but I am mostly trying to stay alive & all that!
Could it be ATI does render better? Possibly, but it is just that I never personally noted it, & imo? The human eye only catches so many details & colors as is.
Especially while in the midst of gameplay.
I guess, I just know what mine can do, & it's pretty ok! I don't have a "like" ATI board to compare to here... so, it's not something I can TRULY comment upon (last ATI I had? 9800 XT).
Ok, here? I have to beg to differ on 2 grounds:
1.) ATI had problems in their drivers quite a bit more than NVidia had over time (this much I am fairly certain of... have they "cleaned up their act"? Absolutely!)
2.) John Carmack (of IDSoftware fame, & imo one of the greatest programmers alive today) while developing Doom III said "I like what NVidia does with their drivers" vs. ATI's stuff... as far as I am concerned, since he is one of my "technical/intellectual heroes", his word IS gold!
Oh, this I have NO doubt of... by now/today? Heck, most of all of what we use, even "midrange products" rule (especially by way of comparison to the stuff I started out on in PC's circa 1991 or so, by MANY ORDERS OF MAGNITUDE - believe me, I am NOT one to complain!)
P.S.=> Put it THIS way: Back when I first got into PC's, & I'd see films showing folks doing live video conference feeds & such, I was like "MUST BE SOMEKIND OF SPECIAL HIGH-END UNIX LAPTOP" etc. & today? We're doing ALL OF THAT, & far more... apk
I was not implying that you didn't; my point was that when making as bold a statement as you did in your first post, it's always good to be able to back it up. I have quite often had people challenging the scores I post in my signature (hence the link to my 2005 score). As I said, all the reviews I have read suggest that in raw speed terms, at stock and overclocked, the 7900GTX marginally beats the X1900XT. I hope your X1900XT is faster, as I am getting one in a short while, but maybe you and Alec wouldn't mind posting a 3DMark 2005 image of your scores at stock speeds to put my mind at rest.
Ah, you've got to be the 3rd or 4th person I've seen state that NVidia is the "speed king" in the board revision I have (mine's actually overclocked over that, as it is the BFG 7900GTX OC model)
I pushed it even farther & it runs great @ those rates (via coolbits.reg auto-sense o/c wizards it exposes in the NVidia control panel classic view)...
It must be how it is then, I really wasn't certain.
I have seen x1900's by ATI run basically neck & neck w/ them for the most part (vs. stock 7900 GTX iirc) though, & the lead was marginal.
ATI's "cleaned up their act" imo, especially in OpenGL performance from what I've seen... but, does the NVidia 7900GTX do as well as the ATI 1900 series in DirectX?
I don't run/own/use 3dMark 2005... is it free?
Yes it is, you can download it here:
It will be quite interesting to see how they perform in a synthetic benchmark. Most reviews conclude results based on a number of different tests, some of which (as you quite rightly mentioned) have the X1900XT in front, but few of those are the lower-res tests. Since 3DMark 2005 runs at 1024x768, both in "no AA/AF" mode and "4xAA/8xAF" mode, it will be interesting to see the results, especially as the rest of the system will be an ingredient in them. Whilst 2005 is very GPU-intensive, high-end GPUs can easily be bottlenecked by less-high-end CPUs, sometimes causing up to 20% bottlenecks! And I see you have a 4800 X2 versus an Athlon 64 3200 @ 2.3GHz (the other rig).
My point was not to try to disprove what has been said regarding the comparisons; it was one of some surprise, followed by inquisitiveness! Especially if the X1900XT is near the GX2's performance, because that means I am definitely getting one!
Well, as I stated here earlier:
Usually I don't put a lot of stock into synthetic tests since the demos they run are not actual games in production use.
I instead lend more credence to the actual game tests I have seen in reviews (as it seems like a more practical measure of what my system can do with actual games I own).
* Still, I will download, install, & run it. Cannot hurt I suppose to see what my current system can do on such a synthetic test.
(Man, this thing is BIG! 239MB...)
A mere 3 minutes-ish on an 8Mb connection!
Well, I downloaded & installed the free version of 3DMark2005 per the URL link you gave me, & tried to run it - I see the logo splashscreen, & then it disappears.
* I don't think it likes running on Windows Server 2003 (I tried 3dMark 2006 as well, no dice)...
P.S.=> Are there "work-arounds" for this? The ONLY thing I can possibly think of is maybe using the "Windows Application Compatibility Toolkit" & its features, where you can fool an app into thinking it's running on an older MS OS (my guess is it changes the "environment strings" via SET statements or something to do it)... apk
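Along those lines, one lighter-weight thing worth trying is Windows' __COMPAT_LAYER environment variable, which asks the shim engine to apply a compatibility mode before launching an app (the same mechanism as the Compatibility tab). Whether it actually helps 3DMark on Server 2003 is an assumption here, and the install path below is a placeholder guess, but a batch sketch would be:

```
:: Present the app with an XP SP2 compatibility layer, then launch it.
:: __COMPAT_LAYER is a documented shim variable; the path is hypothetical.
set __COMPAT_LAYER=WinXPSp2
"C:\Program Files\Futuremark\3DMark05\3DMark05.exe"
```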
It's a little lower because I'm running the ATI Tray Tools OSD.
Thanks, but even with sys tray apps that score is way too low for an X1900XT. My score is with sys tray apps running, and it is higher than yours; an X1900XT should trash a 7900GT (my card), and a 7900GTX would beat that score by almost 3000 points. You should be getting over 12000 - I think you need to do some serious tweaking there!!! Otherwise you are just not getting your money's worth out of your card.
Really? 12000? OK, I'm getting my X2 3800 on Monday, so I'll try it then. I'm running a 3200 @ 2.5.
Really? You don't BEAT me by that much, and also my vid card is at all-stock settings - no OC till I get AMC.
Even at stock I think you should be beating me. I can't remember his name, but there is a guy in here who has an X1800XT 512MB with no voltmod etc. who is getting 10300 (although he does overclock, and he has flashed his BIOS to XT PE), and your card is some 20-25% faster than an X1800XT, so that will give you some perspective. But yeah, when you get your 3800 X2 it will make some difference, as it's overclockable to 2600 on air; couple that with some O/C on your card and you will be flying - nice setup! 2005 does not really use both cores, but you will see the improvements across the board in "real world" stuff.
Ohhhh sorry, forgot to answer that one. Think of it this way: in the UK, at time of purchase, I paid £100 less for the 7900GT than an X1900XT... like $180. So beating you just a little bit is actually real "bang for buck".