Discussion in 'Reviews' started by W1zzard, Sep 17, 2005.
To read this review go to: http://www.techpowerup.com/reviews/NVIDIA/CrossfireTruth/
Looks like a lot of propaganda to me. nVidia have had a lot of time to get things right on their motherboard chipsets, but as I haven't heard much about ATi boards (hell, nothing about them except for the Xpress 200), I think ATi are doing pretty well for themselves!
And the second-to-last slide: nTune, nVidia has it and ATi doesn't... Guess why? Because it's nVidia's own tuning utility...
Anyone else like to comment?
Propaganda. It is funny that they list nTune as a feature to be supported or not supported by each, lol.
Everyone says HDR won't work on 98xx/X8xx cards, yet in that preview of Source's Lost Coast posted here,
the guy runs HDR on a 9600. So what's the real deal with SM 3.0?
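For what it's worth, the usual distinction is that the "SM 3.0 HDR" Nvidia talks about relies on floating-point blending, while Valve's Lost Coast path encodes HDR in formats SM 2.0 cards can handle. Either way, the core idea is the same: store light intensities above 1.0 and tone-map them back down for display. A rough conceptual sketch in plain Python (the classic Reinhard operator; the pixel values are made up for illustration):

```python
# Conceptual sketch of HDR tone mapping (Reinhard operator).
# HDR rendering stores light intensities above 1.0; the display
# step compresses them back into the 0..1 range. The sample
# luminances below are illustrative, not from any real game.

def reinhard(l):
    """Map an HDR luminance l >= 0 into [0, 1)."""
    return l / (1.0 + l)

hdr_pixels = [0.2, 1.0, 4.0, 16.0]   # some pixels brighter than "white"
ldr_pixels = [reinhard(l) for l in hdr_pixels]

for h, l in zip(hdr_pixels, ldr_pixels):
    print(f"{h:5.1f} -> {l:.3f}")
```

Whether the intermediate buffer is FP16 (SM 3.0 blending) or an integer encoding (the SM 2.0 route) only changes where this compression step happens, not the idea.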
And if there was a problem with the USB controller a year ago, why on earth should they assume it would still be a problem now? Nvidia is slinging mud.
The USB problem was easily solved: they used an external USB 2.0 chip.
I'm waiting for the good part where ATi sues Nvidia for damages due to the lies spread in this public presentation.
Wouldn't be hard to hit back at Nvidia though. Total lack of software support from Nvidia to take advantage of their hardware, and breaking support for their previous generation in doing so :| They're the Creative (Sound Blaster) of the GPU industry.
The thing that gets me is that they are comparing a 7800 GTX to an X850 XT PE... then they say the X850 is out of date. Well, yes it is, but the 7800 GTX is a brand new card. Just wait until the X1800 comes out and see what Nvidia has to say about ATI then.
My first post on this forum:
That article right there is a lot of doo-doo. Even my 9550GU-Extreme supports HDR, so an X800 "probably" supports it too...
I don't know about "paper-launching", but GeCube just released their first X800 GTO with Crossfire support.
How dare they accuse ATi of lack of flexibility when their technology is the one that requires both cards to have SLI enabled for that same SLI to work. How dare they say they have "two full-bandwidth 16x PCI Express slots" when everybody knows that if you have 2 cards in SLI, each will only use its PCI Express link at 8x!!! If they accuse ATi of "paper-launching", how the heck do they know that ATi's USB and PCI Express are not running at full capacity!?! I've seen a test in the Romanian edition of Chip magazine with an ATi Xpress 200 which trashed an Abit NF4 at default settings, even though it was a preview board with very few BIOS settings...
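On the x8 point, some quick arithmetic shows what is actually at stake. A PCIe 1.x lane signals at 2.5 GT/s with 8b/10b line coding, which works out to 250 MB/s per lane per direction; the sketch below (plain Python, nothing but that arithmetic) compares full x16 slots against the x8 links SLI boards split them into:

```python
# Rough PCI Express 1.x bandwidth arithmetic (per direction).
# PCIe 1.x runs 2.5 GT/s per lane with 8b/10b encoding, so each
# lane carries 2.5e9 * 8/10 bits/s = 250 MB/s of payload signal.

GT_PER_LANE = 2.5e9      # transfers per second, PCIe 1.x
ENCODING = 8 / 10        # 8b/10b line-coding efficiency
BITS_PER_BYTE = 8

def lane_bandwidth_mb():
    """One lane's bandwidth in MB/s, one direction."""
    return GT_PER_LANE * ENCODING / BITS_PER_BYTE / 1e6

def slot_bandwidth_gb(lanes):
    """A slot's bandwidth in GB/s, one direction."""
    return lane_bandwidth_mb() * lanes / 1000

print(f"x16: {slot_bandwidth_gb(16):.1f} GB/s")  # 4.0 GB/s
print(f"x8 : {slot_bandwidth_gb(8):.1f} GB/s")   # 2.0 GB/s
```

So an SLI board running each card at x8 really does halve the per-card link bandwidth versus a true dual-x16 design, whatever the slides claim.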
And what is that "Nvidia nTune not present on ATi cards" poo? It's like saying that GeForce cards don't support ATi HydraVision or something...
And last, but not least: What the heck is "Mainstream Price Points"!?!
You would have to be pretty foolish to accept information, at face value, from a corporation demeaning a competitor's product. Show me third-party research showing SLI is better than Crossfire, then I may believe it. However, nothing is certain until the product is even released. Shame on you Nvidia, you just made yourselves look bad.
In the 9th picture, it says that if you have a master card but you're running a 12-pipe card, it will make the master use only 12 pipes. In SLI you must have the identical card, so what's the difference? Plus it's almost better, because you can buy the master card, run it with, say, your X800 Pro, and then if you want to upgrade you can buy a faster card without needing to get a new master card. With Nvidia you would have to buy two new cards. Not to put Nvidia down or anything, but they are just pulling everything they can think of to deter people from Crossfire. As Polaris573 just said, "Shame on you Nvidia, you just made yourselves look bad."
Wow, NVIDIA is whack. I love their cards, but to spread lies and false rumors about ATI is ridiculous.
Page 4.. You mean I can *only* have 14x anti-aliasing with my ATI cards?? There is no way I could settle for less than 16x
"Enthusiasts don't buy old technology." Of course not, but they may just upgrade their 7800GTX to an X1800XT when it's released. We'll see.
Page 9.. They accuse Crossfire of *not* being more flexible. At least people DO have a choice to mix models/brands, and not be forced to use the exact precise SKU.
The only pages I found to have any warrant are maybe 11 and 12. If people have different display resolutions, then they may run into problems. But that's only if that's actually true.
Page 16.. They devote an entire page to a rant about PCI-E 1x lanes? Because we all have so many 1x cards lying around???
Page 18 was quite hilarious as well. nTune and Price Points. It's hard to compare prices to nF4 when CF is not even out yet.
Cool slide though. Opened my eyes more about how foolish NVIDIA can be.
Well, I can't say they're lying 100%.
Very biased, quite unprofessional, but then again I already knew how upset Nvidia could get. Personally I'm unhappy because the R520 turned out to be 16 pipes and I still can't find a Crossfire board...
You have to admit, though: I read the review of some ATI Xpress chipsets, and only the Turion chipsets were impressive to me... Crossfire seemed fairly rigid. But yeah, it's unfair, because they're still working on Crossfire, and I'm sure they're going to fix the problems as fast as Nvidia would, if not faster. Personally? I think both companies are acting like PMS'ing schoolgirls...
Having 2 video cards is simply dumb.
I think that you have to understand that both ATI and NVIDIA have people sitting around just thinking up things about the competition. I would bet that both companies do it.
Anyway, that being said, I am here for some Nvidia support… objectively.
Go ahead and get a cup of coffee and something to eat; this is long.
Slide 2 - It seems this could be the case; however, not knowing when the ATi cards will be released, it is hard to judge. To the person who believes they can just buy a card that is Crossfire capable: yes you can, however you still need a master card. The ATI version will work with all previous X8xx cards, just with a master card.
Slide 3 - This is a hard case to make for either company. True, the 7800 GTX in SLI will undoubtedly beat the current generation of ATI cards in Crossfire (based on current single-card benchmarks, where the 7800 GTX beats all). However, ATI could have simply meant that if you own an ATI card, you will have the best performance with Crossfire and an ATI-chipset mobo.
Slide 4 - One reason I know many people do not buy ATI cards is that they SUCK in OpenGL applications. This doesn't matter much if you like Windows, but if you use Linux you don't want to be using ATI. Technically 16x is better than 14x. It is also unclear whether or not ATI will have a competing solution to transparency AA.
Slide 5 – HDR lighting that relies on floating-point blending only works on Shader Model 3.0 cards; as the Lost Coast preview mentioned above shows, though, SM 2.0 cards can still do other forms of HDR.
Slide 6 – Nvidia has changed this recently with a driver update. I don't know when that page was written on ATI's site, but it is still there. I'm sure they wanted people to know that. What you don't see is the rest of the bulleted points: • CrossFire is an open platform that supports multiple components and graphics cards that can be mixed and matched in a single system. Competitive multi-GPU solutions are constrained to supporting identical graphics cards.
This is true; however, if you have a better card it gets dumbed down to the lower-specced of the two cards you have. Also, you can mix and match brands of NVIDIA cards as long as they are the same model.
Slide 7 – Ahh, who cares. Nvidia is running out of things to argue about; it's all done automatically, so I don't see why anyone should care.
Slide 8 – Seems to me Nvidia is just pointing out the obvious. One thing, however, is that Nvidia can do all 4 modes in OpenGL and on all cards that support SLI. Looks like ATI will not beat Nvidia at Doom 3, at least, lol.
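For readers unfamiliar with the modes being argued about: the two basic ways of sharing work between two GPUs are split-frame rendering (SFR, each GPU renders part of every frame) and alternate-frame rendering (AFR, the GPUs take turns on whole frames). A toy sketch in Python, with the load-split logic reduced to pure index arithmetic (the 50/50 split ratio is an assumption; real drivers rebalance it per frame):

```python
# Toy sketch of two multi-GPU load-splitting modes:
# SFR (split-frame) divides each frame's scanlines between two
# GPUs; AFR (alternate-frame) hands whole frames to GPUs in turn.

def sfr_split(height, ratio=0.5):
    """Scanline ranges each of two GPUs renders for one frame."""
    cut = int(height * ratio)   # drivers rebalance this per frame
    return (0, cut), (cut, height)

def afr_owner(frame_index):
    """Which of two GPUs renders a given whole frame."""
    return frame_index % 2

print(sfr_split(1200))                    # ((0, 600), (600, 1200))
print([afr_owner(i) for i in range(4)])   # [0, 1, 0, 1]
```

The other modes in the slides (SuperTiling, the combined AA modes) are variations on the same theme: a fixed rule assigning each pixel or frame to one of the two cards.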
Slide 9 – Yeah, the dongle is so much more flexible. This one is most definitely FUD by Nvidia, put out there to draw attention away from the fact that Nvidia can't dumb down a card. (Well, I know you can with certain programs, but that's not the point.)
Slide 10 – Well I believe that this slide is spot on.
Slide 11 and 12 – I don't know if this is confirmed or not, but if it is true, that is a very, very big limitation. People with large LCDs and high-end CRTs should probably shy away from Crossfire if they like playing at 1600x1200 or higher.
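If the limitation is real, it comes down to pixel-clock arithmetic: early coverage suggested the Crossfire compositing dongle used a single-link TMDS receiver, which tops out at a 165 MHz pixel clock. A rough check in plain Python (the ~25% blanking overhead is an assumption for illustration, not a real CVT/GTF timing calculation):

```python
# Back-of-the-envelope pixel-clock check against the 165 MHz
# single-link TMDS limit reportedly used by the CrossFire
# compositing dongle. The 25% blanking overhead is a rough
# assumption, not a real display-timing calculation.

SINGLE_LINK_MHZ = 165.0
BLANKING = 1.25   # assumed horizontal/vertical blanking overhead

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock needed for a display mode, in MHz."""
    return width * height * refresh_hz * BLANKING / 1e6

for mode in [(1600, 1200, 60), (1920, 1200, 60), (2048, 1536, 60)]:
    mhz = pixel_clock_mhz(*mode)
    verdict = "fits" if mhz <= SINGLE_LINK_MHZ else "exceeds single link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}: {mhz:.0f} MHz ({verdict})")
```

Under those assumptions, 1600x1200@60 squeezes under the limit while 1920x1200@60 and up do not, which would explain exactly the cutoff the slides harp on.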
Slide 13 and on – I don't particularly care much about a mobo's features as long as it is stable, offers good performance in its class, and can OC like no other. I am very sorry that their Gigabit NIC only lets them dl pr0n at the same speed as me, because you're not on a gigabit network and your cable modem only works at 5 Mbit/s.
This presentation was made to the press to look at when reviewing Crossfire parts as things to look for. It was never made to be presented to readers like this. I was in on one of these presentations and that is what they stated this part of the briefing was for.
I agree, readers need to remember this isn't targeted towards customers. Of course it's biased, it's from nVidia. Would you expect an unbiased response from ATi? It's the marketing people that are creating these slides, after all.
As for the mud-slinging, it's no different than the mud-slinging by fanboys, of which there are plenty here. Recalling old problems either company had when talking about new hardware (driver cheats, poor performance, etc.) is nothing new.
I agree with you in some ways. If you have a 7800 GTX, getting another one is not worth the money you pay for such a small performance increase. But with cheaper cards, like a 6600 GT for example, you can get close to the performance of a single high-end card that costs more than the two cheaper ones combined. It's great if you want a performance increase without buying a newer, more expensive card later on; you can just get the same card again, and it will be quite a bit cheaper. But in most cases it's just not worth it...
Actually, there are motherboards using 2 chipsets, thus achieving 2 PCIe 16x slots.
I knew that, but if I'm not mistaken there are no graphics cards that work in dual 16x (maybe Quadros). And using two chipsets on one mobo just to achieve that is not exactly a thing to brag about...
One area where 2 cards may be good is 3D rendering. I'm really curious to see what kind of impact 2 cards will have on rendering times compared to 1 card. For people who use 3ds Max or something similar, 2 cards may be well worth the money if it really cuts down on render times.
ATI ==> The best
NVidia ==> Wrong propaganda
Yes, it may be Nvidia propaganda, but Nvidia is right. Crossfire is going to sink without trace, and so, it seems, is the R520, or X1800 XT as it is now known. ATI is in serious trouble.
Crossfire was originally launched in June or July this year, and now ATI have had to do a relaunch and there is still no availability. There are many issues with Crossfire, such as the master cards, which dumb down the performance of your cards, and incredibly, Crossfire doesn't support 1920x1200 at 60 Hz or greater, which is just absurd.
Secondly, the X1800XT is beaten by Nvidia's 7800gtx in nearly all high resolution benchmarks and yet the x1800xt is going to cost $150 more at launch. Who, apart from ATI fan boys, is going to buy it?
Thirdly, to the idiot who said he thought ATi is doing well: maybe he should go back to school. ATI's share price has slumped by almost 50% this year, in comparison to Nvidia's, which has more than doubled. ATI is also being sued in multiple class-action suits, by its OWN shareholders, for misleading statements by its directors.
Conclusion - ATi has serious problems.
OH wow, because you know... x1800 benches are everywhere now that the nda has lifted and the card is out in stores.
It’s nice to see that nothing ever changes… ATI fan boys vs. nVidia fanatics, Intel vs. AMD, and so on; it never ends.
And if you wonder what camp I’m in: I’m in the “best performance for decent money” camp. And which brand is the better one right now? I don’t care! I can’t afford the premium ones anyway.
What makes you angry is that a document meant for journalists, made by Nvidia’s sales department to discredit ATI, leaked out. We all know that sales departments are fishy. The reason ATI isn’t suing them is that they of course do the very same thing. Everybody knows that, so why are you upset?
As a comparison, I’ve dealt with the sales department at the company I work for, regarding rolling out some advertising now and then. Being a technician, I know how hard it is to teach a salesperson what is good or bad about a product, especially compared to the competitors’. It really doesn’t matter how well you explain it; they won’t understand and don’t care. Some of the info folders turned out okay, but some… Man, I feel ashamed knowing that I’ve been involved in the process of making them.
My guess is that both nVidia’s and ATI’s tech teams regularly wonder when their sales departments sold their souls to the devil.
I agree, meant for the press.
Beating a dead horse is what came to mind when I first read this, but without question it was meant for another audience. I am currently an Nvidia owner, and actually a 3dfx owner prior (many moons ago), because I had several bad experiences with early ATI cards, the Xpert99 and such, with horrible driver support and performance. "HOWEVER", I have a few friends with 9800 Pros that made me drool back in the day, and that wiped Nvidia all over the place, à la the 5900 series. Really, I find it refreshing that these two companies can catch up to one another year after year, forcing each one to top the other, only benefiting the customer in the end. Now, if AMD and Intel were on equal marketing grounds, I bet you would find that 4 GHz might already be here. Marketing and competition have their obvious ugly side, but they cause roadblocks to open unexpectedly.
ATI ==> Top Dog sometimes
Nvidia ==>Top Dog sometimes
I could give a rat's hairy ballbag about branding. And I'm sure as hell not going to take the manufacturer's word for who has the best product.