Ok, so rating in amps is better than rating in watts?
That is most definitely wrong; amps are dependent on the voltage.
P = V × I for a DC circuit
Power (watts) = volts × amps
E.g., if you are on a 230V EU power supply, let's say your PC draws 500 watts from the mains:
I = P / V = 500 / 230 ≈ 2.2 amps
And let's say your chip draws 100 watts at 1V (just an example, remember):
I = P / V = 100 / 1 = 100 amps. See why you can't use amps to compare things, unless the voltages are the same, of course?
I (A) = P / V, so the higher the voltage on the chip, the lower the current for the same power, and the power draw looks smaller than it is if you only quote amps. Watts is the better way to go, or joules per second if you want to get really technical about it and keep it in SI.
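If it helps to see that arithmetic spelled out, here's a quick Python sketch of the same two examples (the 500W/230V and 100W/1V figures are just the illustrative numbers from above, not measurements):

def current_amps(power_watts: float, voltage_volts: float) -> float:
    # Current drawn by a DC load: I = P / V
    return power_watts / voltage_volts

print(current_amps(500, 230))  # PC on 230V mains -> ~2.17 A
print(current_amps(100, 1))    # 100W chip at 1V core -> 100.0 A

Similar ballpark of power, wildly different amp figures, which is the whole point.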
You also say the SB400/450 sucked; agreed, I'm using one now. The SB600 rocks though, also agreed, and you will only see it on boards from my favourite manufacturer (definitely a DFI fanboy here). So if ATI release another not-so-good chipset, I expect you'll have the same reaction you're currently having to NV graphics cards, yes?
As for the X1900 being faster: my 7900GT is stable at 620MHz GPU / 915MHz RAM, and that's without a voltmod, and it was £70 cheaper than an X1900XT when I bought it. I'd bet it would give an X1900 some competition, given the 450/650 defaults.
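For a rough sense of that headroom, here's the percentage worked out from the clocks quoted above, taking 450/650 as stock:

def overclock_pct(stock_mhz: float, oc_mhz: float) -> float:
    # Overclock headroom as a percentage over stock clocks
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(overclock_pct(450, 620), 1))  # core: 37.8% over stock
print(round(overclock_pct(650, 915), 1))  # memory: 40.8% over stock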
As for the "slight" difference in power: the X1900XT draws about 80-90% more than the 7900GT does, which is more than "slightly lower power".
http://www.vr-zone.com/?i=3335&s=8 for 7900GT and X1900XTX power consumption
http://www.xbitlabs.com/articles/video/display/gpu-consumption2006_4.html for the X1900XT
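And so the "80-90% more" claim is easy to sanity-check against those reviews, here's the percentage calculation itself; the 50W and 92W inputs below are made-up placeholder figures, the measured numbers are in the links:

def percent_more(base_watts: float, other_watts: float) -> float:
    # How much more power 'other' draws than 'base', as a percentage
    return (other_watts - base_watts) / base_watts * 100

print(round(percent_more(50, 92), 1))  # -> 84.0, i.e. ~84% more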
I am not saying the X1900 series of cards is bad, most certainly not, but the NVIDIA range has its redeeming features, and the 7900GT has unbelievable overclocking potential, especially if you are prepared to do a bit of voltmodding.