Well, you see, that's the rub: you're misunderstanding something.
DX10 is DX10; it wasn't an afterthought. DX10.1 is DX10 with some added features, and though it does in fact add performance in games that can use the extra code, there isn't much difference at all. And I'm not sure it was an afterthought either: if I remember correctly, when DX10.1 started being put into ATI cards, the 9 series was already planned out, so why pull 8 million units and junk them? Though I do agree that 10.1 is better and more future-proof, it isn't as much of a "botch" as you may think... unlike some of Nvidia's or ATI's older cards that kinda-sorta supported DX9 but only with SM 2.0, etc.
I didn't say it was botched. What I said was: if you want future-proofing, go for the 3870; if you want speed in today's games with no DX10.1 support, go for Nvidia's current cards.
And yes, Nvidia's current cards are primarily for DX9 gaming, hence they didn't add 10.1 support when they went from G80 to G92. They could have; they did fix PureVideo support, so it's far better than it was.
I'm not attacking Nvidia; I'm just very sure about what Nvidia has done in the past and what they're doing today. They don't jump to support new tech unless it can somehow be done by simply modifying something they already have around.
This is what I found with the 8800GT I had (it's being RMA'd; I should have a replacement sometime this week. I hate UPS... slow bastages). The card is effectively a massively boosted 7-series card; they basically put the design on 'roids. Sure, they added so-called unified shaders and Shader Model 4.0, but it is not a native DX10 design. I think Nvidia decided that if they stuck with the current 8800 design until DX10/10.1 games were dominant, they could force people to buy yet another card to get the features DX10.x offers beyond basic 10.0 support.
It's a smart business move in a way, but it also pisses off customers who paid a lot for their cards and expected them to last a couple/few years.
From what I've seen over the years, Nvidia beats dead horses until they're glue. The FX line: they kept selling them and putting out new versions even after it was shown that they sucked for DX9 gaming. The same was true for the 6 and 7 series; they just tweaked the design and added more brute force, whereas the R300 and up weren't about brute force. The X1900XT, for example, was kicking the 7950 around until the GX2 came out, and even then it wasn't far behind. Yet on paper it should have been losing, because the 7950 had more pipes/ROPs and such. But games were moving toward being shader-based, so 16 ROPs with 3 shader units each was better than more ROPs with fewer shader units each.
Blah. Basically, the two companies work differently. ATI tends to think along different lines; they don't try to purely brute-force everything, whereas Nvidia is all about brute force. Hell, look at how they pressured Ubi into patching DX10.1 out of Assassin's Creed after it showed ATI cards doing better, because they could do the AA in one pass instead of needing a second rendering pass to apply it. Again, brute-force bullies. I know a lot of people who bought that game who didn't/won't patch it because THERE IS NO NEED: the patch doesn't fix bugs, it just made Nvidia stop bitching because their cards CAN'T do 10.1 whereas ATI's can.
Also take note of tessellation: the Catalyst 8.5 drivers enable it on ATI cards, allowing the same thing on PCs as is available on the Xbox 360, i.e. more detail without much, if any, performance impact. I see that as total winnage.
Just look it up, or DarkNova and others can explain it better than I can at the moment.