
NVIDIA has Five New Cards in the Works

IMO.....
IMO...
...but IMO :)

LOL.... hey - how can we state anyone else's opinion, unless we quote them citing THEIR opinions (because we surely are NOT them), you know?

:)

(So... what's next? Spelling &/or Grammar checks from you?? lol...)

* Ah... anyhow, I am just nitpicking, like you are!

APK

P.S.=> Opinions: EVERYBODY'S GOT ONE (&, they ARE, uniquely their own)... apk
 
I'm not nitpicking, I'm just reading between the IMOs.

LOL, good turn of phrase... IMO!

:)

* ANYHOW - I think this is good news that NVidia's cooking up some new boards... it'll drive down the cost of the TOP-END 8800 stuff @ least!

(Might actually give me a reason to upgrade to one! Right now? They want TOO much... way too much!)

APK

P.S.=> My last round of buying (see my signature below for specs) cost me a lot, so I hold off to once every 4-5 yrs., but, when I do buy? Well, I am one of those folks that Ketxxx is 'ribbing on'... I go ALL OUT, every 1/2 decade or so, & usually end up w/ systems that are TWICE as powerful as the last one...

Upgrading's rare, but vidcards CAN 'get me' to do it, IF the price/performance mix-ratio is good... apk
 
I'm stupid, I'll bite!! AND I will probably get 2 (or maybe 4), but not nVidia. :toast:

Fanboy! So if you are getting ATi, you're gonna rent a power station to rev 4 R600s then? :laugh:
 
I wonder if we can mod our old 8800s to unlock the extra pipes. Anybody know how NVIDIA disabled them? I've got an 8800GTS and was hoping it could be modded by pencil, or through a BIOS flash.
 
I'm just disappointed that the two focus aspects are shaders and MHz...

700/2000 isn't that impressive. For all we know, it's the same bloody card (with a few updates), just overclocked.
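(To put those clocks in perspective, here's a rough back-of-the-envelope sketch. The 8800 GTX ships at 575 MHz core with 1800 MHz effective memory on a 384-bit bus; carrying the 384-bit bus over to the rumoured card is my assumption, not anything nVidia has said.)

```python
# Rough comparison of the rumoured 700/2000 clocks against the shipping
# 8800 GTX (575 MHz core, 1800 MHz effective memory, 384-bit bus).
# The new card keeping a 384-bit bus is an assumption.

def mem_bandwidth_gbps(effective_mhz: float, bus_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(f"8800 GTX: {mem_bandwidth_gbps(1800, 384):.1f} GB/s")   # ~86.4
print(f"Rumoured: {mem_bandwidth_gbps(2000, 384):.1f} GB/s")   # ~96.0
print(f"Core clock uplift: {700 / 575 - 1:.0%}")               # ~22%
```

On those assumptions, it really is only a ~10% bandwidth and ~22% core clock bump, which supports the "same card, overclocked" reading.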

The GX2 sounds neat, but I can see that being far more powerful than any system will need for gaming for a while. I trust Nvidia's technology prowess, yet with so little need for dual-core, Vista, and DX10 at this time, I would definitely wait until the beginning of next year to get into the 8900 series.

8800 GTX and GTS are still not the bottlenecks.

As for prices, the Americans get the best deals (probably due to active competitors); some GTXs are down in the $400-450 range, while the best we can find in Britain is about $420 (USD) for a GTS :/ But that isn't saying too much; on identical systems my GTX and GTS show no difference in 'real world' terms (and the GTX is unfortunately nearly an inch longer, which is a SOB if you have front-panel USB/firewire installed, as the connector cables run right across where the PCI-E slot resides; no USB/firewire for me! :P). Small price to pay.

I wouldn't be surprised if Nvidia stands firm on their 8800 prices, especially with the 320 MB version out, until they're about to release the 8900 series. They've presented us with the first DX10-generation card, which will have more 'real world' public testing/feedback/support time than the R600s. If they back down, they'll only lose trust with their customers.
 
Totally agree; you pay for technology, and that isn't cheap... that's why a Honda is a Honda and a Porsche is a Porsche.

So your point is, they are both overrated and overpriced for their respective markets. Thanks to what appears to be price fixing between the two biggest video card makers in the world, we have two video cards in each category, with roughly the same specifications, to choose from that cost exactly the same amount... oh wait, they at least changed the names to confuse the die-hards :nutkick:. They finally figured it out: it's easier to take 50% of the market share each than to try to bankrupt one another. Thank God for capitalism.
 
Getting back on track: good, bad, or indifferent, I just see it as more choice, and that's got to be good. That, at the end of the day, is what makes competition and eventually leads to lower prices; whether I (or you) choose to buy one of them is a different and personal matter.

Personally, and yes..... IMO! It seems like an excellent move by NVidia. If the information is accurate that they shipped the current 8800 series cards with 25% of their shaders disabled, it means that theoretically they can compete with the R600 with something already engineered and manufactured, allowing them to already be some way towards working on the next generation... the 9XXX series or whatever they will call it, keeping them ahead of the game.

Whether or not the cards will actually be better than the opposition remains to be seen, but personally I think it's a good time to be a spectator!
 
-Russia Weekly

"Too few people have enough cash to buy computers and pay
connection charges. And there is still not enough investment in the basic
communications infrastructure, particularly outside the major cities.

In smaller towns and rural areas, where a high proportion of Russians live,
would-be surfers would confront poor-quality telephone networks and a lack of
Internet service providers. Although the potential for access is growing with
several major companies, such as Moscow-based Relcom and Demos, installing
servers in more and more cities and towns across Russia."


Aye, thank God for communism!


A laughable offense, surely... people complaining about the price of products that aren't anywhere near being released, and those poor folks probably don't even know what "GPU" means...


Tatty is absolutely correct; all Nvidia has to do is stand behind their product now.
The R600 will not impress over the 8800, and with the way ATi's cards are designed, if they don't make some serious architectural changes, I don't think they'll outperform the current GTX.

Too many cards boast these razzle-dazzle capabilities, which end up being the same architecture, improved in stability or performance by a measly 10-20%, with a new technology description slapped on.

Nvidia is set to stay in the lead at this point, and rightfully so; they brought the new tech first and thus deserve their comfort zone.
 
I can't jump on a person from Russia.

In Russia, the GPU renders you!!

ATi is still ahead of nVidia... they jumped the gun yet again with DX10 and SM4.0... guess what, they can't do it!!!!!!!

They were the first with SM 3.0 support; guess what, they couldn't do it!!!!!!!!


blah


blah

blah
 
• 8900 GS – this should have a core clocked at 550MHz with 96 shaders and 256-bit memory at 1600MHz, in sizes of either 256MB or 512MB. This should cost about $250.
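(For what it's worth, the memory spec in that bullet works out to a theoretical bandwidth of about 51 GB/s, assuming the 1600MHz figure is the effective double-pumped rate, as rumoured spec sheets usually quote it. A quick sketch:)

```python
# Theoretical memory bandwidth implied by the listed 8900 GS spec:
# a 256-bit bus at 1600 MHz. Assumes 1600 MHz is the effective (DDR) rate.
bus_bits = 256
effective_mhz = 1600
print(effective_mhz * 1e6 * (bus_bits / 8) / 1e9, "GB/s")  # 51.2 GB/s
```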

There goes my paycheck :).
 
good info, thanks!

that said ... i'll wait quite some time before upgrading since my vmodded 7900GT is just so sickeningly wonderful! and besides ... i paid over $320.00 for it just 5 short months ago!

:)

it's so *easy* to get sucked into upgrade-itis!

rule of thumb: research, study the options, make a well-judged investment ... and if it's working? keep it for at least a year.

then ... ???

like shampoo ... repeat as necessary.

:)
 
OMG... this is FUD; stop posting FUD on these forums!!! Has Nvidia even said that they were going to have a GS series? NO.

An 8900GS would not exist, because they are going with a new naming scheme. I would expect something like an 8800GTS to be cheaper than the 8900GS, so this is total BS.
 
Fanboy! So if you are getting ATi, you're gonna rent a power station to rev 4 R600s then? :laugh:

I've decided that I'm going to need two cases for my next build. The second case will hold the PSUs and the water cooling: pumps, radiator, and tank. Haven't figured out the umbilical yet; would be neat to go WIRELESS!! :roll:
 
The way I see this, nVidia's ripped off every 8800 owner who's gotten an 8800 to date: they disabled 25% of the shaders just to give themselves a next offering (8900). I would be f***ing pissed if I had just spent $600 on an 8800GTX and then they brought out my same card with 25% more shaders enabled in the same price range!!

That, topped with the fact that they still don't have truly good/reliable drivers for the G80 :)
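(A small nitpick on those percentages, sketched below: if 25% of the shaders on a 128-shader die are disabled, the shipping parts expose 96, so re-enabling the rest would actually be a ~33% increase over what 8800 owners have now. The 128-shader total is the rumour circulating in this thread, not a confirmed spec.)

```python
# Shader math behind the "25% disabled" rumour. The 128-shader total
# is the figure rumoured in this thread, not a confirmed spec.
total = 128
enabled = int(total * (1 - 0.25))   # 96 shaders shipping today
uplift = total / enabled - 1        # relative gain if fully unlocked
print(f"{enabled} enabled now; all {total} would be a {uplift:.0%} gain")  # 33%
```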
 
So your point is, they are both overrated and overpriced for their respective markets. Thanks to what appears to be price fixing between the two biggest video card makers in the world, we have two video cards in each category, with roughly the same specifications, to choose from that cost exactly the same amount... oh wait, they at least changed the names to confuse the die-hards :nutkick:. They finally figured it out: it's easier to take 50% of the market share each than to try to bankrupt one another. Thank God for capitalism.

i totally agree with u , i see the error in my analogy. =)
 
The way I see this, nVidia's ripped off every 8800 owner who's gotten an 8800 to date: they disabled 25% of the shaders just to give themselves a next offering (8900). I would be f***ing pissed if I had just spent $600 on an 8800GTX and then they brought out my same card with 25% more shaders enabled in the same price range!!

That, topped with the fact that they still don't have truly good/reliable drivers for the G80 :)

They have not done anything that they or ATi have not done in the past. An X800 or 6800 series card, for example, that could be "unlocked" or flashed to enable an extra 4 hardware-locked pipelines was considered an "enthusiast card". Now, I do actually agree with what you have said, but I'm just making the point that something similar (if not the same) has happened before; the only REAL difference is maybe that people are less happy about it now because they may not be able to do anything about the unlocking, like they could in some circumstances with the X800s and 6800s.

I have tried to explain myself in a little more detail on the other R600 thread, which is here; my post is near the bottom of the page:

http://forums.techpowerup.com/showthread.php?t=25339&page=3
 
Am I the only one that sees that this is a die shrink as well, and maybe not just a simple unlock? Perhaps the G80 doesn't actually have 128 shaders with only 96 unlocked; perhaps the extra shaders were added to the new 80nm core.

As for NVidia ripping people off: the only people that ripped the 8800 series owners off are the 8800 series owners themselves. If you buy the bleeding-edge, top-of-the-line video cards, you are almost guaranteed that something better will come out in a month or two.
 
newtekie1, I agree to a point, but you gotta look at it this way: neither nVidia nor AMD has EVER put out a core with parts intentionally disabled just so they could put out the same core later with those parts enabled in order to sell it as a new product. That's INSANE and EVIL.

It's disabling part of a chip simply so you can later sell the same chip as a new product... not cool IMHO.
 
newtekie1, I agree to a point, but you gotta look at it this way: neither nVidia nor AMD has EVER put out a core with parts intentionally disabled just so they could put out the same core later with those parts enabled in order to sell it as a new product. That's INSANE and EVIL.

It's disabling part of a chip simply so you can later sell the same chip as a new product... not cool IMHO.

Who says that that is actually what is being done here? That is a mighty big jump to make, especially given that a die shrink is also happening. It is entirely possible that nVidia just couldn't fit 128 shaders on a 90nm die and still keep it reasonable, so they designed them with 96 just to get to market, with plans to later do a die shrink and add more shaders.

The truth will come when specs on the number of transistors in each core come out. If they are the same, then we know the 8800 cores really did have the shaders there, but disabled; if the new cores have more transistors, then we know that the shaders were added after the G80 was released.
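(A toy sketch of that test, for what it's worth: 681 million is nVidia's published transistor count for the G80; the 80nm core's count is unknown until specs appear, so it's just a placeholder here.)

```python
# Toy version of the transistor-count test described above. 681M is
# nVidia's published G80 figure; the 80nm core's count is a placeholder
# until real specs are published.
G80_TRANSISTORS = 681_000_000
new_core_transistors = None  # unknown; fill in when specs come out

if new_core_transistors is None:
    print("No published count yet; the question stays open.")
elif new_core_transistors <= G80_TRANSISTORS:
    print("Shaders were likely on the G80 die all along, just disabled.")
else:
    print("Extra shaders were likely added to the new 80nm core.")
```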
 
nVidia doesn't count transistors the same way as other companies; they only count "active" transistors in many cases. Thus a 7800GS and GT have different transistor counts but are the same chip; the GS just has laser-cut disabled pipes.
 
nVidia doesn't count transistors the same way as other companies; they only count "active" transistors in many cases. Thus a 7800GS and GT have different transistor counts but are the same chip; the GS just has laser-cut disabled pipes.

*Pulls out specs for the cores of the 7900GTX and 7900GS*
*Notices both list 278 million transistors*

You sure about that? Because I have the specs sitting right in front of me for those two cards, and they seem to show quite the opposite.

Another interesting thing I noticed is that when they made the die shrink from the G70 to the G71, the transistor count actually went down (from 302M to 278M), most likely from them combining transistors that could be combined or eliminating unneeded ones. So the new 80nm cores could very well have physically more shaders and still have a similar transistor count.
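(Using the figures from that post, the shrink actually shed roughly 8% of the transistors, which is exactly why equal counts alone wouldn't settle the question:)

```python
# G70 -> G71 die shrink, transistor counts from the post above.
g70, g71 = 302e6, 278e6
print(f"G71 shed {1 - g71 / g70:.0%} of the G70's transistors")  # ~8%
```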
 