Today NVIDIA announced their new GeForce 8800 Series based on their all-new G80 GPU.
Initially two different products will be available, the GeForce 8800 GTX and the GeForce 8800 GTS.
Specifications

Here are all the specs at a glance.
Let's go through the specs list quickly. Take a look at the transistor count: at 681 million it is almost 2.5 times that of the 7900 GTX. This is a huge leap ahead, and I hope the performance potential justifies the added cost.
|                  | 7900 GTX | X1950 XTX | 8800 GTS | 8800 GTX |
|------------------|----------|-----------|----------|----------|
| Transistors      | 278 M    | 384 M     | 681 M    | 681 M    |
| GPU Process Size | 90 nm    | 90 nm     | 90 nm    | 90 nm    |
| Memory Interface | 256 bit  | 256 bit   | 320 bit  | 384 bit  |
| Memory Size      | 512 MB   | 512 MB    | 640 MB   | 768 MB   |
| Core Clock       | 650 MHz  | 650 MHz   | 500 MHz  | 575 MHz  |
| Memory Clock     | 800 MHz  | 1000 MHz  | 800 MHz  | 900 MHz  |
| Shader Clock     | -        | -         | 1200 MHz | 1350 MHz |
The first picture is of the GPU core, the second picture shows the G80 die placement on a 300 mm wafer at TSMC.
The next point to note is that the process size is still 90 nm. Even though TSMC is working on smaller processes, the yields are not yet where they should be. Since the G80 die is a lot bigger than any other GPU, the chances of a defect landing on it are much higher. The reason is easy to see with an example: imagine a wafer with 5 defects spread randomly over its whole area. If you can fit 100 GPUs on the wafer, in the worst case you lose 5 GPUs - a 5% defect rate is not that bad. But if only 10 larger GPUs fit on the wafer, you still lose 5 GPUs - suddenly your failure rate is 50%, which would probably cripple your business. (These numbers are made up and bear no relation to TSMC's process.)
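The yield example above can be sketched in a few lines of Python. This is purely illustrative and uses the same made-up numbers as the article, not real TSMC data:

```python
# Toy yield model: the same 5 random defects on a wafer hurt a big die
# far more than a small one. Numbers are illustrative only.

def worst_case_defect_rate(dies_per_wafer: int, defects: int) -> float:
    """Worst case: every defect lands on a different die."""
    lost = min(defects, dies_per_wafer)
    return lost / dies_per_wafer

# 100 small GPUs per wafer, 5 defects -> 5% of dies lost
print(worst_case_defect_rate(100, 5))  # 0.05
# Only 10 big GPUs fit, same 5 defects -> 50% lost
print(worst_case_defect_rate(10, 5))   # 0.5
```

The defect count stays fixed per wafer, so shrinking the die (or the process) is the only way to bring the loss rate back down.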
Moving further down the list we see Shader Model 4.0. This is an industry first: the GeForce 8800 Series cards are the first to natively support Microsoft's new Shader Model 4.0, which will debut with DirectX 10 in the upcoming Windows Vista operating system.
Next in the list is the number of shaders, vertex pipes and ROPs. Since NVIDIA uses a unified architecture on the G80, there is no difference between pixel and vertex shaders anymore. The following image illustrates this very well.
If you look at the eight blocks labeled "L1", each contains 16 small green squares. Those are the stream processors (8 x 16 = 128 in total); depending on what is needed at any point during rendering, each of them can do vertex, pixel or geometry processing. This gives developers much greater flexibility: they no longer have to worry whether their game bottlenecks on vertex or pixel processing, and can just do whatever is needed to give you the best gaming experience.
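The benefit of the unified design can be sketched with a toy throughput model. All numbers here are hypothetical and only serve to show why a fixed vertex/pixel split can bottleneck while a unified pool does not:

```python
# Toy model (hypothetical numbers) of fixed-function vs. unified shaders.

def fixed_split_throughput(vertex_work, pixel_work, vertex_units, pixel_units):
    # Each fixed pool only processes its own workload, so the slower
    # side determines how long the frame takes.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_throughput(vertex_work, pixel_work, total_units):
    # Unified processors can all be assigned wherever the load is.
    return (vertex_work + pixel_work) / total_units

# Vertex-heavy scene: 90 units of vertex work, 10 of pixel work.
# Fixed design: 8 vertex + 24 pixel units; unified design: 32 units total.
print(fixed_split_throughput(90, 10, 8, 24))  # 11.25 - the vertex pool stalls the frame
print(unified_throughput(90, 10, 32))         # 3.125 - load spreads over all 32 processors
```

With the same total hardware, the unified pool finishes the skewed workload far sooner because no processors sit idle waiting for the other stage.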
The blue squares by the way are the texture units which are optimized for HDR rendering and support a number of new anti-aliasing modes.
Next in the specs list is the memory interface; the numbers 320 and 384 are not what you would expect when thinking "memory interface" - the numbers usually go 64, 128, 256, 512. NVIDIA chose this approach because a full 512-bit memory interface would make the PCB unnecessarily complex, while 256 bit might not be enough for a top-notch product. You always have to consider that avoiding bottlenecks is very important for a balanced card design. If you multiply the number of ROPs by 16 you get the memory bus width (24 x 16 = 384 on the GTX, 20 x 16 = 320 on the GTS).
The memory size is directly related to the bus width. Since the memory chips used are 16Mx32 parts with 32 data lines going into each chip, you need 384 / 32 = 12 memory chips to connect the whole 384-bit bus. Twelve memory chips at 64 MB each make 768 MB (the 64 MB per chip is calculated as 16 Mbit x 32 / 8 bits per byte = 64 MB). The GeForce 8800 GTS has only ten memory chips because its bus is not as wide; basically, NVIDIA turns off two of the eight blocks from the image above. Depending on how they are locked, some softmods may be coming up.
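The whole chain of memory arithmetic from the last two paragraphs can be recomputed in a short script. The constants follow the article's own figures (16 bits of bus width per ROP, 16Mx32 memory chips):

```python
# Recomputing the article's memory math: bus width from ROP count,
# chip count from per-chip data width, and total memory size.

ROP_BUS_BITS = 16      # each ROP contributes 16 bits of bus width (per the article)
CHIP_BUS_BITS = 32     # 16Mx32 chips: 32 data lines each
CHIP_MBIT = 16 * 32    # 16M addresses x 32 bits = 512 Mbit per chip

def card_memory(rops: int):
    bus = rops * ROP_BUS_BITS            # memory bus width in bits
    chips = bus // CHIP_BUS_BITS         # chips needed to fill the bus
    size_mb = chips * (CHIP_MBIT // 8)   # Mbit -> MB (8 bits per byte)
    return bus, chips, size_mb

print(card_memory(24))  # 8800 GTX: (384, 12, 768)
print(card_memory(20))  # 8800 GTS: (320, 10, 640)
```

Both results match the spec table above: a 384-bit bus with 12 chips and 768 MB on the GTX, and a 320-bit bus with 10 chips and 640 MB on the GTS.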
The rest of the table is pretty straightforward. For now the GeForce 8800 series will only be offered for the PCI-Express bus, but if the market demands it, I am sure NVIDIA will figure out a way to get the cards running on AGP.
I think in the days right after the launch, when supply is limited and demand is high, you will see much higher prices than suggested here, because merchants will try to make some extra money.