Saturday, February 7th 2009

NVIDIA Designs New GTX 260 PCB, Further Reduces Manufacturing Costs

The introduction of the new G200b series graphics processors sought to revive NVIDIA's stronghold over the high-end graphics market by reducing manufacturing costs and enabling high-end graphics cards at unusually low price points, to compete with rival ATI. The first SKU using the G200b GPU was the new GeForce GTX 260. The PCB design of the new model (P654) saw several drastic changes that also contributed to the cost-cutting: all memory chips were placed on the business end of the PCB, and the VRM area was rearranged. News emerging from Expreview suggests that NVIDIA has worked out an even newer PCB reference design (model: P897) that aims mainly to cut production costs further. The reference-design graphics board based on this PCB will carry the internal name "D10U-20". A short list of changes follows:
  • The number of PCB layers has been reduced from 10 to 8, perhaps by compressing or removing blank, redundant, or rudimentary connections
  • A 4+2 phase NVVDD power design using the ADP4100 voltage-regulator IC; the FBVDDQ circuit has been reduced from 2 phases to 1, and the MOSFET package has been changed from LFPAK to DPAK to reduce costs. The ADP4100 lacks an I2C interface, which means software voltage control will be much more difficult than on current PCBs of the GeForce GTX 260, 280, 285, and 295
  • The optional G200b support-brace has been removed
  • While the length of the PCB remains the same, the height has been reduced to cut costs
  • BIOS EEPROM capacity reduced from 1 Mbit (128 KB) to 512 Kbit (64 KB)
  • Cheaper DVI connectors
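The EEPROM downgrade in the list above is plain bit-to-byte arithmetic; a minimal sketch (the helper name is ours, not NVIDIA's):

```python
def eeprom_kb(capacity_kbit: int) -> int:
    """Convert an EEPROM capacity in kilobits to kilobytes (8 bits per byte)."""
    return capacity_kbit // 8

# 1 Mbit = 1024 Kbit -> 128 KB; the new 512 Kbit part holds only 64 KB,
# so the BIOS image must fit in half the previous space.
print(eeprom_kb(1024))  # 128
print(eeprom_kb(512))   # 64
```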


The new PCB is expected to reduce costs by as much as US $15, which will lower the overall product cost and help step up competitiveness. Expreview notes that the new PCB will be available to partners by the third week of this month. Below are the drawing and picture of the PCB. For reference, the second picture is of the older P654 design.


Source: Expreview

78 Comments on NVIDIA Designs New GTX 260 PCB, Further Reduces Manufacturing Costs

#1
Hayder_Master
The GTX 260 is the best choice for NVIDIA, and if it becomes cheaper it will be a standard choice like the 8800 GT/GTS before it.
#2
Mussels
Moderprator
hayder.master said:
The GTX 260 is the best choice for NVIDIA, and if it becomes cheaper it will be a standard choice like the 8800 GT/GTS before it.
http://store.steampowered.com/hwsurvey/

see video card description.

almost 12% of Steam users (which is a hell of a lot of people) have 8800 series cards. It's the most popular card overall, and almost 23% of people running DX10 hardware are doing so on an 8800 series card.
#3
DarkMatter
Mussels said:
http://store.steampowered.com/hwsurvey/

see video card description.

almost 12% of Steam users (which is a hell of a lot of people) have 8800 series cards. It's the most popular card overall, and almost 23% of people running DX10 hardware are doing so on an 8800 series card.
Wow! Successful chip, this G92. If you add the 9800 results it's 15.6% and almost 30% respectively, which is very impressive indeed. 1 out of 3 DX10 cards is a G92. :rockout:
#4
iamverysmart
btarunr said:


^Length.
I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card, shrinking it 1.5 cm does nothing except probably make the card unbalanced.

I'm pretty sure Expreview just made a mistake during the translation.
#5
Mussels
Moderprator
iamverysmart said:
I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card, shrinking it 1.5 cm does nothing except probably make the card unbalanced.

I'm pretty sure Expreview just made a mistake during the translation.
Making it smaller in any dimension makes it cheaper to produce. That's all there is to it.
#6
DaedalusHelios
iamverysmart said:
I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card, shrinking it 1.5 cm does nothing except probably make the card unbalanced.

I'm pretty sure Expreview just made a mistake during the translation.
Makes the card unbalanced??? Are you going to use it as a graphics card or a door stop?
#7
btarunr
Editor & Senior Moderator
iamverysmart said:
I don't understand why the hell NVIDIA would reduce the height of the card. Unless they are after a low-profile card, shrinking it 1.5 cm does nothing except probably make the card unbalanced.

I'm pretty sure Expreview just made a mistake during the translation.
Both the Expreview EN and CN websites stated the figure at 1.5 cm (~0.6"), which is too much height to lose, going by what the pictures show. It can't be a typo for 1.5 mm either, as that would be too little height to lose. They basically shed the extra height the first iteration of the 55 nm GTX 260 PCB had near the VRM area. So the figure may not be exactly 1.5 cm, but it is a significant amount nonetheless.
#8
Mussels
Moderprator
it could be 1.5"

that'd be around 3-4 cm
#9
Valdez
DarkMatter said:
Wow! Successful chip this G92. If you add 9800 results it's 15.6% and almost 30% respectively, which is very impressive indeed. 1 out 3 DX10 cards are G92. :rockout:
I don't think it's a good thing. G92 is not a good chip for dx10.
#10
DaedalusHelios
Valdez said:
I don't think it's a good thing. G92 is not a good chip for dx10.
LOL. The G92 is as good as or better than the 3870 you have. Unless you don't think it's good for DX10 either.
#11
Laurijan
I hope some manufacturers stay with the old design to allow customers better OC abilities. The 1-phase design probably makes OCing more difficult.
#12
Mussels
Moderprator
Valdez said:
I don't think it's a good thing. G92 is not a good chip for dx10.
I certainly have no problems with my G92 card in DX10. It had a lot fewer issues with DX10 compatibility when it launched than the ATI 3xx0 series cards did.

At least my card does AA properly.
#13
Valdez
DaedalusHelios said:
LOL The G92 is as good or better than the 3870 you have. Unless you don't think its good for DX 10 either.
The R600 is better for DX10. But we will see when the first DX10 app comes out.
#14
Valdez
Mussels said:
I certainly have no problems with my G92 card in DX10. It had a lot fewer issues with DX10 compatibility when it launched than the ATI 3xx0 series cards did.

At least my card does AA properly.
There are no native DX10 games on the market as far as I know.
#15
Laurijan
Valdez said:
There are no native DX10 games on the market as far as I know.
What about Crysis?
#16
Valdez
Laurijan said:
Whats with Crysis?
:laugh:
#17
DaedalusHelios
You get extra effects with DirectX 10 enabled in most modern games now.

It's a good feature set to support well.

Unless that's removed from the Hungarian versions for some reason. :)
#18
btarunr
Editor & Senior Moderator
Return to topic.
#19
Nitro-Max
This is not good. OK, they might be able to offer them to us cheaper, which is good, but who wants cheaply made hardware? I'd rather pay that bit more for quality, tbh.

A thinner PCB means more chance of flexing and damage.
#20
Bjorn_Of_Iceland
You get extra effects with directX 10 effects enabled in most modern games now.
DX10 is a joke. It basically just wrapped the existing DX9 API and added a few non-groundbreaking things. Nothing compared to what DX8 vs. DX9 showed. Clearly it was just a filler API built to lure gamers onto Vista. It has been about 3 years already and performance is still a joke. DX9 had been out for just a few months and we saw a leap.


In any case, Galaxy uses this new design on the first-gen GTX 260.. really cuts the price down.
#21
CrAsHnBuRnXp
iamverysmart said:
Why not just call it length.
I'm not even going to say it...
#22
Mussels
Moderprator
Let's end discussions of DX9 vs. DX10 and keep it about these cards... anyone seen retail prices for these yet?
#23
DarkMatter
Valdez said:
I don't think it's a good thing. G92 is not a good chip for dx10.
I would like to see proof of that. :laugh:
If there are no DX10 apps, how do you know whether it's good or not? :roll:

Valdez said:
r600 is better for dx10. But we will see when the first dx10 app comes out.
3DMark Vantage IS DX10; Unigine, Far Cry 2, Crysis Warhead (let's say Crysis was not, this one IS) and many other games and apps are DX10. The G92 does just fine in those apps, much better than the R600/RV670 in most cases, so nice try.
#24
Kursah
Valdez said:
There are no native DX10 games on the market as far as I know.
There aren't enough DX10 users out there to justify the risk of a DX10-only game at this point in time; it would be a bad move for profits.

But with cards as powerful as the 4870s and GTX 260s dropping in price, these could very well end up being next-gen mid-range cards that make DX10+ performance better in games and maybe start to justify the cost of DX10-only games. Sure, I would like to see it, but I also like the fact that you can do DX9 too if you don't have the hardware or performance capabilities for 10. And if the AMD/ATI and NV units I mentioned earlier were to become a mid-range GTS 350 and HD 5430 or whatever have you, at an easier price point, the bang for the buck would be there. Pretty much a pipe dream at this point, but the prices these cards are at now for this generation are pretty sweet. I'm curious to see how this card does; I have a feeling it will do quite well overall. There will be some who don't care that it has fewer power phases or can't support voltage changes via software, when it flat out runs like a champ. The 260 is no slouch at stock.

:toast:
#25
spearman914
STALKER Clear Sky is the best DX10 game I've seen so far.