Saturday, February 7th 2009

NVIDIA Designs New GTX 260 PCB, Further Reduces Manufacturing Costs

The introduction of the new G200b series graphics processors sought to revive NVIDIA's stronghold over the high-end graphics market by reducing manufacturing costs and facilitating high-end graphics cards at unusually low price-points to compete with rival ATI. The first SKU using the G200b GPU was the new GeForce GTX 260. The PCB design of the new model (P654) saw several drastic changes that also contributed to the cost-cutting: all memory chips were placed on the business end of the PCB, and the VRM area was rearranged. News emerging from Expreview suggests that NVIDIA has worked out an even newer PCB reference design (model: P897) that aims mainly to cut production costs further. The reference-design graphics board based on this PCB will carry the internal name "D10U-20". A short list of changes follows:
  • The number of PCB layers has been reduced from 10 to 8, perhaps by compressing or removing blank, redundant, or rudimentary connections
  • A 4+2 phase NVVDD power design using the ADP4100 voltage-regulator IC; the FBVDDQ circuit has been reduced from 2 phases to 1, and the MOSFET package has been changed from LFPAK to DPAK to reduce costs. The ADP4100 lacks an I2C interface, which means voltage control will be much more difficult than on current PCBs of the GeForce GTX 260, 280, 285 and 295
  • The optional G200b support-brace has been removed
  • While the length of the PCB remains the same, the height has been reduced to cut costs
  • BIOS EEPROM capacity reduced from 1 Mbit (128 KB) to 512 Kbit (64 KB)
  • Cheaper DVI connectors
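As a side note, the EEPROM capacities in the list above are rated in bits, while the parenthesized sizes are in bytes. A minimal Python sketch of that conversion (the function name is illustrative, not from any vendor tool):

```python
def bits_to_kb(capacity_bits: int) -> int:
    """Convert a storage capacity in bits to kilobytes (1 KB = 1024 bytes)."""
    return capacity_bits // (8 * 1024)  # 8 bits per byte, 1024 bytes per KB

# EEPROM sizes quoted for the old and new reference PCBs
old_kb = bits_to_kb(1 * 1024 * 1024)  # 1 Mbit
new_kb = bits_to_kb(512 * 1024)       # 512 Kbit

print(old_kb, new_kb)  # 128 64
```

This confirms the figures in the bullet: halving the chip from 1 Mbit to 512 Kbit halves the usable BIOS storage from 128 KB to 64 KB.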


The new PCB is expected to reduce costs by as much as US $15, which will lower the overall product cost and help step up competitiveness. Expreview notes that the new PCB will be available to partners by the third week of this month. Below are the drawing and picture of the PCB. For reference, the second picture is of the older P654 design.

Source: Expreview

78 Comments on NVIDIA Designs New GTX 260 PCB, Further Reduces Manufacturing Costs

#1
soryuuha
imho, PCB layer count is important for RAM overclocking; does the same apply to graphics cards?

correct me if im wrong :respect:
#2
DrPepper
The Doctor is in the house
I think this is a good move on nvidia's part. I doubt many people would even have noticed if it hadn't come up in the news.

As for them removing unnecessary parts, I think if they were necessary they would have kept them there; it's not like nvidia to make ridiculous mistakes concerning board design.
#3
EastCoasthandle
btarunr said:
No. All a DVI connector does is connect the card to the monitor. It's just a piece of plastic with a few sockets and metal conveying the signal. All that's different between the new one and the old is that the old one used an EMI shield. Evidently NVIDIA found that unnecessary. Image quality is the care of the NVIO2 processor. That's what handles display, and the fact that it's isolated from the GPU (and its power-hungry components) shows they've already dealt with EMI, or any form of interference, although the real reason for separating the display logic was that the GPU die had become too big.
Well, I take it you are, for the most part, against this kind of practice

Well folks, this isn't the 1st time they did this.
Read here and here
#4
LittleLizard
phanbuey said:
+1

This is nothing but good news. The cost saving will have minimal, if any, impact on the performance of the card; they are just becoming more skilled at manufacturing these cards and eliminating unnecessary waste.
ok, so the only real con would be less overclocking headroom, but I prefer cheaper because it is already a good card.
#5
btarunr
Editor & Senior Moderator
EastCoasthandle said:
Well, I take it you are, for the most part, against this kind of practice
My personal opinion changed. 13 months is sufficient time for people's ways of thinking to change. I'm more informed now, and so are my opinions.

My being for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.
#6
R_1
So, basically the GTX260 will become mainstream in the Nvidia lineup. Probably an HTPC GPU, something like the 9600-9800GT are now, and I assume a serious drop in price will follow. New parts are coming :nutkick:.
#7
EastCoasthandle
btarunr said:
My personal opinion changed. 13 months is sufficient time for people's ways of thinking to change. I'm more informed now, and so are my opinions.

My being for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.
Your posts in this thread gave me the impression that you were giving an opinion. Also, per your own post, your opinion has changed, which is why I inquired. But thanks for the response nonetheless.
#8
PCpraiser100
Nice plan, now I might have second thoughts about this card. Any chance it could reduce power consumption?
#9
newtekie1
Semi-Retired Folder
Haytch said:
I would rather pay the $15 extra and not have those features taken away.
The cons of this move too heavily outweigh the pro here. I say pro because it only has one real pro, that's the $15 saving, which in fact doubles up as a con.

If Nvidia are unable to wipe $15 off their products without stripping them down then they are in trouble. But we all know that this is not the case. It's one thing to reduce costs; it's another thing to strip the card of parts.

To me, this is nowhere near a $15 price reduction or an improvement in manufacturing. What I gather is that Nvidia worked out a way to take off more than $15 worth of parts and allow the card to still work, which gives the end user a $15 saving with Nvidia raking in way over $15 in profit.

Nothing they removed was 'unnecessary', unless of course you're a monkey.
What cons are you talking about exactly?

soryuuha said:
imho, PCB layer count is important for RAM overclocking; does the same apply to graphics cards?

correct me if im wrong :respect:
If the PCB layers are going unused, or are only there to provide redundancy, then no, they are not important for anything, and removing them shouldn't affect overclocking.
#10
Haytch
If you're the type of user to plug in and play and never touch anything, there are no cons. . . Then again there are no pros either.

I guess what I meant earlier by this card having more cons than pros was more in regards to overclocking capability and less room to play with in the BIOS. Taking away a phase would result in less efficiency and higher temperatures at its weak point.

I do believe that Nvidia is capable of redesigning their cards to make them more efficient, more powerful and cheaper to produce, but this card doesn't cover all 3. Maybe some of the lines are redundant now, maybe they really are . . .

I think I would like to see this card directly compared. Anyone?
#11
Kursah
I would say wait until the product is released and see what happens. Sure, more power phases sounds good for overclocking, but losing a phase for efficiency doesn't necessarily mean a loss of overclockability. If the GPU runs cooler and faster with fewer phases and can still keep up with its older brethren, then I see no issue. And if we start seeing sub-$200 GTX260's become commonplace, I really see no issue with that either: it gives many gamers a chance to enjoy some serious performance out of a truly great card. I've had mine since July (I did step up to a 216 core in September, but mine is still a 65nm beast); it rocks in every game I play and then some, folds like a champ, runs cool, and uses less core voltage for more shader cores and decent clocks, and it's just as stable as my original card.

I think this is a good progression of the GTX, though dropping 30-60 shaders, bringing memory down to 640MB/512MB and calling it a GTS250 would've been a good move too imo. Sell it at a $150-170 price point and gamers would be very happy indeed.

:toast:
#12
Tatty_One
Senior Moderator
buggalugs said:
I don't like it. Sounds like poorer quality all round. A $15 saving on a $300-$400 card doesn't sound like it's worth it.
GTX260 @ $300 - $400? Damn, where do you live :eek:

soryuuha said:
imho, PCB layer count is important for RAM overclocking; does the same apply to graphics cards?

correct me if im wrong :respect:
You may be right, but 95% of gfx card buyers don't overclock, so...... if 95% get a better deal.....that's good, and as the other 5% have not bought the card yet.... it isn't yet "bad".

At the end of the day, a reduction in costs has gotta be good. If with that comes an unacceptable amount of returns, then that's bad and they have failed, but it's not as if either manufacturer has a particularly strong record in that department. I don't quite understand why NVidia are doing it at this late stage, though: both ATI and NVidia have new models on the way, and NVidia already have the fastest overall card plus the 3 fastest single-card solutions.......makes you wonder why they are doing this TBH.
#13
pentastar111
AddSub said:
With the global economy in such bad shape this was to be expected. In fact, they have already reduced costs with their 55nm lineup. The coolers on the 55nm lineup have been cut down: compared to the coolers on the original 65nm GTX 260/280 lineup, the new coolers are lighter, have no back cover, and have shorter, smaller heatpipes, and many benchmarks reveal that in many situations the 55nm parts actually run hotter than the 65nm parts. Cutting down on the actual circuitry was the next logical step. They are taking the route AMD took a year ago. Hence the massive jump in Radeon 3xxx RMAs compared to the previous generation, something that also carried over to the Radeon 4xxx lineup. In the following months you can expect a healthy increase in threads with titles such as "OMG! My brand new GTX 260 is DEAD after 2 days" or "My brand new nvidia GPU is artifacting at stock clocks!". Just watch.

Nehalem from Intel and the 65nm GTX GPUs from nVidia are truly the last quality products we will see from both manufacturers, since due to the worsening global economic conditions they will be cutting down on quality assurance along with everybody else in the industry. Here is an easy prediction: the next massive GPU release from nVidia (the 384SP monster) gets pushed back by at least 6 months.
Actually, due to the economy I predict a RETURN to better quality, in terms of customer service and reliability. With money tight, manufacturers are going to have to have good products and service in order to get and keep customers... Why would a reputable company make a card so cheaply that it would have to be returned in a few months? You can't keep people buying your stuff if it's crappy and your customer service is the same. One step further: if there was no return policy, why would anyone in their right mind purchase the things in the first place? I don't think we have a thing to worry about in the long run. As far as the 55nm lineup goes... I upgraded from two 640MB 8800GTS's to GTX285's. Not only do they kick some butt, they don't run any hotter than the older cards.
#14
LAN_deRf_HA
pentastar111 said:
Actually due to the economy I can see a RETURN to better quality, in terms of customer service and reliability. With money tight manufacturers are going to have to have good products and service in order to get and keep customers...
That's not how it usually works... especially when you start cutting your customer service reps.
#15
raptori
No no NVIDIA, you killed the best and most popular card on the market .... I should go and find another 65nm GTX260 before they run out ...... or change my avatar.
#16
DarkMatter
Tatty_One said:
GTX260 @ $300 - $400? Damn, where do you live :eek:



You may be right, but 95% of gfx card buyers don't overclock, so...... if 95% get a better deal.....that's good, and as the other 5% have not bought the card yet.... it isn't yet "bad".

At the end of the day, a reduction in costs has gotta be good. If with that comes an unacceptable amount of returns, then that's bad and they have failed, but it's not as if either manufacturer has a particularly strong record in that department. I don't quite understand why NVidia are doing it at this late stage, though: both ATI and NVidia have new models on the way, and NVidia already have the fastest overall card plus the 3 fastest single-card solutions.......makes you wonder why they are doing this TBH.
Probably the 5% of people who would volt-mod the card already bought it or will choose another one by this date.

As for why they are doing this, I think that reducing costs is a good enough reason in itself. Even if they release new cards, the GTX260 will stay around for a long time IMO, just like the 8800GT, and making it cheaper is always good. I don't think this will result in higher returns or crippled overclocking. Many non-reference boards from many vendors are cheaper and simpler, and that doesn't make them worse. Sometimes they're better than the reference ones, because the vendors had time to test many things and correct what's "wrong". This is no different.
#17
spearman914
Good news, but IMO I would rather spend $15 for voltage control so you can OC the crap out of it.
#18
Mussels
Moderprator
btarunr said:
My personal opinion changed. 13 months is sufficient time for people's ways of thinking to change. I'm more informed now, and so are my opinions.

My being for or against this practice hasn't surfaced in this thread, and is irrelevant anyway.
I agree with BTA on this. Last time I was all "yay, cheaper!" but then my friends who bought the cheaper cards had heaps of failures, unlike mine, which is still working to this day.

Cheaper is fine if it doesn't affect reliability or performance, but those always seem to get sacrificed.

spearman914 said:
Good news, but IMO I would rather spend $15 for voltage control so you can OC the crap out of it.
Check the news page. EVGA is offering software voltage control with theirs, so they're definitely going to stick with the current PCB design.
#19
EarlZ
A $15 saving is waaay too little for all that reduction.
#20
eidairaman1
The Exiled Airman
Well, complaining about it here won't stop them, so it's like set in stone now.
#21
DaedalusHelios
If they had already had those stats when it was released, there wouldn't be so much teary-eyed fear going on about it. I doubt it will make any difference to the end-user. It's not like they released it with fewer shaders by accident. :laugh:
#22
eidairaman1
The Exiled Airman
Just trying to get greater yields and remove unused layers is all. If you want ultimate performance, go with a 285 or 4870.
#23
DarkMatter
EarlZ said:
$15 dollars saving is waaay to little for all that reduction.
I think that people overestimate the price of a PCB.
#24
EarlZ
DarkMatter said:
I think that people overestimate the price of a PCB.
The OP does say $15.
#25
DarkMatter
EarlZ said:
The op does say $15
I meant that $15 is actually a lot. Keep in mind that a good chunk of the retail price goes to the retailer, and I mean 25% or more. Another good chunk goes to the vendor, but I can't estimate how much, because I never worked for one. You also have to subtract the price of the packaging and bundles... In the end the manufacturing cost of a PCB can't exceed $50 by much. In this case we could be talking about a reduction from $65 to $50, which is a lot.