
*GTX 260/280 Unofficial TPU! Thread*

Discussion in 'NVIDIA' started by Kursah, Jul 20, 2008.

  1. newconroer

    Joined:
    Jun 20, 2007
    Messages:
    2,834 (1.10/day)
    Thanks Received:
    267
    I've been silently watching and considering working on voltage with the 280, but in the end I know I won't be doing it for performance gains for my purposes.

    Without voltage adjustments, I can hit about 700/1500/1300 (give or take a few clocks), and in nearly all 3D applications I gain 1-3 fps. Of course there's a gain in synthetics, but that's not my bag.

    A volt mod would help stretch out the bandwidth on the memory, maybe to somewhere in the range of 1350-1400? Of course the timings would have to be loosened, and the shader clock (the GT200's weak point) doesn't like going over 1600-1650 (unlinked), no matter the voltage or cooling solution.

    Which means projected real-world frame gains would be less than five fps across the board, so to speak. And I simply cannot justify soldering, flashing, and further increased cooling for such a small increase in performance.


    Unfortunately, I think I'll be sitting out the modding for this generation of cards.
  2. AddSub

    AddSub

    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.35/day)
    Thanks Received:
    152
    There are no third-party coolers for the GTX 200 series yet, right? I mean, the stock cooler does a great job compared to my old 8800 GTX. Not sure if it's the cooler or the power management efficiency of the GPU at work, but my GTX 260 idle temps are ridiculously low. Same idle temps I was getting with my old X850XT on water. Either way, great job nVidia. But still, I was wondering what sort of OC results would be possible with third-party coolers.

    I haven't tried the 1.18V v-mod yet, Tatty_One. Tell us how it goes.
  3. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Well, I have yet to see any aftermarket AIR cooling for the GTX 200 series, but as you said AddSub, the OE reference cooler does a pretty bang-up job. This is the best reference OE GPU cooler I've seen yet; most of the time I get cheaper single-slot cards that have aftermarket cooling installed on them before first boot, or that, in the case of my old PowerColor X1950 Pro and Palit 9600GT Sonic, come with AC/Zalman cooling already installed...

    I created a new version of the EVGA 260 FTW BIOS that only had the 80% minimum fan speed changed; I left the clocks alone as on the other vmodded one, and left the voltage alone on the Extra setting, which keeps it at 1.12v.

    I decided I'd try some OC-ing since I could already see my card running more than 3C cooler at the same fan speed. My usual routine: OC in Precision, open ATI Tool, start the artifact scanner, run it for 30 seconds to 2 minutes, close ATI Tool, OC some more, repeat. I got my card all the way up to 795/1590! I tried 800/1600, but the second I opened the artifact scanner I got the driver-reset-failure BSOD these newer drivers have been giving me and had to restart. Still, getting there on 1.18v was a lot harder.

    I did run into some artifacting along the way tho...the trick I found was to keep the ratio right at 1:2 GPU:shader, which seemed to keep me more stable than the OE ratio. I didn't mess with memory; I left it at 1200 this run.

    I'm going to run FurMark and maybe Vantage to see how stable those high clocks really are, but I was actually impressed that I could attain that...on stock voltage before, I think 760/1535 was my max before the artifact scanner would lock up...seems this GPU needs some break-in time...maybe running at 1.18v helped it along a bit? Who knows...I'll report back and see if it was just a lucky OC run that's doomed in anything more, or if I actually have higher clocks to play with now.

    Either way, I'm still interested in results from other GTX 260 users running 1.18v. I can upload my EVGA FTW modded BIOSes, or you can go get NiBiTor 4.3 and NVFlash 5.67 (iirc) and go for it yourself...very easy to do. Maybe there's more to the FTW BIOS than meets the eye? I dunno, it's just interesting that I have better stability at higher clocks on 1.12v than 1.18v at this point, but that could be temp related...it's around 85F in the PC room (no AC :( ). Still idling at 46C and loading at 63C; before, with 1.18v, idle was the same but load had been hitting around 70C as of late.
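    For anyone following along, here's a rough Python sketch of the step-and-test routine and the 1:2 core:shader ratio described above. The two helper functions and the starting clocks are hypothetical placeholders (EVGA Precision and ATI Tool have no scripting API); it only illustrates the loop, not an actual controller.

```python
def set_clocks(core_mhz: int, shader_mhz: int) -> None:
    """Placeholder: apply the clocks (done by hand in EVGA Precision)."""
    print(f"set clocks to {core_mhz}/{shader_mhz}")

def artifacts_found(core_mhz: int) -> bool:
    """Placeholder for a 30 s - 2 min ATI Tool artifact scan at the current clocks."""
    return core_mhz > 795   # pretend 795 was the last clean core clock, as in this run

core, step = 750, 15        # hypothetical starting point and bump size
while True:
    shader = core * 2       # hold the 1:2 core:shader ratio that proved most stable
    set_clocks(core, shader)
    if artifacts_found(core):
        core -= step        # back off to the last artifact-free step and stop
        break
    core += step

print(f"highest artifact-free pair: {core}/{core * 2}")   # 795/1590 with these placeholders
```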
  4. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Alright, just ran FurMark and Vantage; it seems to be OK, didn't see any artifacts...now I'll push memory back to 1300 and see what happens. I got P12258, so about +250 over P12012...granted, I had a virus scan going and this browser window minimized (with about 8 tabs open lol!). I do sloppy runs, no doubt about it...but I game with other crap open too, so I view these as more honest performance submissions of how I actually use my system...that is, when I do submit or run benches beyond using them for stability.

    I wouldn't even have Vantage if I didn't get it free from EVGA lol! :toast:

    Here's some proof for ya guys! Remember I originally had vanilla stock clocks before the FTW stock clocks...they were 576/1246/1000. If you take that into consideration, imo that's one helluva OC...can't believe I got so close to my goal with stock voltage! Maybe next week I'll hit it, eh? :D

    [screenshot]
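    For perspective, here's the arithmetic on those gains, simply restating the clocks quoted above (576/1246/1000 stock vs. 795/1590/1200 in this run):

```python
# Percentage gains over the vanilla stock clocks quoted above.
for name, stock, oc in [("core", 576, 795), ("shader", 1246, 1590), ("memory", 1000, 1200)]:
    print(f"{name}: {stock} -> {oc} MHz (+{(oc / stock - 1) * 100:.1f}%)")
# core +38.0%, shader +27.6%, memory +20.0%
```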
  5. Hayder_Master

    Hayder_Master

    Joined:
    Apr 21, 2008
    Messages:
    5,173 (2.27/day)
    Thanks Received:
    638
    Location:
    IRAQ-Baghdad

    Good score, cool overclock. A tip for your GPU: I think you can push the shader a bit more.
  6. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    :toast:

    Not at this point...1590 is my all-time benchable max, which is up some from 1560 last week. If things keep up, I may be able to break 1600 shader soon...like I said before, my best stability when overclocking is at the GTX's 1:2 GPU:shader ratio; it could be drivers, GPU break-in, or a number of other things. Also remember I'm in a room that's around 80+F during the day...which is of course when I'm awake and messing with my rig most of the time.

    You're giving tips on the GTX 2xx series, eh? Have you got one and OC'd with it? I see an 8800 in your system specs...but if you do have one and have OC'd it, please post your experience; I'm hoping to have a nice compilation in the pages of this thread for sure! But I want experienced posts here. I've heard of shaders going over 1600, but remember not all hardware OC's the same. :D
  7. r9

    r9

    Joined:
    Jul 28, 2008
    Messages:
    2,144 (0.99/day)
    Thanks Received:
    284
    And what if Microsoft hadn't helped them by taking a step back with DX10? DX10 is what it is because of NV. DX10.1 should have been the starting point, but NV couldn't make it (too complicated for them), so Microsoft said, "OK, what can you make?" NV: "We can make DX 9.9." :D Microsoft said, "OK, OK, we'll let you into Vista." If Microsoft had stayed true to its word about what DX10 would be, NV wouldn't have DX10 at all. And about the technology: ATI is an F1 car with a 1000 cc engine, while NV is some 2500 cc thing (cc being die size). How is it possible for 1000 cc to stand side by side with 2500 cc?
    As for my point about DX10 and DX10.1, look for the thread about Assassin's Creed and the performance gains of DX10.1 over DX10, and how it was removed from the game because it made NV look bad.
  8. newconroer

    Joined:
    Jun 20, 2007
    Messages:
    2,834 (1.10/day)
    Thanks Received:
    267
    Any GT200 owners having problems with GPU-Z not showing the updated bandwidth after overclocking your memory?

    I seem to be stuck at the stock 141.7 GB/s; I thought PCI-E 1.1 went higher than that.
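    For what it's worth, the figure GPU-Z reports here is the card's local memory bandwidth, not the PCI-E link, so PCI-E 1.1 isn't what's being measured. A quick sketch of the math, assuming a GTX 280 (512-bit bus, GDDR3 at the stock 1107 MHz):

```python
# Memory bandwidth = effective transfer rate x bus width.
def mem_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    effective_mts = mem_clock_mhz * 2                   # GDDR3 is double data rate
    return effective_mts * bus_width_bits / 8 / 1000    # bits -> bytes, then MB/s -> GB/s

print(mem_bandwidth_gbs(1107, 512))   # ~141.7 -> the stock GTX 280 reading
print(mem_bandwidth_gbs(1300, 512))   # ~166.4 -> what a 1300 MHz memory OC should show
```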
  9. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,383 (5.29/day)
    Thanks Received:
    2,321
    Location:
    Worcestershire, UK
    I agree with some of that, but to be honest, when you take into account that NVidia has the fastest single GPU, and it manages to be the fastest with less than a third of the shaders the HD4870 has......do you really think that architecturally ATi are that far ahead?........sometimes we can get ahead with "different".....both companies take a different approach to GPU architecture, and advanced, efficient, etc. does not mean a great deal if it is slower IMO.
    Let's hope that with driver development the HD4870 gets even better, because certainly the pricing is right; some 200 series owners saw a 20% gain out of the latest ForceWare drivers......if ATi can match or better that, then I think this battle really is going to the red side :toast:
  10. r9

    r9

    Joined:
    Jul 28, 2008
    Messages:
    2,144 (0.99/day)
    Thanks Received:
    284
    The number of shaders is irrelevant because of the architecture; what matters is the piece of silicon, and ATI, with silicon about 2.5 times smaller, managed to rival NV. F1 rules say your engines for this season are 2000 cc; it's up to the teams whether the design has 12 cylinders or 4, but the target is to achieve more power within that 2000 cc.

    I'm wondering what NVIDIA's next step is going to be. What's next, a one-mile core and 1024-bit memory? That's nonsense; they need to bump up their efficiency. If ATI wants to make the fastest chip, they can simply raise the shaders from 800 to 1600 and the die will still be smaller than the GTX 280's.

    And if it weren't for the 48xx, how much would the 260/280 cost now? A small fortune.

    And if it weren't for AMD, we would be buying P4s at 3.8 GHz with 10 MB of cache for $1000.
  11. PuMA

    PuMA New Member

    Joined:
    Jul 15, 2006
    Messages:
    727 (0.25/day)
    Thanks Received:
    28
    Location:
    Finland
    Yeah, that's how the markets work.
  12. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Yep, but don't forget where ATI started: with a large GPU that ran hot, slower than it was touted to, with the wide memory bus..the "ring" bus. The R600 was a good chip, but remember the bashing it got? A new architecture that couldn't compete at release, and could only pass a GTS 320 after a few months' worth of drivers. Both companies learn lessons along the way. NV, I feel, got lazy to an extent and milked G80 tech as long as possible...hey, it worked...it took ATI this long to bring out something that was closer to serious competition.

    It's still impressive what ATI have done while learning with their newer GPUs, I will say that...but here's the thing: when I'm looking to buy something, I look at the performance now, the support now, the failure rate now...and really, the HD48xx lost there. I'm only speaking of consumer-based reports...of the pro reviews, I realistically read very few. What do other customers say? Why should I have to BIOS-flash to control fan speed just to keep my card below 90C at STOCK CLOCKS? There were quite a few questions like that I asked myself as I was considering an HD4870, back when I was deciding on a purchase...it was touted as the faster card, the better card...I know it's faster at some things and slower at others...what got me was out-of-the-chute operation. Sure, it runs OK at 90C, fine...not in my rig it won't. Then I read some HD4870s can't get through FurMark...them damn VRMs'll get up to around 126C. Still...I noticed similar VRM issues on my X1950s. Then I read of users RMA-ing their cards within a short time after purchase...meh. I did not choose my purchase on performance alone; I went with the card that OC's fine now, the fan works, the cooling works how it should, the drivers get the job done, and the card just flat out works, as I've said a dozen times before in this and other threads.

    Every point you're trying to make has already been made in different ways. I see them and understand what you're saying, but I also see my reasoning for stating what I do and supporting what I purchased, given the amount of time and research I put in beyond reading what pro reviews and press releases say, or what "fanboys" state. Right now it's still a fair comparison of the cards; that's good...I'm sure the HDs have more in them once ATI gets their drivers straight. Good...if so, ATI could use some better cards and a stronger market share for a while...give it to them. That's not the point of this thread, though; that's not the point of the GTX 200 or its support here. The point is to enjoy what you decided to purchase. Sure, you bought it for a reason, and hopefully what you decided upon was worth the value and got you there with minimal headaches. There's nothing wrong with GTX or HD cards in my eye; I just got the one that made more sense to purchase at the time for me. That's what I refer to when I make GTX and HD comparisons...there's more to it than comparing architectures and hopeful drivers...I weigh in the good and the bad...everything has good and bad to it...it just depends on what you're willing to deal with, imo.

    :toast:
  13. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Nope, mine is fine. I'm still on 2.6 tho...but at 1295 memory speed it reads 145 GB/s for bandwidth.

    :toast:
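    That reading checks out against the same formula sketched a few posts up, this time with the GTX 260's 448-bit bus:

```python
# 448-bit bus, GDDR3 at the 1295 MHz quoted above.
print(1295 * 2 * 448 / 8 / 1000)   # ~145.0 GB/s, matching the GPU-Z readout
```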
  14. r9

    r9

    Joined:
    Jul 28, 2008
    Messages:
    2,144 (0.99/day)
    Thanks Received:
    284
    Do you know what's funny? What makes the 260/280 cards great is...the ATI HD 4800. The 260/280 are great cards at current prices, but they weren't that great when they launched; they became great after the HD 48xx. It's all cool, I understand your point and I agree with you. I like ATI because they were always more advanced. Read any PC game's minimum requirements and you'll see ATI listed a generation behind: if it's the NVIDIA 6600GT, it's the ATI 9800; if it's the NV 7600, it's the ATI X800. ATI is the better piece of hardware, but NVIDIA is sponsoring every game worth playing. Some people see that as cheating, and in a way it is, because in a neutral game ATI would work better. But for us at the end of the chain, plain users, it makes no difference; all that matters is FPS. Looking at the technology, though, ATI was always better.
  15. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Yeah, I've always gotten a kick outta system specs that'll use a newer-gen NV card and an older-gen ATI card for the same recommendation...even if the recommendation isn't quite equivalent on performance...but quite a few were close enough that it didn't matter, I suppose...still funny nonetheless. As far as what makes the GTX 260/280 great, yes, I agree it's the 4850/4870 competition, and I see that working both ways...but ATI gets a little more support out of the chute initially for the prices, being the underdog with a helluva damn good performing chip, and possibly taking the big green demon down! That's all good and great; they've done great, and I'm happy for it.

    And yes, I do see the NV TWIMTBP support all over the place...but if ATI can get enough going for them, what's to stop them from getting more ATI-supported games? What's stopped them until now? They've had years to make up for in this respect, and it hasn't been done...I think it will happen eventually, or like you said, some sort of neutrality should happen...in the end though, for true neutrality, imo, they'd need more similar GPU/processing/shader methods and version support to be true competition...and that might make it less interesting, imo. It is impressive to see how both sides have done thus far, technology- and performance-wise, in my eyes...even compared to when it was the X1950XTX vs the monster 8800GTS 320; look at what we have now. Things can only get better for both sides...I just hope the close competition we're seeing today continues.

    :toast:
  16. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,383 (5.29/day)
    Thanks Received:
    2,321
    Location:
    Worcestershire, UK
    I agree about the costs, but as I said earlier, the silicon is irrelevant if it's slower, and in ATI's case the number of shaders is far from irrelevant; if it were, they would have just put in say 200-300 like the 200 series....no, their architecture requires that number of shaders to perform to its fullest. The bottom line is that performance and energy efficiency are the keys......ATi has 55nm and has had it for a while...that's a real :rockout: in my book, and hands up to them for moving forward quickly and ahead of NVidia......the sad thing is they run hotter than the 65nm competition!
  17. r9

    r9

    Joined:
    Jul 28, 2008
    Messages:
    2,144 (0.99/day)
    Thanks Received:
    284
    :p
    I want to make one thing clear: I'm not saying don't buy NV cards or anything like that; I'm using one now and I love it :p. From a technology standpoint, the HD 48xx are amazing: a die around 2.4 times smaller, and that performance. Looking at the whole package (price, performance, drivers, etc.), it's a difficult choice, or rather it makes no difference; both ATI and NVIDIA are positioning their cards well. But for the prices to be what they are, NV must be selling chips with close to no profit. If ATI were selling the 48xx at the same profit margin, the 4870 would cost $200 now. A die roughly 2.4 times smaller (and it's not only about size; the smaller it is, the easier and more efficient it is to fabricate, and it uses less silicon), plus a less expensive PCB (256-bit instead of 512-bit)...the only thing that's more expensive is the use of GDDR5, but don't forget it needs only 512 instead of 1024. They do so much with so little; that's what I'm saying.
    Yes, competition, :toast: to that.
  18. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Oh yeah, I agree that AMD have done a great job getting a lot of power out of very little and turning a profit on it...it's kinda like poker: you play the hands you're dealt, and if you play them right you'll have a nice stack of chips to play with...eventually, tho, you're gonna make a large gamble...and the chances of losing that bet are great. They've done a great job. To me the GTX GPU is like the R600 was for ATI: large, powerful, ready to rock, just not fully optimized yet. The 200b series could be interesting to see; 55nm GTX cards could do some damage yet...time will tell whether that'll be enough for this generation or not. Even if not...I love my GTX 260 and I'm still glad I got it; based on what I wanted and needed, it still fits those shoes perfectly. I wish I had the money for a 4870 too, just to play around and decide, after experiencing both, which would truly be my best fit...but I think I made the right choice for me. :D

    :toast:
  19. AddSub

    AddSub

    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.35/day)
    Thanks Received:
    152
    First of all, nVidia's GTX 200 GPUs are massive in large part because they sport large numbers of ROP partitions, which in turn account for a large chunk of the transistor count, which translates into greater manufacturing costs. The GTX 260 and GTX 280 GPUs have 28 and 32 ROP partitions respectively, while the 4870/50 GPUs sport only 16, much like the 3870/50 and 2900XT, and even the X1950XT and X850XT before them. The primary reason AMD/ATI has refused to increase the ROP count, which has remained static since 2004 (since R400, that is), is that any increase would represent a massive jump in transistor count, which would in turn mean increased manufacturing and retail costs, and in the end increased power consumption as well.

    So instead of AMD/ATI GPUs being the kings of bang-for-buck as they are now (or were, for a short time back in June at least), a 4870 GPU with 32 ROP partitions would have a massive die and crazy amounts of transistors, would cost an arm and a leg, and would run even hotter than 4870 GPUs already do. But, like I said, AMD simply can't afford it. In the last two years their stock has dropped from around $26 per share back in December 2006 to $5.64 per share as of today (quick NYSE quote). ROP partitions are transistor hungry, something you can easily see if you examine the various die specs, and they take up sizeable die real estate, which results in greater manufacturing costs.

    All AMD could do with their latest GPUs is brute-force their way forward by increasing the shader unit count from the already massive 320 to now 800. That was really a relatively cheap way to increase performance, since the shader units on AMD's GPUs are much simpler and more primitive than nVidia's and therefore much cheaper to tack on. In fact, if I remember correctly from an old TechReport article, AMD's shader units are quite a bit more limited in how they operate vs. nVidia's (MAD+MUL op ratio-wise). Although the GPU architectures are severely different, in some respects you can draw a crude estimate of minimum and maximum throughput rates. For one reason or another, AMD GPUs perform very well in certain simple synthetic shader benchmarks, as seen in the very thorough reviews done by Digit-Life, yet that does not translate very well into actual real-world gaming results. Another testament to the simplicity of AMD's shader architecture.

    I must admit, though, AMD did fix the serious AA performance issues that had existed since the introduction of the 2900XT, by introducing some tweaks and more than doubling the texturing units. A smart and relatively cheap way to win back some performance under certain conditions. Although, by some reports in early 2007, there were in fact hardware bugs causing this. Either way, great job AMD/ATI.

    To conclude: I'm certain that if AMD could afford it, right now they would be producing massive GPUs with beefy transistor counts and complex shader architecture. The sad fact is, they simply CAN'T AFFORD IT. All they can do is tweak here and there, and do the most economical things to boost performance. I'm sure nVidia is losing money on the GTX 200 lineup, since these GPUs must cost quite a bit to manufacture and yet they are practically giving them away at this point if you consider the launch prices.

    Finally, this is getting way off topic. Some moderation maybe?
    SiliconSlick and Kursah say thanks.
  20. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Alright folks, I just updated the OP with some more information on GTX 260 BIOS modding. It's extremely easy; I provided links to the most up-to-date versions of NiBiTor, GPU-Z and NVFlash I'm aware of. I recommend using GPU-Z for backing up the BIOS for the simple fact that I can't capture my GTX 260 BIOS in Vista x64 with NiBiTor...if you are capable of doing so, I'm sure that's perfectly fine.

    Lemme know what you think and what you would like added/removed/changed. And I'll post it here too: I'm not responsible for anyone attempting this, and neither is TPU, anyone in this forum, nor the manufacturer of the card. If you choose to do this, you are responsible, so pay attention; if you take the right steps, you'll be safe.

    :toast:
  21. AddSub

    AddSub

    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.35/day)
    Thanks Received:
    152
    My new CPU arrives tomorrow, so my primary machine with the GTX 260 in it will be back in action again. Right now I'm using a single-core Sempron backup machine with a Radeon X800 GTO and 512MB of RAM. Not much tweaking potential there. :)

    I think I will tweak the BIOS for fan speeds on the GTX 260 ASAP. I want to set it to 100% at all times, since my all-steel case does a good job of soundproofing and honestly I can't really notice that much of a difference between 40% and 100% when I have my headphones on with music or game-noise blasting away.

    I haven’t tried the v-mod yet. I’m hoping to see more v-mod results in this thread before I attempt it. So, any pioneering individuals doing these v-mods, feel free to post your results.

    Also, anyone looking to sell-off their GTX 260, I’m looking for another one at this time, so send me a PM if you are interested.
  22. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    The fan mod works like a charm...I just modded a BIOS for 100% since I can deal with the noise; like you, most of the time my headphones are on anyways.

    The vmod works fine, but for me it was so-so on OC results; I could attain higher stable OC's, but not by much at all. Like I've said before, when in Extra mode, even at FTW stock clocks, the temps rose around 3-5C, and that was consistent through load temps too, if not a couple more degrees.

    So far I think I'm the only one that's actually performed the vmod, but I'm sure others will try and hopefully report back with results!

    :toast:
  23. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,383 (5.29/day)
    Thanks Received:
    2,321
    Location:
    Worcestershire, UK
    I did the BIOS mod changing the "Extra" voltage to 1.18V from 1.12V....it does not seem to have made the slightest difference. I don't know whether that's because the card is hardware limited to 1.12V or because, in actual fact, 0.06V is so small an increase that it makes no difference to overclocks.
    Last edited: Aug 18, 2008
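    For scale, a quick check of how small that bump is in relative terms, using the 1.12V baseline quoted above:

```python
print((1.18 - 1.12) / 1.12 * 100)   # ~5.4 -- the percent increase from 1.12 V to 1.18 V
```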
  24. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    7,595 (2.68/day)
    Thanks Received:
    1,567
    Location:
    Missoula, MT, USA
    Yeah, mine was pretty minimal really. What about temps? I noticed a difference there immediately upon hitting the Extra clock speeds.

    :toast:
  25. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,383 (5.29/day)
    Thanks Received:
    2,321
    Location:
    Worcestershire, UK
    No, no noticeable difference in temps either for me; I am gonna flash mine back sometime, I think. I am more than happy running 24/7 for gaming at 760 linked, and I don't bench anymore, so I don't need to voltmod; otherwise the soldering iron would already be out!
