
*GTX 260/280 Unofficial TPU! Thread*

Discussion in 'NVIDIA' started by Kursah, Jul 20, 2008.

  1. subhendu

    subhendu New Member

    Joined:
    Jan 26, 2009
    Messages:
    488 (0.23/day)
    Thanks Received:
    33
    Guys, I'm planning to get a GTX 260 (216-core). Resolution is 1920 x 1080, and I just need 2x or 4x AA max in games. Is it a good choice? (I don't need DX11, so no ATI please.)
     
  2. Burgers New Member

    Joined:
    Jun 28, 2010
    Messages:
    10 (0.01/day)
    Thanks Received:
    2
    The card will do just fine. Probably the perfect card for the job.
     
    subhendu says thanks.
  3. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    Even when I have to use only one GTX 260 216, I get great performance. Of course it helps to OC. I can run my main card at 702/1512 and the secondary at 660/1452 for now, until I get its BIOS modded. In SLI mode, my two 216s power through anything I throw at them! Can it run Crysis? Damn right it can! SLI 470s are the next step up, if you can afford them!
     
    subhendu says thanks.
  4. overclocking101

    overclocking101

    Joined:
    Apr 8, 2009
    Messages:
    2,886 (1.38/day)
    Thanks Received:
    405
    Location:
    vermont
    The 260 216 is a nice card; it ran any game I played at 45 fps and up. I was getting 55 fps in BFBC2 at maxed-out settings, 1920x1080, maximum AA, etc.
     
    subhendu says thanks.
  5. Burgers New Member

    Joined:
    Jun 28, 2010
    Messages:
    10 (0.01/day)
    Thanks Received:
    2
    Heck, I use a 65nm, 192-core GTX 260. I play at 1920 x 1080 with 4x AA. It plays pretty much everything at high FPS, no problem!
     
    subhendu says thanks.
  6. overclocking101

    overclocking101

    Joined:
    Apr 8, 2009
    Messages:
    2,886 (1.38/day)
    Thanks Received:
    405
    Location:
    vermont
    God damn it, my driver crash errors are back, now on my GTX 280! It's been happening ever since my new Windows install, and I can't for the life of me figure out why. It was working fine with the stock cooler; then I went to water and it was still fine; then came a fresh Windows install on my SSD, and now the problems occur.

    I'm going to be getting another stock cooler soon, so we'll see if it's maybe the GDDR3 chips overheating, but it usually happens when running GPUGRID or benchmarks, sometimes in a 2D environment too. The VRMs are not overheating, so I just don't get why. I'm thinking it has to do with PCI-E speed, but with Clarkdale that clock can't be manipulated.
     
  7. mudkip

    mudkip

    Joined:
    Dec 14, 2008
    Messages:
    1,226 (0.56/day)
    Thanks Received:
    148
    Location:
    The Netherlands
    Hmm. Just set the PCI-E bus to 100 MHz to be sure. Also, when does the driver crash? Only in 2D/3D, or when switching from 2D to 3D? Maybe you should turn off the feature that switches your clock rates between 2D and 3D; I forget what it's called, but it's in the NVIDIA control panel. Also try reinstalling Windows again, or try a different PCI-E slot.
     
  8. Burgers New Member

    Joined:
    Jun 28, 2010
    Messages:
    10 (0.01/day)
    Thanks Received:
    2
    You can stop it from changing between 2D/3D clocks with RivaTuner. I'd also try a BIOS update and see if that flushes out the problem, overclocking101.
     
  9. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,553 (11.41/day)
    Thanks Received:
    9,826
    I need some help with a GTX 280.

    It's got an Arctic Cooling uber-cooler on it, but there's a problem...

    The VRMs get so damn hot when running Furmark that the back of the card (the backplate) gets well over 100 °C (I'm too chicken to touch the VRMs directly) and the card crashes.

    Are there any known issues with 280s and Furmark? Is the 'Accelero XTREME GTX 280' an inferior cooler, or do I just have a dud card? (I got the card and cooler free, since it's overheating.)

    P.S. It's fine at idle; it just gets damned hot at load. The previous owner used a 120mm fan blowing at the rear of the card to delay the crashes.
     
  10. LAN_deRf_HA

    LAN_deRf_HA

    Joined:
    Apr 4, 2008
    Messages:
    4,563 (1.86/day)
    Thanks Received:
    954
    Last edited: Nov 24, 2010
  11. Kursah

    Kursah

    Joined:
    Oct 15, 2006
    Messages:
    8,030 (2.69/day)
    Thanks Received:
    1,800
    Location:
    Missoula, MT, USA
    There is a HUGE thread dedicated to that cooler; IIRC OnBoard created it, with reviews, pictures, etc.

    It's a great cooler, but yes, the VRM cooling was a total failure. Same for the other solution at the time too (I think it was the Thermalright one).

    What OnBoard did was cut the VRM section off the OE cooler and screw it to the card. It cleared the AC Xtreme cooler fins and made much better contact, with a pretty dramatic drop in temps, IIRC.

    I've never seen an issue or failure with Furmark and a GTX 2xx card. That doesn't mean it can't happen, but I can't imagine running Furmark all the time is a good idea regardless.

    Edit: LAN_deRf beat me to it! Nice work! :toast:
     
  12. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,553 (11.41/day)
    Thanks Received:
    9,826
    The problem I have is that I don't have the stock cooler or any parts from it.

    I'll have to look into those replacement VRM sinks.
     
  13. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    I can't seem to get even a drop more clock from the overvoltage. I managed 725/1515/1225 on stock voltage without a problem. I flashed the card several times, starting at 1.18 V, then 1.2 V, 1.25 V, and then 1.36 V. None of them help me get past a 725 MHz core clock. I'm running an MSI GeForce GTX 260, 65nm, 192-core. I also noticed it can't handle any more than a 1515 MHz shader clock. I have the GPU core voltage at 1.36 V and left the voltage regulator at 1.125 V. Turning the voltage regulator up doesn't do anything for stability other than make it worse, until the screen turns into one huge artifact; then I have to manually reset the computer, which comes back fine. It's just ATITool artifacts at anything above 725 MHz core, even with 1.36 V. What's up with this? I didn't change the voltage IC setting of 1.125 V since that's the regulator voltage. What should I do, exactly?
     
  14. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    ATITool doesn't work right anymore; use Furmark to check for artifacts. I can get ATITool to show artifacts at proven-stable clocks on my cards.
     
  15. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    What clock speeds are you achieving, and at what voltages (core voltage and voltage-regulator voltage)? I can run through 3DMark Vantage at 765 MHz core, 1530 shader, 1225 MHz memory, but no game will run for more than a minute before the driver fails, Metro 2033 especially. It runs great, like insanely amazing frame rates, but after a minute the screen goes black. With no way of correcting it, I reset the PC, and after the reset the POST screen was black, so I had to hard power down, then restart, and it was back to normal again. It only happens when running Metro 2033. Scared the crap out of me last night! I was like, OMG, I'm an idiot for volt-modding / extreme-overclocking this graphics card. Since it has a modified BIOS, I suppose it's not impossible to simply reflash the original BIOS, correct? For RMA purposes. Your thoughts about my issues are hugely appreciated!
     
  16. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    With a single card I can get 666/1566 stable. With SLI, I seem to be stuck at 1512, but I haven't actually tried lately to see if I can get higher. I've only got 1.15 and 1.18 V on the cards right now. I think trying to use 700 on the GPU core is too much; you'll get the shaders higher if you back it off to 666-ish. I think at 666 I might be able to do 1566 shader stable in SLI. I've got two GTX 285s coming, so I've kind of lost interest in the 260s...
     
  17. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    johnspack, post your Vantage scores once you get those 285s up and running! I'd be really interested to see how much of a gain you get with those cards at stock speeds compared to overclocked GTX 260s.

    I always want to keep a 1:2 ratio between core clock and shader clock, and it seems to have worked itself out in 3DMark Vantage: 765 MHz core to 1530 MHz shader is a 1:2 ratio. That's on 1.25 V core voltage and the stock 1.125 V voltage regulator. Adjusting the voltage regulator has no effect on stability; it actually makes it less stable. I've found the card can do 730 MHz core and 1460 MHz shader stable at 1.25 V. The crazy thing is I noticed a 5 fps increase at 765/1530 in Metro 2033 compared to 730/1460. It might just need 1.3 V or 1.35 V for stability.

    After overclocking this card to the limit, it really makes me want to just say screw it and get a GTX 580. The only things holding me back are money and my power supply. I recently upgraded from a BFG Tech 550 W power supply to a Kingwin MKX-850W, and I'll need to return the 850 W and push for 1000 W minimum. I calculated the power my system would need right now, and with the GPU at stock and a 4 GHz i7 it said I required 830 W. With a stock GTX 580 I'd need about 935 W, I believe. So I'll probably just go for 1000-1200 W to be 100% certain about stability.

    Oh yeah, and I noticed people running the 216-core GTX 260 get 3DMark Vantage GPU scores that outperform my overclocked 192-core card, and overclocked they're quite a bit faster in terms of points (the difference is a couple, I believe). Nonetheless, it shows how outdated this 65nm, 192-core GTX 260 is, right?!
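
    For reference, here's a rough arithmetic sketch of the numbers above (plain Python; the clocks and wattages are the ones quoted in the post, and the PSU figures are the poster's own estimates rather than measurements):

    # The 1:2 core-to-shader pairing and the PSU headroom estimate from the
    # post above. All figures are quoted from the post, not measured here.

    def shader_for_core(core_mhz, ratio=2.0):
        """Target shader clock for a given core clock at a fixed core:shader ratio."""
        return core_mhz * ratio

    print(shader_for_core(765))  # 1530.0 MHz, the Vantage-stable pairing
    print(shader_for_core(730))  # 1460.0 MHz, the fully stable pairing

    # PSU headroom: estimated system draw vs. the PSU's rated output.
    estimated_draw_w = 830   # poster's estimate: 4 GHz i7 + stock GTX 260
    psu_rating_w = 850       # Kingwin MKX-850W
    headroom_w = psu_rating_w - estimated_draw_w
    print(f"Headroom: {headroom_w} W ({headroom_w / psu_rating_w:.0%} of rating)")
    # With a stock GTX 580 the estimate rises to ~935 W, hence the plan to
    # step up to a 1000-1200 W unit.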
     
  18. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    Your shader speed is more important than the core speed. Core speeds over 700 will mostly just make it less stable. Drop your core down to 666, push the shaders up, and see what fps you get then. I expect to get 740 or so on the 285s, but not on my 260s. One of my 216 SP 260s is a 65nm part, and it does as well as the 55nm one. I believe I can do 1580 shaders with the core at 666, but I'm not willing to push more volts into them; they're slated for my folding box, so I don't want to kill them!
     
  19. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    Yeah, I totally agree on the "you don't wanna kill them". I remapped the dynamics of the fan speed on the card so it never goes beyond 68 °C. I also hardly saw any difference whatsoever from overvolting the core, maybe a couple of degrees Celsius. I love having the dynamic fan speed, though; I never have to touch RivaTuner. It does it all automatically, and beautifully at that! It's so much quieter and cooler as well, with the dynamics trying to maintain 68 °C under 100% load, which is a good, realistic temperature for this GPU.

    Oh yeah! I did some research today at work comparing all the different GPUs. I've known this for a while, but recently I was pushing for the 580; actually, the best bang-for-the-buck card is the 470. It outperforms the 285, and I love the 448 CUDA cores. I do a lot of encoding and make use of the 192 cores on my 260, so 448 of them is really going to kick ass; that's about 2.3 times as many CUDA cores! The best part is you can get them all day long on Newegg for 250 bucks. I looked up the 285 and it's actually more expensive, unless you're getting them used on eBay or something. I compared the 480, 470, and 285 on a benchmark review site, and the 470 outperforms the 285 in almost all aspects. Cool, huh? I say you should push for two 470s in SLI! I think Sparkle gives a $20 mail-in rebate if you buy two 470s at 250 bucks a piece. Cool, huh?
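
    For illustration only, here's a small Python sketch of the idea behind that fan-speed remapping (RivaTuner does this through its own GUI, not code, and the exact curve below is hypothetical), plus the CUDA-core arithmetic from the post:

    # Hypothetical temperature -> fan duty curve: ramp up hard as the GPU
    # approaches the 68 C target mentioned in the post.
    def fan_duty(temp_c, target_c=68, min_duty=40, max_duty=100):
        """Linear ramp from min_duty (20 C below target) up to max_duty at the target."""
        ramp_start = target_c - 20
        if temp_c <= ramp_start:
            return min_duty
        if temp_c >= target_c:
            return max_duty
        frac = (temp_c - ramp_start) / (target_c - ramp_start)
        return min_duty + frac * (max_duty - min_duty)

    for t in (45, 55, 62, 66, 70):
        print(f"{t} C -> {fan_duty(t):.0f}% fan")

    # CUDA-core comparison from the post: GTX 470 (448 cores) vs. GTX 260 (192 cores).
    print(f"{448 / 192:.2f}x the CUDA cores")  # ~2.33x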
     
  20. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    Yep, 285 SLI will outperform a 470, and even a 480, but a single one won't. Two used 285s cost less than even one 470, though, so it's a poor man's solution. If you have the money, I'd go for the 570; I think that's probably the best bang for the buck in that range.
     
  21. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    The 570 is about $100 more than the 470. There are some huge improvements in architecture, engineering, and design; the 500 series corrected the incomplete, un-perfected design of the 400 series, if you will. It's about 3-5 fps faster overall, maybe 10-15 fps better on average fps, but on minimum fps it's so close it doesn't make me want to spend $350, you know? For 250 bucks the 470 is a killer bang for your buck, since it packs a hell of a punch for pretty cheap. If I can catch a 470 closer to 200 bucks, I'll probably jump on it. I get paid next week, so I may just spontaneously buy it.

    Read this: http://www.hardwarecanucks.com/foru...s/38626-nvidia-geforce-gtx-570-review-15.html

    Regarding this GTX 260: I don't know, but maybe the overvoltage is causing the black screen under crazy load. I downloaded Furmark and monitored the GPU current. The most I'd EVER seen it draw before was 67-68 amps; that's what AIDA64 reports from the GPU power-current sensor. Weird, huh? My CPU pulls 85 amps under load in Intel Burn Test at maximum stress. So I put the GPU under a normal Furmark load (I didn't even touch the Extreme Burn-in Mode box; I'm scared just looking at it), and immediately I saw nearly a 90 amp load on the GPU. Freaking insane. It pulled my 12 V line way down. I have an 850 W power supply rated at 70 amps on the 12 V line, and it dropped to 11.872 V. That's pretty scary low.

    The craziest thing is that the BFG Tech 550 W power supply I had before was rated at 36 amps and it took the enormous load: 60 amps average on the video card in-game and around 30 amps on the CPU. The 12 V line would drop to 11.817 V, and the second it dropped below 11.8 V it would just shut off. I could never run Metro 2033 with the 550 W, since my computer would just turn off every time. That game kills a whole system, and I'm running a 4 GHz Core i7 too.

    Anyway, Furmark after a couple of minutes made my screen go black, just like Metro. The exact same problem! So I thought maybe it was the new 260.99 NVIDIA drivers; I went back to the 258.96 drivers, which were known to be stable, and the same thing happened. So I suppose the overvoltage is actually making the system less stable. Unusual, huh? Just like with CPUs, voltage increases stability... or so I thought? Anyway, I'm going back to the stock 1.12 V core voltage to see what happens. Oh yeah, I'll also stress it at the clock speeds you suggested, 666 MHz core and 1580 MHz shader. What are your memory clocks at? I managed 1225 MHz perfectly stable, just FYI. Later.
     
    Last edited: Dec 11, 2010
    trt740 says thanks.
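
    A quick sanity check on those current readings (hypothetical Python sketch; it assumes the AIDA64 "GPU power current" value is current drawn from the 12 V rail, which may not be what the card's sensor actually measures):

    # Back-of-the-envelope power math for the figures quoted above.
    # Assumption (flagged): the ~90 A reading is treated as 12 V rail current;
    # if the sensor actually reads at the VRM output, this math does not apply.

    def rail_power_w(volts, amps):
        """P = V * I on a single rail."""
        return volts * amps

    gpu_amps_furmark = 90        # reported under FurMark
    rail_volts_loaded = 11.872   # sagged 12 V reading under load
    psu_12v_rating_a = 70        # Kingwin MKX-850W 12 V rating from the post

    print(f"{rail_power_w(rail_volts_loaded, gpu_amps_furmark):.0f} W implied for the GPU alone")
    print(f"Rail utilisation: {gpu_amps_furmark / psu_12v_rating_a:.0%}")
    # 90 A at ~12 V would be over 1 kW for the GPU alone, which a GTX 260 cannot
    # draw, so the measurement point matters when reading these sensors.
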
  22. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    Yeah, guess you're right about the 470; very nice price point right now. I'd consider one if I didn't need a quad-core CPU really badly right now. Really, I'd need two of them for it to be an upgrade from two 285s, as I don't use DX10 or 11 yet. As for the 260s, I haven't bothered to push the memory past 1200. I think you'll need at least 1.15 V for 1580 shaders, though; I couldn't get close to it at 1.12 V. Even 1.18 is safe. The 65nm is the same chip as the 280, which runs at 1.19, so I use that as a reference.
     
  23. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    I believe it was Tatty_One who told me to keep the core under 700. He was right. At 666 core, it seems 1566 is stable for SLI. Can't wait to play with my 285s: up to 750 core and 1700 shaders. Not that I'll get that high, but I sure want to try!
     
  24. chris189

    Joined:
    Mar 18, 2010
    Messages:
    107 (0.06/day)
    Thanks Received:
    11
    Yeah, I can't wait to get the Fermi GTX 470! That thing's gonna tear my GTX 260 a new one! I believe one GTX 580 outperforms GTX 285 SLI. I suppose the GTX 470 should come close with a huge overclock!
     
  25. johnspack

    johnspack

    Joined:
    Oct 6, 2007
    Messages:
    4,384 (1.66/day)
    Thanks Received:
    869
    Location:
    Nelson B.C. Canada
    Actually, two GTX 285s still beat a GTX 580. In DX11, that's another story!
     
