
*GTX 260/280 Unofficial TPU! Thread*

Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
I've been silently watching and considering working on voltage with the 280, but in the end I know I won't be doing it for performance gains for my purposes.

Without voltage adjustments, I can hit about 700/1500/1300 (give or take a few clocks), and in nearly all 3D applications I gain 1-3 fps. Of course there's a gain in synthetics, but that's not my bag.

A volt mod would help stretch out the memory bandwidth, maybe to somewhere in the range of 1350-1400? Of course the timings would have to be loosened, and the shader clock (the GT200's weak point) doesn't like going over 1600-1650 (unlinked), no matter the voltage or cooling solution.

Which means projected real-world frame gains would be less than five fps across the board, so to speak. And I simply cannot justify soldering, flashing and further increased cooling for such a small increase in performance.


Unfortunately, I think I'll be sitting out the modding for this generation of cards.
 
Joined
Aug 9, 2006
Messages
1,065 (0.17/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
There are no third-party coolers for the GTX 200 series yet, right? I mean, the stock cooler does a great job compared to my old 8800 GTX's. Not sure if it's the cooler or the power management efficiency of the GPU at work, but my GTX 260's idle temps are ridiculously low. Same idle temps I was getting with my old X850XT on water. Either way, great job, nVidia. But still, I was wondering what sort of OC results would be possible with third-party coolers.

I haven't tried the 1.18V v-mod yet, Tatty_One. Tell us how it goes.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
Well, I have yet to see any aftermarket AIR cooling for the GTX 200 series, but as you said, AddSub, the OE reference cooler does a pretty bang-up job. This is the best reference OE GPU cooler I've seen yet. Most of the time I get cheaper single-slot cards that have aftermarket cooling installed on them before first boot, or, in the case of my old PowerColor X1950 Pro and Palit 9600GT Sonic, come with AC/Zalman cooling already installed...

I created a new version of the EVGA 260 FTW BIOS that only had the 80% minimum fan speed changed; I left the clocks alone, as on the other vmodded one, and left the volts alone on the Extra setting, which keeps 1.12v.

I decided I'd try some OC-ing, as I could already see my card running more than 3C cooler at the same fan speed. My usual routine: OC in Precision with ATITool open, start the artifact scanner, run for 30 seconds to 2 minutes, close ATITool, OC some more, repeat. I got my card all the way up to 795/1590! I tried 800/1600, but the second I opened the artifact scanner I got the driver-reset-failure BSOD these newer drivers have been throwing, and had to restart. Still, getting there on 1.18v was a lot harder.

I did run into some artifacting along the way tho...the trick I found was to keep the ratio right at 1:2 GPU:shader; that seemed to keep me more stable than the OE ratio. I didn't mess with memory, left it at 1200 this run.
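
The clock-stepping routine described above can be sketched roughly like this (the 1:2 core:shader pairing and the ~1600 MHz shader ceiling are this poster's observations for his card, not fixed limits, and the helper name is made up):

```python
# Hypothetical helper: step the core clock in small increments while
# holding the 1:2 core:shader ratio reported as most stable, stopping
# once the shader clock would pass the observed ~1600 MHz ceiling.
def oc_ladder(start_core_mhz, step_mhz=5, shader_cap_mhz=1600):
    core = start_core_mhz
    while core * 2 <= shader_cap_mhz:
        yield core, core * 2   # (core, shader) pair at the 1:2 ratio
        core += step_mhz

for core, shader in oc_ladder(780):
    print(f"{core}/{shader}")  # 780/1560 through 800/1600
```

Each step would then get a 30-second-to-2-minute artifact-scan pass before moving to the next rung, per the routine above.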

I'm going to run FurMark and maybe Vantage to see how stable those high clocks really are, but I was actually impressed that I could attain that...on stock voltage before, I think 760/1535 was my max before the artifact scanner would lock up...seems this GPU needs some break-in time...maybe running at 1.18v helped for a little bit? Who knows...I'll report back and see whether it was just a lucky OC run that's doomed in anything more, or whether I actually have higher clocks to play with now.

Either way, I'm still interested in results from other GTX 260 users running 1.18v. I can upload my EVGA FTW modded BIOSes, or you can grab NiBitor 4.3 and NVFlash 5.67 (iirc) and go for it yourself...very easy to do. Maybe there's more to the FTW BIOS than meets the eye? I dunno, just interesting that I have better stability at higher clocks with 1.12v than 1.18v at this point, but that could be temp related...it's around 85F in the PC room (no AC :( ). Still idling at 46C, loading at 63C; before, with 1.18v, idle was the same and load was hitting around 70C as of late.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
Alright, just ran FurMark and Vantage; seems to be OK, didn't see any artifacts...now I'll push memory back to 1300 and see what happens. I got P12258, so about +250 from P12012...granted, I had a virus scan running and this browser window minimized (has about 8 tabs open lol!). I do sloppy runs, no doubt about it...but I game with other crap open too...I view this as a more honest performance submission of how I actually use my system...that is, when I do submit or run benches beyond using them for stability.

I wouldn't even have Vantage if I didn't get it free from EVGA lol! :toast:

Here's some proof for ya, guys! Remember, I originally had vanilla stock clocks before the FTW stock clocks...they were 576/1246/1000. If you take that into consideration, imo that's one helluva OC...can't believe I got so close to my goal on stock voltage! Maybe next week I'll hit it, eh? :D

 
Joined
Apr 21, 2008
Messages
5,250 (0.90/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crusial M4 256g raid0, 1TbWD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitior 2ms
Case CoolerMaster Chosmos II
Audio Device(s) Creative sound blaster X-FI Titanum champion,Creative speakers 7.1 T7900
Power Supply Corsair 1200i, Logitch G500 Mouse, headset Corsair vengeance 1500
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
Alright, just ran FurMark and Vantage; seems to be OK, didn't see any artifacts...now I'll push memory back to 1300 and see what happens. I got P12258, so about +250 from P12012...granted, I had a virus scan running and this browser window minimized (has about 8 tabs open lol!). I do sloppy runs, no doubt about it...but I game with other crap open too...I view this as a more honest performance submission of how I actually use my system...that is, when I do submit or run benches beyond using them for stability.

I wouldn't even have Vantage if I didn't get it free from EVGA lol! :toast:

Here's some proof for ya, guys! Remember, I originally had vanilla stock clocks before the FTW stock clocks...they were 576/1246/1000. If you take that into consideration, imo that's one helluva OC...can't believe I got so close to my goal on stock voltage! Maybe next week I'll hit it, eh? :D



Good score, cool overclock. A tip for your GPU: I think you can push the shader some more.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
:toast:

Not at this point...1590 is my all-time benchable max...which is up some from 1560 last week. If things keep up, I may be able to break 1600 shader soon. Like I said before, my best stability is at the GTX's max 1:2 GPU:shader ratio when overclocking; could be drivers, GPU break-in, or a number of other things. Also remember I'm in a room that's around 80+F during the day...which of course is when I'm awake and messing with my rig most of the time too.

You're giving tips on the GTX 2xx series, eh? Have you owned and OC'd one? I see an 8800 in your system specs...but if you have one and have OC'd it, please post your experience; I'm hoping to have a nice compilation in the pages of this thread for sure! But I want experienced posts here. I've heard of shaders going over 1600, but remember, not all hardware OCs the same. :D
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.58/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I have the temp increase alone as proof the extra voltage from the BIOS is there...from 1.12v to 1.18v I gained 3-5C on load...at all clocks. The temps seem pretty consistent across a range of clocks/OCs thus far, and even a healthy clock increase didn't add much to the overall temps...idle is always the same, since both my modded and OE BIOS have the same 2D volt settings.

Having to modify a card so it doesn't idle around 60-70C is an issue for me; those kinds of temps should only be seen at load imo. I like to see less, the lower the better...sure, I know GPUs can take a lot of heat, but if around 100-105C is the limit, my card idles at 45C and the average 48xx idles around 60C...that is just insane imo. I know the HD48xx cards are good, and it's about time those extra shaders ATI uses came in handy, but if you look at the 800 vs 192 on my NV, there are users who claim NV's shaders are "less efficient", "not as good performers" and all that kind of junk...the NV card should be in 2900 Pro territory if that stuff were true. Maybe it's that the DX10.x support isn't there? Who knows...800 shaders should do more than run neck and neck with the second-in-line GTX. Sure, there are times when an HD4870 may meet or beat a GTX 280 too...I'm sure it's out there in DX9 land, but in DX10 I haven't seen much in real-world results on SIMILAR builds where that was true. In DX9, I don't care if that HD48xx guy gets 100 FPS and I get 98 in the same game, same build, same res, same settings...ya know what I mean? Hell, knock it down another 10-15 because of PhysX...so I get a game experience the way it was designed to be, and I'm still at frames beyond what would be considered good performance, with all the eye candy and PhysX...to me the winner is clear this round. I just think the initial prices really hurt the image...they gave the 48xx a chance it wouldn't have had as strong otherwise imo. Good cards, great cards, both sides, but I did a LOT of research in both pro-review and consumer-based stuff, and to me the card that was getting my money was clear...runs great stock, drivers are pretty damn solid, the card performs great, runs cooler, OCs great out of the box, and flat out gets the job done.

I don't want to sway you if your intents are with the HD48xx series; I hope if you go that route you get what you expect for your money, and I'm sure you will, as they are damn good cards...I'm not saying the GTXs are better, just easier to deal with...the price/performance is there now...that changes the picture quite a bit on its own imo. The choice is a tough one for sure, but no matter what route you go, you will get a good performer!

:toast:

If Microsoft hadn't helped them by taking a step back with DX10... DX10 is DX10 because of NV. 10.1 should have been the starting point, but NV couldn't make it (too complicated for them), so Microsoft said, "OK, what can you make?" NV: "We can make DX 9.9" :D, and Microsoft said, "OK, OK, we'll let you into Vista." If Microsoft had stayed true to its word about what DX10 would be, NV wouldn't have DX10 at all. And about technology: ATI is an F1 car with a 1000cc engine, and NV is, I don't know, a 2500cc one (cc being die size)...how is it possible for 1000cc to stand side by side with 2500cc?
As for my point about DX10 and DX10.1: look for the thread about Assassin's Creed, the performance gains of DX10.1 vs DX10, and how it was removed from the game because it made NV look bad.
 
Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
Any GT200 owners having problems with GPU-Z not showing the updated bandwidth after overclocking your memory?

I seem to be stuck at the stock 141.7 GB/s; I thought PCIe 1.1 went higher than that.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,793 (3.88/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
If Microsoft hadn't helped them by taking a step back with DX10... DX10 is DX10 because of NV. 10.1 should have been the starting point, but NV couldn't make it (too complicated for them), so Microsoft said, "OK, what can you make?" NV: "We can make DX 9.9" :D, and Microsoft said, "OK, OK, we'll let you into Vista." If Microsoft had stayed true to its word about what DX10 would be, NV wouldn't have DX10 at all. And about technology: ATI is an F1 car with a 1000cc engine, and NV is, I don't know, a 2500cc one (cc being die size)...how is it possible for 1000cc to stand side by side with 2500cc?
As for my point about DX10 and DX10.1: look for the thread about Assassin's Creed, the performance gains of DX10.1 vs DX10, and how it was removed from the game because it made NV look bad.

I agree with some of that, but to be honest, when you take into account that NVidia has the fastest single GPU, and it manages to be the fastest on less than a third of the shaders the HD4870 has......do you really think that architecturally ATi are that far ahead?........Sometimes we confuse "different" with "ahead".....both companies take a different approach to GPU architecture; advanced, efficient etc. does not mean a great deal if it is slower IMO.
Let's hope with driver development the HD4870 gets even better, because certainly the pricing is right; some 200 series owners saw a 20% gain out of the latest ForceWare drivers......if ATi can match or better that, then I think this battle really is going to the red side :toast:
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.58/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I agree with some of that, but to be honest, when you take into account that NVidia has the fastest single GPU, and it manages to be the fastest on less than a third of the shaders the HD4870 has......do you really think that architecturally ATi are that far ahead?........Sometimes we confuse "different" with "ahead".....both companies take a different approach to GPU architecture; advanced, efficient etc. does not mean a great deal if it is slower IMO.
Let's hope with driver development the HD4870 gets even better, because certainly the pricing is right; some 200 series owners saw a 20% gain out of the latest ForceWare drivers......if ATi can match or better that, then I think this battle really is going to the red side :toast:

The number of shaders is irrelevant because of architecture; what matters is the piece of silicon, and ATI managed to rival NV with silicon about 2.5 times smaller. F1 rules say: your engines for this season are 2000cc; the design is up to the teams, whether it's 12 cylinders or 4, and the target is to achieve more power within that 2000cc.
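
For what it's worth, the commonly reported die sizes put the ratio a bit under the figure above; a quick check (both areas are approximate, widely cited values, not from this thread):

```python
# Rough die-area comparison behind the "F1 engine" analogy.
GT200_MM2 = 576   # GTX 260/280, 65nm (approximate, widely reported)
RV770_MM2 = 256   # HD 4850/4870, 55nm (approximate, widely reported)

ratio = GT200_MM2 / RV770_MM2
print(f"GT200 is about {ratio:.2f}x the RV770 die area")  # about 2.25x
```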

I'm thinking about what nvidia's next step is going to be. What's next, a mile-wide core and 1024-bit memory? That's nonsense; they need to bump up their efficiency. ATI, if they want to make the fastest chip, can simply raise the shaders from 800 to 1600, and the die will still be smaller than the GTX 280's.

And if it weren't for the 48xx, how much would the 260/280 cost now? A small fortune.

And if it weren't for AMD, we'd be buying P4s at 3.8 GHz with 10MB of cache for $1000.
 

PuMA

New Member
Joined
Jul 15, 2006
Messages
724 (0.11/day)
Location
Finland
Processor lapped C2D e6850, 3600mhz
Motherboard ASUS P5N32-E SLI (680i SLI)
Cooling NOCTUA NH-U9, 3 2x120mm 1x180mm case fans
Memory 6GB A-DATA vitesta DDR2 900mhz (2x1gb + 2x2gb)
Video Card(s) LEASTEK GTX 260 (701/1408/1105)
Storage SAMSUNG F1 640GB SATAII/SEAGATE 250gb SATAII
Display(s) SAMSUNG 226BW 1680x1050 2ms
Case ANTEC three hundred
Audio Device(s) ASUS SupremeFX, SONY surround AMP
Power Supply XION 630W
Software VISTA ULTIMATE x64
Benchmark Scores 3dmark06:15126 (701/1408/1105) vantage: 11698
And if it weren't for the 48xx, how much would the 260/280 cost now? A small fortune.

And if it weren't for AMD, we'd be buying P4s at 3.8 GHz with 10MB of cache for $1000.

Yeah, that's how markets work.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
The number of shaders is irrelevant because of architecture; what matters is the piece of silicon, and ATI managed to rival NV with silicon about 2.5 times smaller. F1 rules say: your engines for this season are 2000cc; the design is up to the teams, whether it's 12 cylinders or 4, and the target is to achieve more power within that 2000cc.

I'm thinking about what nvidia's next step is going to be. What's next, a mile-wide core and 1024-bit memory? That's nonsense; they need to bump up their efficiency. ATI, if they want to make the fastest chip, can simply raise the shaders from 800 to 1600, and the die will still be smaller than the GTX 280's.

Yep, but don't forget where ATI started: with a large GPU that ran hot, was slower than it was touted to be, with the large memory bus...the "ring" bus. That R600 was a good chip, but remember the bashing it got? A new architecture that couldn't compete at release, and could only pass a GTS 320 after a few months' worth of drivers? Both companies learn lessons along the way. NV, I feel, got lazy to an extent and milked the G80 tech as long as possible...hey, it worked...it took ATI this long to bring out something closer to serious competition.

It's still impressive what ATI has done while learning with their newer GPUs, though, I will say that...but here's the thing: when I'm looking to buy something, I look at the performance now, the support now, the failure rate now...and really, the HD48xx LOST. I'm only speaking of consumer-based reports...the pro reviews, I realistically read very few of...what do other customers say? Why should I have to BIOS-flash to control fan speed just to keep my card below 90C at STOCK CLOCKS? There were quite a few questions like that I asked myself while I was considering an HD4870, back when I was deciding on a purchase...it was touted as the faster card, the better card...I know it's faster at some things and slower at others....What got me was out-of-the-chute operation: sure, it runs ok at 90C, fine...not in my rig it won't. Then I read some HD4870s can't get through FurMark...them damn VRMs'll get up to around 126C...I noticed similar VRM issues on my x1950s. Then I read of users RMA-ing their cards within a short time after purchase...meh. I did not choose my purchase on performance alone; I went with the card that OCs fine now, the fan works, the cooling works how it should, the drivers get the job done, and the card just flat out works, as I've said a dozen times before in this and other threads.

Every point you're trying to make has already been made in different ways. I see them and understand what you're saying, but I also see my reasoning for stating what I do and supporting what I purchased, given the amount of time and research I put in beyond reading what pro reviews and press releases say, or what just "fanboys" state. Right now it's still a fair comparison of the cards; that's good...I'm sure the HDs have more in them once ATI gets their drivers straight. Good...if so, ATI could use some better cards and a stronger market share for a while...give it to them. But that's not the point of this thread; that's not the point of the GTX 200 or its support here. The point is to enjoy what you decided to purchase; sure, you bought it for a reason, and hopefully what you decided on was worth the money and has minimal headaches along the way. There's nothing wrong with GTX or HD cards in my eye; I just got the one that made more sense to purchase at the time for me. That's what I refer to when I make GTX and HD comparisons...there's more to it than comparing architectures and hopeful drivers...I weigh the good and the bad....everything has good and bad to it...it just depends on what you're willing to deal with imo.

:toast:
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
Any GT200 owners having problems with GPU-Z not showing the updated bandwidth after overclocking your memory?

I seem to be stuck at the stock 141.7 GB/s; I thought PCIe 1.1 went higher than that.

Nope, mine is fine. I'm still on 2.6 tho...but at 1295 memory speed it reads 145 GB/s for bandwidth.
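
For what it's worth, that bandwidth figure is just memory clock x 2 (GDDR3 transfers twice per clock) x bus width in bytes; a quick sanity check (a sketch of the arithmetic, not GPU-Z's actual code):

```python
def mem_bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    # GDDR3: 2 transfers per clock; bus width / 8 = bytes per transfer
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

# GTX 280: 512-bit bus at the stock 1107 MHz memory clock
print(round(mem_bandwidth_gb_s(1107, 512), 1))  # 141.7
# GTX 260: 448-bit bus at the 1295 MHz overclock mentioned above
print(round(mem_bandwidth_gb_s(1295, 448), 1))  # 145.0
```

So both numbers in this exchange check out; if GPU-Z keeps showing the stock figure after an OC, the reading simply hasn't refreshed, since the card's own memory clock is what sets it, not the PCIe link.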

:toast:
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.58/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
Yep, but don't forget where ATI started: with a large GPU that ran hot, was slower than it was touted to be, with the large memory bus...the "ring" bus. That R600 was a good chip, but remember the bashing it got? A new architecture that couldn't compete at release, and could only pass a GTS 320 after a few months' worth of drivers? Both companies learn lessons along the way. NV, I feel, got lazy to an extent and milked the G80 tech as long as possible...hey, it worked...it took ATI this long to bring out something closer to serious competition.

It's still impressive what ATI has done while learning with their newer GPUs, though, I will say that...but here's the thing: when I'm looking to buy something, I look at the performance now, the support now, the failure rate now...and really, the HD48xx LOST. I'm only speaking of consumer-based reports...the pro reviews, I realistically read very few of...what do other customers say? Why should I have to BIOS-flash to control fan speed just to keep my card below 90C at STOCK CLOCKS? There were quite a few questions like that I asked myself while I was considering an HD4870, back when I was deciding on a purchase...it was touted as the faster card, the better card...I know it's faster at some things and slower at others....What got me was out-of-the-chute operation: sure, it runs ok at 90C, fine...not in my rig it won't. Then I read some HD4870s can't get through FurMark...them damn VRMs'll get up to around 126C...I noticed similar VRM issues on my x1950s. Then I read of users RMA-ing their cards within a short time after purchase...meh. I did not choose my purchase on performance alone; I went with the card that OCs fine now, the fan works, the cooling works how it should, the drivers get the job done, and the card just flat out works, as I've said a dozen times before in this and other threads.

Every point you're trying to make has already been made in different ways. I see them and understand what you're saying, but I also see my reasoning for stating what I do and supporting what I purchased, given the amount of time and research I put in beyond reading what pro reviews and press releases say, or what just "fanboys" state. Right now it's still a fair comparison of the cards; that's good...I'm sure the HDs have more in them once ATI gets their drivers straight. Good...if so, ATI could use some better cards and a stronger market share for a while...give it to them. But that's not the point of this thread; that's not the point of the GTX 200 or its support here. The point is to enjoy what you decided to purchase; sure, you bought it for a reason, and hopefully what you decided on was worth the money and has minimal headaches along the way. There's nothing wrong with GTX or HD cards in my eye; I just got the one that made more sense to purchase at the time for me. That's what I refer to when I make GTX and HD comparisons...there's more to it than comparing architectures and hopeful drivers...I weigh the good and the bad....everything has good and bad to it...it just depends on what you're willing to deal with imo.

:toast:

Do you know what's funny? What makes the 260/280 cards great... the ATI HD4800. The 260/280 are great cards at current prices; they weren't that great when they launched, they became great after the HD48xx. It's all cool, I understand your point and I agree with you. I like ATI because they were always more advanced. Read any PC game's minimum requirements and you'll see that ATI is listed one generation behind: if it's an NVIDIA 6600GT, it's an ATI 9800; if it's an NV 7600, it's an ATI X800. ATI is the better piece of hardware, but NVIDIA sponsors every game worth playing. Some people see that as cheating, and in a way it is, because in a neutral game ATI would work better. But for us plain users at the end of the chain it makes no difference; all that matters is FPS. Looking at the technology, though, ATI was always better.
 

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,666 (2.30/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
Do you know what's funny? What makes the 260/280 cards great... the ATI HD4800. The 260/280 are great cards at current prices; they weren't that great when they launched, they became great after the HD48xx. It's all cool, I understand your point and I agree with you. I like ATI because they were always more advanced. Read any PC game's minimum requirements and you'll see that ATI is listed one generation behind: if it's an NVIDIA 6600GT, it's an ATI 9800; if it's an NV 7600, it's an ATI X800. ATI is the better piece of hardware, but NVIDIA sponsors every game worth playing. Some people see that as cheating, and in a way it is, because in a neutral game ATI would work better. But for us plain users at the end of the chain it makes no difference; all that matters is FPS. Looking at the technology, though, ATI was always better.

Yeah, I've always gotten a kick out of system specs that'll use a newer-gen NV and an older-gen ATI card for the same recommendation... even if the recommendation isn't quite comparable on performance. Quite a few were close enough that it didn't matter, I suppose; still funny nonetheless. As far as what makes the GTX 260/280s great, yes, I agree it's the 4850/4870 competition, and I see that working both ways... but ATI gets a little more support out of the chute initially on prices, being the underdog with a helluva good-performing chip, possibly taking the big green demon down! That's all good and great, they've done great, and I'm happy for it.

And yes, I do see the NV TWIMTBP support all over the place... but if ATI can get enough going for them, what's to stop them from getting more ATI-supported games? What's stopped them until now? They've had years to make up ground in this respect, and it hasn't been done. I think it will eventually, or, like you said, some sort of neutrality should happen. In the end though, for true neutrality, imo, they'd need more similar GPU/processing/shader methods and version support to be true competition... and that may make it less interesting, imo. It is impressive to see how both sides have done thus far, technology- and performance-wise, in my eyes... even compared to when it was the X1950XTX vs the monster 8800GTS 320; look at what we have now. Things can only get better for both sides. I just hope the closer competition we're seeing today continues.

:toast:
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,793 (3.88/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
The number of shaders is irrelevant because of the architecture; what matters is the piece of silicon. ATI, on a die 2.5 times smaller, managed to rival NV. The F1 rules say your engines for this season are 2000cc; the design remains up to the teams. Whether it's going to be 12 cylinders or 4, the target is to achieve more power within that 2000cc space.

I'm wondering what NVIDIA's next step is going to be. What's next, a one-mile core and 1024-bit memory? That's nonsense; they need to bump up their efficiency. If ATI wants to make the fastest chip, they can simply raise the shader count from 800 to 1600 and the die will still be smaller than the 280 GTX's.

And if it weren't for the 48xx, how much would the 260/280 cost now? A small fortune.

And if it weren't for AMD, we'd be buying P4s at 3.8GHz with 10MB of cache for $1000.

I agree about the costs, but as I said earlier, the silicon is irrelevant if it's slower, and in ATI's case the number of shaders is far from irrelevant. If it were, they would have just put in, say, 200-300 like the 200 series... no, their architecture requires that number of shaders to perform to its fullest. The bottom line is that performance and energy efficiency are the keys... ATI has 55nm and has had it for a while; that's a real :rockout: in my book, and hats off to them for moving forward quickly and ahead of NVidia... the sad thing is their cards run hotter than the 65nm competition!
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.58/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
:p
Yeah, I've always gotten a kick out of system specs that'll use a newer-gen NV and an older-gen ATI card for the same recommendation... even if the recommendation isn't quite comparable on performance. Quite a few were close enough that it didn't matter, I suppose; still funny nonetheless. As far as what makes the GTX 260/280s great, yes, I agree it's the 4850/4870 competition, and I see that working both ways... but ATI gets a little more support out of the chute initially on prices, being the underdog with a helluva good-performing chip, possibly taking the big green demon down! That's all good and great, they've done great, and I'm happy for it.

And yes, I do see the NV TWIMTBP support all over the place... but if ATI can get enough going for them, what's to stop them from getting more ATI-supported games? What's stopped them until now? They've had years to make up ground in this respect, and it hasn't been done. I think it will eventually, or, like you said, some sort of neutrality should happen. In the end though, for true neutrality, imo, they'd need more similar GPU/processing/shader methods and version support to be true competition... and that may make it less interesting, imo. It is impressive to see how both sides have done thus far, technology- and performance-wise, in my eyes... even compared to when it was the X1950XTX vs the monster 8800GTS 320; look at what we have now. Things can only get better for both sides. I just hope the closer competition we're seeing today continues.

:toast:

I want to make one thing clear: I'm not saying don't buy NV cards or anything like that. I'm using one now and I love it :p. From a technology standpoint the HD48xx are amazing: a roughly 2.4 times smaller die, and that performance. Looking at the whole package (price, performance, drivers, etc.) it's a difficult choice, or rather it makes no difference; both ATI and NVIDIA are positioning their cards well. But for the prices to be what they are, NV is selling chips at close to no profit. If ATI were selling the 48xx at the same profit margin, the 4870 would cost $200 now. A die around 2.4 times smaller (and it's not only about size: the smaller the die, the more efficient it is to fabricate, with less silicon), plus a less expensive PCB (256-bit instead of 512-bit). The only thing more expensive is the use of GDDR5, but don't forget it only needs 512MB instead of 1024MB. They make so much with so little; that's what I'm saying.
Yes, competition. :toast: to that.
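The die-size economics above can be roughed out with the classic gross-dies-per-wafer estimate. A hedged sketch: the die areas below are the commonly reported figures for GT200 (~576 mm²) and RV770 (~256 mm²), not numbers from this thread, and the formula ignores yield and defect density entirely.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic gross-die estimate: wafer area over die area, minus an
    edge-loss correction proportional to the wafer circumference."""
    radius = wafer_diameter_mm / 2.0
    dies = (math.pi * radius ** 2) / die_area_mm2 \
         - (math.pi * wafer_diameter_mm) / math.sqrt(2.0 * die_area_mm2)
    return int(dies)

# Commonly reported die sizes (approximate, assumed here):
gt200 = gross_dies_per_wafer(576)   # GT200  ~576 mm^2 -> 94 candidate dies
rv770 = gross_dies_per_wafer(256)   # RV770  ~256 mm^2 -> 234 candidate dies
print(gt200, rv770)
```

Roughly 2.5x more candidate dies per 300mm wafer before yield is even considered, and since smaller dies also yield better, the real cost gap is wider still; that is the "so much with so little" point in concrete terms.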
 

Kursah

Super Moderator
Staff member
Oh yeah, I agree that AMD have done a great job getting a lot of power out of very little and turning a profit on it. It's kinda like poker: you play the hands you're dealt, and if you play them right you'll have a nice stack of chips... eventually though, you're gonna make a large gamble, and the chances of losing that bet are great. They've done a great job. To me the GTX GPU is like the R600 was for ATI: large, powerful, ready to rock, just not fully optimized yet. The 200b series could be interesting to see; 55nm GTX cards could do some damage yet, and time will tell whether that'll be enough for this generation or not. Even if not... I love my GTX 260 and I'm still glad I got it. Based on what I wanted and needed, it still fits those shoes perfectly. I wish I had the money for a 4870 too, just to play around with both and decide after experiencing them which would truly be my best fit... but I think I made the right choice for me. :D

:toast:
 
Joined
Aug 9, 2006
Messages
1,065 (0.17/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
First of all, nVidia's GTX 200 GPUs are massive in large part because they sport massive numbers of ROP partitions, which are in turn responsible for such a large transistor count, which translates into greater manufacturing costs. The GTX 260 and GTX 280 GPUs have 28 and 32 ROP partitions respectively, while 4870/50 GPUs sport only 16, much like the 3870/50 and 2900XT, and even the X1950XT and X850XT before them. The primary reason AMD/ATI has refused to increase the ROP count, which has remained static since 2004 (since R400, that is), is that any increase would represent a massive jump in transistor count, which would in turn mean increased manufacturing and retail costs and, in the end, increased power consumption as well.

So instead of AMD/ATI GPUs being the kings of bang-for-buck as they are now (or were, for a short time back in June at least), a 4870 GPU with 32 ROP partitions would have a massive die and crazy amounts of transistors, would cost an arm and a leg, and would run even hotter than 4870 GPUs already do. But, like I said, AMD simply can't afford it. In the last two years their stock has dropped from around $26 per share back in December 2006 to $5.64 per share as of today (quick NYSE quote). ROP partitions are transistor hungry, something you can easily see if you examine the various die specs, and they take sizeable die real estate, which results in greater manufacturing costs.

All AMD could do with their latest GPUs is brute-force their way forward by increasing the shader unit count from the already massive 320 to now 800. That was a relatively cheap way to increase performance, since the shader units on AMD's GPUs are much simpler and more primitive than nVidia's and therefore much cheaper to tack on. In fact, if I remember correctly from an old TechReport article, AMD's shader units are quite a bit more limited in how they operate vs. nVidia's (MAD+MUL op ratio-wise). Although the GPU architectures are severely different, in some respects you can draw a crude estimate of minimum and maximum throughput rates. For one reason or another AMD GPUs perform very well in certain simple synthetic shader benchmarks, as seen in the very thorough reviews done by Digit-Life, yet that does not translate very well into actual real-world gaming results. Another testament to the simplicity of AMD's shader architecture.
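The crude throughput estimate mentioned above can be sketched numerically. A hedged example: the unit counts, clocks, and per-clock operation rates below are the commonly cited reference specs (assumed here, not taken from this thread): 240 scalar SPs at 1296 MHz with dual-issue MAD+MUL for the GTX 280, and 800 ALUs at 750 MHz with MAD for the HD 4870.

```python
# Peak single-precision shader throughput from reference specs (assumed):
#   GTX 280: 240 SPs  @ 1296 MHz, dual-issue MAD+MUL = 3 FLOPs/clock
#   HD 4870: 800 ALUs @  750 MHz, MAD                = 2 FLOPs/clock

def peak_gflops(units: int, clock_mhz: float, flops_per_clock: int) -> float:
    return units * clock_mhz * flops_per_clock / 1000.0  # MHz -> GFLOPS

gtx280 = peak_gflops(240, 1296, 3)  # ~933 GFLOPS
hd4870 = peak_gflops(800, 750, 2)   # 1200 GFLOPS
print(f"GTX 280: {gtx280:.0f} GFLOPS, HD 4870: {hd4870:.0f} GFLOPS")
```

The HD 4870's higher paper number alongside its mixed real-world results fits the point above: those 800 ALUs are grouped into five-wide VLIW units, so reaching peak throughput depends on the compiler keeping every lane busy, which simple synthetic shaders do far more readily than game workloads.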

I must admit, though, that AMD did fix the serious AA performance issues that had existed since the introduction of the 2900XT, by introducing some tweaks and more than doubling the texturing units. A smart and relatively cheap way to win back some performance under certain conditions. Although, by some reports in early 2007, there were in fact hardware bugs causing this. Either way, great job AMD/ATI.

To conclude: I'm certain that if AMD could afford it, right now they would be producing massive GPUs with beefy transistor counts and a complex shader architecture. The sad fact is, they simply CAN'T AFFORD IT. All they can do is tweak here and there and do the most economical things to boost performance. I'm sure nVidia is losing money on the GTX 200 lineup, since these GPUs must cost quite a bit to manufacture and yet, considering the launch prices, they are practically giving them away at this point.

Finally, this is getting way off topic. Some moderation maybe?
 

Kursah

Super Moderator
Staff member
Alright folks, I just updated the OP with some more information on GTX 260 BIOS modding. It's extremely easy, and I've provided links to the most up-to-date versions of NiBiTor, GPU-Z and NVFlash I'm aware of. I recommend using GPU-Z for backing up the BIOS, for the simple fact that I can't capture my GTX 260 BIOS in Vista x64 with NiBiTor... if you're able to do so, I'm sure that's perfectly fine.

Lemme know what you think and what you'd like added/removed/changed. And I'll post this here too: I'm not responsible for anyone attempting this, and neither is TPU, anyone in this forum, nor the manufacturer of the card. If you choose to do this, you are responsible, so pay attention; if you take the right steps, you'll be safe.

:toast:
 
Joined
Aug 9, 2006
My new CPU arrives tomorrow, so my primary machine with the GTX 260 in it will be back in action again. Right now I'm using a single-core Sempron backup machine with a Radeon X800 GTO and 512MB of RAM. Not much tweaking potential there. :)

I think I'll tweak the BIOS fan speeds on the GTX 260 ASAP. I want to set the fan to 100% at all times, since my all-steel case does a good job of soundproofing, and honestly I can't notice much of a difference between 40% and 100% when I have my headphones on with music or game noise blasting away.

I haven't tried the v-mod yet. I'm hoping to see more v-mod results in this thread before I attempt it. So, any pioneering individuals doing these v-mods, feel free to post your results.

Also, anyone looking to sell off their GTX 260: I'm looking for another one at this time, so send me a PM if you're interested.
 

Kursah

Super Moderator
Staff member
The fan mod works like a charm... I just modded a BIOS for 100% since I can deal with the noise; like you, most of the time my headphones are on anyway.

The vmod works fine; for me the OC results were so-so. I could attain higher stable OCs, but not by much at all. Like I've said before, in Extra mode, even at FTW stock clocks, idle temps rose around 3-5C, and that was consistent through load temps too, if not a couple more degrees.

So far I think I'm the only one who's actually performed the vmod, but I'm sure others will try and hopefully report back with results!

:toast:
 

Tatty_Two

Gone Fishing
I did the BIOS mod changing the "Extra" voltage from 1.12V to 1.18V... it does not seem to have made the slightest difference. I don't know whether that's because the card is hardware-limited to 1.12V, or because 0.06V is in actual fact so small an increase that it makes no difference to overclocks.
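As a back-of-the-envelope sanity check on what a 0.06V bump could change thermally (a sketch assuming dynamic power scales roughly with V² at a fixed clock, which is a simplification):

```python
# Dynamic power scales roughly with f * V^2; at a fixed clock, the
# 1.12V -> 1.18V BIOS bump is only a modest power (and heat) increase.
v_stock, v_mod = 1.12, 1.18
power_ratio = (v_mod / v_stock) ** 2
print(f"~{(power_ratio - 1) * 100:.0f}% more dynamic power")  # prints ~11%
```

An ~11% rise in dynamic power is modest, which squares with both the small temperature changes and the minimal overclocking headroom reported in this thread.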
 
Last edited:

Kursah

Super Moderator
Staff member
I did the BIOS mod changing the "Extra" voltage from 1.12V to 1.18V... it does not seem to have made the slightest difference. I don't know whether that's because the card is hardware-limited to 1.12V, or because 0.06V is in actual fact so small an increase that it makes no difference to overclocks.

Yeah, mine was pretty minimal really. What about temps? I noticed those immediately upon hitting the Extra clock speeds.

:toast:
 

Tatty_Two

Gone Fishing
Yeah, mine was pretty minimal really. What about temps? I noticed those immediately upon hitting the Extra clock speeds.

:toast:

No, no noticeable difference in temps either for me. I'm gonna flash mine back sometime, I think. I'm more than happy running 24/7 for gaming at 760 linked, though; I don't bench anymore, so I don't need to voltmod, otherwise the soldering iron would already be out!
 