
GeForce GTX 260 with 216 Stream Processors Pictured, Benchmarked

Gam'ster

New Member
Joined
Jan 13, 2007
Messages
979 (0.16/day)
Location
South Wales, UK
System Name Toshiba L350
Processor AMD QL-65
Motherboard Toshiba 780G/SB700
Cooling Laptop
Memory 3GB
Video Card(s) ATI HD3000
Storage 250GB
Display(s) 17" Widescreen
Case Black/Carbon effect
Audio Device(s) OnBoard
Power Supply Brick
Software WIN7
Tbh, I don't care if my card is ATI or Nvidia. I bought my 4850 because it was faster than the 3850 I had, at the same price I paid for the 3850... win.

Also, there is far too much fanboy slanging crap going on; play nice, guys. Does it really matter which is faster or cheaper, hotter or cooler? We are all PC fans in here, and it doesn't really matter if your card is ATI or Nvidia. Just enjoy what you bought and respect the fact that not everyone shares your opinion.

:toast: Three cheers for common sense. I really hate it when fanboi arguments spill into TPU; it really drags the place down. Also, on topic: this is good for us. A bit of a price drop here to compete, a bit more performance there; it's all good.
 

jaydeejohn

New Member
Joined
Sep 26, 2006
Messages
126 (0.02/day)
Let's not forget something important here. The 3xxx series from ATI had price reductions because it couldn't compete with nVidia and the G8/9 series. ATI took it in the shorts having to do that, and it also created a certain POV towards their product. The simple truth is, the tides have changed, and now it's nVidia having to drop prices to stay competitive. No one would be buying a G260, or a "G260 with 216 SPs, let's try this again" model, for $450 now, would they? nVidia is trying to get some market share here, and the only way they can do it is this way. Nothing wrong with that; the competition coming from ATI currently is tough. If it's released at the same price point, I think it's a win for everyone, as anyone should appreciate more bang for the buck. If, however, there are price increases associated with this card, it will not bode well for us, nor for nVidia, as once again we'd see a rename/charge-more-for-minimal-gain from nVidia, which has turned a lot of people off. I'm hoping nVidia gets it right this time, reduces the current G260 prices, and slips this new one in at the old price point. Then, like I said, everybody wins.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I only care about the prices over here; tell me why I should care about prices around the world :banghead: Isn't that logical enough? I'm not trying to convince anyone to buy the 4870 because I like it! My point was that in my situation the 4870 is the best choice. And for your information, until the 3850 I owned Nvidia cards and I was very pleased with their performance. DarkMatter, I think you misunderstood my point, and I liked your above-mentioned point. Sorry if I pissed you off; it was never intentional.

I know this is off topic, but it shows how immature you are. If I, being an American, had made the same comment about how the rest of the world doesn't matter, I would have been flamed to death.

Haha a little bit off topic here isn't it? ;)

About the "GTX260b" being a weakened GTX280: of course, that's exactly what it is. That's exactly what the GTX260 is.

But as for it being a way to sell off the GTX200 stock, I don't think it's only because of that. I have said this before: IMO, when Nvidia designs their chips, the goal is to make the chip so that the second card can be the same one with one cluster disabled. But for this you need good yields; if you don't have enough of them, you have to disable one more. Yields are the one thing you can improve a lot over time, so possibly right now disabling one cluster is enough to assure a high yield rate.

I don't know why everyone is trying to make some kind of big deal out of this; it's exactly what nVidia has been doing since the 6 series. They design their cores with this flexibility in them for this exact reason, so they can use the defective cores in lower cards. I'm amazed at the number of people who don't know nVidia does this (and ATi used to as well, I don't know why they stopped) and the number of people who think the GTX260 uses a completely different core than the GTX280. Usually nVidia's entire high-end tier is the same core; each graphics manufacturer only really puts out about 3 actual GPU cores each generation.
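Purely as an illustration of the binning described above, here is a back-of-the-envelope sketch. The cluster counts follow GT200 (10 TPCs of 24 SPs: 240 SPs with all 10 enabled, 216 with 9, 192 with 8), but the defect rate is an assumed number, not anything Nvidia has published:

```python
import random

# Assumed, illustrative numbers -- not real GT200 defect rates.
# GT200 has 10 TPC clusters of 24 SPs each: GTX 280 = 10 enabled (240 SPs),
# the new GTX 260 = 9 (216 SPs), the original GTX 260 = 8 (192 SPs).
CLUSTERS = 10
DEFECT_RATE = 0.05   # assumed chance that any single cluster is defective
TRIALS = 100_000

def simulate(trials=TRIALS):
    """Bin each simulated die by how many of its clusters came out working."""
    bins = {"gtx280": 0, "gtx260_216": 0, "gtx260_192": 0, "scrap": 0}
    for _ in range(trials):
        good = sum(random.random() > DEFECT_RATE for _ in range(CLUSTERS))
        if good == 10:
            bins["gtx280"] += 1
        elif good == 9:
            bins["gtx260_216"] += 1   # one cluster fused off
        elif good == 8:
            bins["gtx260_192"] += 1   # two clusters fused off
        else:
            bins["scrap"] += 1        # too many dead clusters to salvage
    return bins

bins = simulate()
for name, count in bins.items():
    print(f"{name}: {count / TRIALS:.1%}")
```

The point the sketch makes: with any realistic per-cluster defect rate, a meaningful fraction of dies come out with exactly one or two bad clusters, and selling those as cut-down cards turns scrap into revenue. As yields improve, the nine-cluster bin grows at the expense of the eight-cluster one.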
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I don't know why everyone is trying to make some kind of big deal out of this; it's exactly what nVidia has been doing since the 6 series. They design their cores with this flexibility in them for this exact reason, so they can use the defective cores in lower cards. I'm amazed at the number of people who don't know nVidia does this (and ATi used to as well, I don't know why they stopped) and the number of people who think the GTX260 uses a completely different core than the GTX280. Usually nVidia's entire high-end tier is the same core; each graphics manufacturer only really puts out about 3 actual GPU cores each generation.

Yeah that amazes me too.

I suppose ATI stopped doing it, first, because the architecture they decided to use didn't lend itself to it very well (you lose 25% of the chip with each cluster on R600/RV670, plus the ring bus and the arrangement of the TMUs), and at the same time because yields were high enough not to justify the move. That, along with the fact that they didn't push the fab process to its limits. Now for RV770, the only one of those arguments that is still valid is that they probably have good enough yields, as there's nothing preventing them from doing it again, so I dunno.

You have to take into account that they already have one SP for redundancy on each SIMD array, so I suppose that already works out well for them. Even if they have to throw away some chips, they probably save some money because they don't have to test every chip to find its working units, just how far in MHz it can go. You lose a bit in manufacturing, you save a bit on QA, I guess.
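The redundancy point above can be sketched numerically. All the figures here are hypothetical (an assumed 1% per-SP defect probability, 10 arrays each needing 16 working SPs); the only claim from the thread is that one spare unit per array exists. Even under these made-up numbers, the spare changes whole-die yield dramatically:

```python
from math import comb

# Hypothetical figures for illustration: 10 SIMD arrays, each needing
# 16 working SPs, with an assumed 1% chance any single SP is defective.
p = 0.01
arrays = 10
needed = 16

def array_yield(units, needed, p):
    """Probability that at least `needed` of `units` SPs are defect-free."""
    return sum(comb(units, k) * (1 - p) ** k * p ** (units - k)
               for k in range(needed, units + 1))

yield_no_spare = array_yield(16, needed, p) ** arrays    # no redundancy
yield_with_spare = array_yield(17, needed, p) ** arrays  # one spare per array
print(f"no spare: {yield_no_spare:.2f}, one spare: {yield_with_spare:.2f}")
```

With these assumed numbers, the fraction of fully working dies jumps from roughly 20% to nearly 90%, which is why built-in redundancy can make cluster-disabling SKUs unnecessary.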
 

Rapid

New Member
Joined
Feb 19, 2008
Messages
25 (0.00/day)
Processor Intel e6550 Oc'd to 3.17Ghz
Motherboard Gigabyte GA-N650SLI-DS4L
Cooling Akasa Evo Blue AK-922
Memory GeIL 2GB (2x1GB) PC2-6400C4 800MHz Black Dragon
Video Card(s) Sapphire ATI Radeon X1950XT 256mb
Storage 2 x Maxtor 250GB 16mb Cache (RAID 0)
Display(s) 2 x Iiyama Prolite 17inch LCD
Case Thermaltake Armour
Audio Device(s) Creative Audigy
Power Supply Hiper Type R- 580W
Software Windows XP
Dear God!

I read through this forum a lot, for reviews, tips/tricks, etc., and I definitely agree with the point made earlier that people get so fanboy'd up about things. From what I can see, many people stick their nose in and argue about things they know nothing about, other than the fact that they have some blind loyalty to a brand.

What people fail to realise is that being devoted to one company blinds them from the fact that another company may release a product that is better for them, cheaper etc.

I agree with having a healthy conversation / debate about the pros and cons of a product, but FFS stop getting so immature about it all.
 
Joined
Jul 19, 2006
Messages
43,587 (6.71/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Here is my issue. I read that someone used the term "Nv fans". Things like this should not be said, as they may be insulting, and they can flame into arguments, because fanboy talk is verbally aggressive. This detracts from the original purpose for which the thread was posted. Stay on topic, please.

nVidia's GTX280 is still better than ATi's cards

TY SO MUCH FOR THAT LAFF! Do you really still say you are not an Nvidia fanboy?:roll:

Has never heard of a 4870X2:roll:

Perhaps you should heed my warnings before posting.
 
Joined
Jun 1, 2006
Messages
1,745 (0.27/day)
Location
The Nevada Wasteland
System Name 9th Level
Processor AMD Ryzen 5 5600X
Motherboard MSI X570 Carbon wifi
Cooling EK Basic 360, x2 250mm, x1 140mm, x1 120mm fans.
Memory 32GB Corsair Vengeance 3200mhz.
Video Card(s) EVGA RTX 3080 12GB FTW3
Storage 500gb ssd, 2tb ssd, 6tb HD.
Display(s) MSI 27" Curved 1440p@165hz
Case HAF 932
Power Supply Corsair HX850W
Software Windows 10 64bit
Post something worthwhile; how does calling someone a fanboy help this discussion? You need to see his perspective, not just his post. Yes, the GTX 280 is better. For as low as $420, that's a hell of a card versus a $549 4870 X2, and there are reasons to back that statement. Try to read through the thread or make a credible argument, not "omg lolololol, fanboy".

Sorry Btr, but he's always bashing ATI. I just find it funny that he says he's not a fanboy.

next time I will post something worthwhile.;)

Also, it takes two GTX280s to beat one 4870X2, so it's $549 versus $840.
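The arithmetic behind that comparison, using only the prices quoted in this thread ($420 per GTX 280, $549 for the 4870 X2):

```python
# Prices quoted earlier in the thread: a GTX 280 for as low as $420,
# and a 4870 X2 at $549.
gtx280_price = 420
hd4870x2_price = 549

sli_cost = 2 * gtx280_price  # two GTX 280s in SLI to match one X2
print(f"Two GTX 280s: ${sli_cost} vs one 4870 X2: ${hd4870x2_price}")
```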
 
Joined
Jun 1, 2006
Messages
1,745 (0.27/day)
Location
The Nevada Wasteland
System Name 9th Level
Processor AMD Ryzen 5 5600X
Motherboard MSI X570 Carbon wifi
Cooling EK Basic 360, x2 250mm, x1 140mm, x1 120mm fans.
Memory 32GB Corsair Vengeance 3200mhz.
Video Card(s) EVGA RTX 3080 12GB FTW3
Storage 500gb ssd, 2tb ssd, 6tb HD.
Display(s) MSI 27" Curved 1440p@165hz
Case HAF 932
Power Supply Corsair HX850W
Software Windows 10 64bit
Perhaps you should heed my warnings before posting.

I've been at work all day sorry Erocker. I'll edit the post if you want.
 
Joined
Apr 21, 2008
Messages
5,250 (0.90/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crusial M4 256g raid0, 1TbWD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitior 2ms
Case CoolerMaster Chosmos II
Audio Device(s) Creative sound blaster X-FI Titanum champion,Creative speakers 7.1 T7900
Power Supply Corsair 1200i, Logitch G500 Mouse, headset Corsair vengeance 1500
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
The only point of beefing up the GTX 260 is to try to beat the 4870, and the original 4870 still wins in some tests. Still, 72 texturing units and 28 ROPs are interesting. But did Nvidia do this without thinking? Did they forget the GTX 280? I've seen the XFX GTX 260 XXX Edition tests and it's very close to the GTX 280; a new GTX 260 OC edition would surely beat the GTX 280, and at that point the GTX 280 becomes useless.
 