Monday, July 16th 2012

GeForce GTX 660 Arrives Mid-August: Report

NVIDIA's newest product designed to strike the price-performance "sweet spot," the GeForce GTX 660, is set for a mid-August market launch, according to a SweClockers report. The new chip could roll out sometime between August 13 and 19. Given that other Kepler-based SKUs have been launched on Tuesdays or Thursdays, it's likely that the launch date will be either the 14th or the 16th. The GTX 660 will be based on the 28 nm "GK104" GPU. It will feature 1,344 or 1,152 CUDA cores, and a 192-bit wide GDDR5 memory interface, holding 1.5 GB of memory, according to the report. The new GPU could capture a crucial sub-$300 price-point.
Sources: SweClockers, VideoCardz

78 Comments on GeForce GTX 660 Arrives Mid-August: Report

#51
phanbuey
bim27142"and a 192-bit wide GDDR5 memory interface, holding 1.5 GB of memory, according to the report. The new GPU could capture a crucial sub-$300 price-point"

Ahuh.... So theoretically how will this fare with a 256-bit 7870 (more or less of the same price) or even with a 256-bit 7850 (which is of course cheaper)? :confused:
It might spank them, or it might be the same speed. But I can guarantee that the series is aimed at the 7870 and 7850, so it definitely won't be slower.
Posted on Reply
#52
Benetanegia
I agree it can be nearly anything. But unless they use some absurdly low clocks, I can't see it being more than 25% slower than the 680, since it'll have exactly 25% of the chip disabled and performance never scales linearly with this kind of architecture; the 670 shows exactly the opposite. How much of that non-linearity carries over to the 660 is impossible to determine right now, but common sense says an architecture doesn't go from completely lacking linearity to absolute linearity with just one more disabled cluster. So if I had to say something, I'd say it will offer 80% of the performance of the 680.
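For what it's worth, here is the naive shader-count arithmetic behind that guess, as a quick sketch. The core counts are taken from the rumored specs in this thread (clocks are ignored), so treat the naive figures as a floor rather than a prediction:

```python
# Back-of-envelope scaling estimate, assuming the rumored core counts.
# Performance rarely scales linearly with disabled clusters, so the
# naive ratio is a lower bound, not a forecast.

CORES_680 = 1536  # full GK104
CORES_670 = 1344  # one SMX disabled
CORES_660 = 1152  # rumored: two SMX disabled

naive_670 = CORES_670 / CORES_680  # 87.5% of cores; real-world perf lands higher
naive_660 = CORES_660 / CORES_680  # 75% of cores; hence the ~80% guess above

print(f"GTX 670 naive scaling: {naive_670:.1%}")
print(f"GTX 660 naive scaling: {naive_660:.1%}")
```

Since the 670 keeps 87.5% of the shaders but benches well above that, an 80% estimate for a 75%-shader part is plausible rather than optimistic.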
Posted on Reply
#53
Nihilus
Speculating about "profit margins" based on die size is pointless, to say the least. The fact is, the market demands mid-range competition to drive higher sales for both companies. PERIOD!!
Posted on Reply
#54
Nihilus
sanadanosaWhy? Because they make mid end graphics cards that perform par or better than AMD's high end? better efficiency too and cheaper than AMD at launch? I think it's big plus for me as a consumer.
People on this forum seem to be disconnected from reality. A $400 GTX 670 is NOT a mid-range card. Neither is a $400 GTS 630 or a $400 GT 610. It's still a $400 rose by any other name. Releasing a $1000 EXTREME!! card months before their midrange is the real show of competence. :shadedshu
Posted on Reply
#55
bim27142
phanbueyIt might spank them, or it might be the same speed. But I can guarantee that the series is aimed at the 7870 and 7850 so it definitely won't be slower.
Hope it beats the 7870, or is at least on par with it but more power efficient; then this will be a winner... Otherwise, it's going to be a matter of brand loyalty I guess... :D
Posted on Reply
#56
Wrigleyvillain
PTFO or GTFO
bim27142Otherwise, it's going to be a matter of brand loyalty I guess... :D
Oh that never sells any GPUs... :rolleyes:
Posted on Reply
#57
sanadanosa
NihilusPeople on this forum seem to be disconnected from reality. A $400 GTX 670 is NOT a mid-range card. Neither is a $400 GTS 630 or a $400 GT 610. It's still a $400 rose by any other name. Releasing a $1000 EXTREME!! card months before their midrange is the real show of competence. :shadedshu
The GTX 670 is not a mid-range card considering its performance, but it uses a mid-range Kepler GPU, since 104 denotes NVIDIA's mid-range chip. About my post saying mid-end graphics card: I'm sorry, my bad. What I meant was mid-range GPU.
Posted on Reply
#58
Zubasa
sanadanosaGTX 670 is not a mid-range card considering its performance, but it uses mid-range kepler GPU since 104 means nvidia's mid-range chip.
That is all just speculation :rolleyes:
The fact that there is no GK100 but a GK110 suggests that it is meant for the GTX700 series from the beginning ;)
That, or the 28nm process is simply not ready for a GK100, so they scrapped it.

Regardless the GK104 is the highest end Kepler nVidia has right now and that is a fact.
Posted on Reply
#59
sanadanosa
ZubasaThat is all just speculation :rolleyes:
The fact that there is no GK100 but a GK110 suggests that it is meant for the GTX700 series from the beginning ;)
That, or the 28nm process is simply not ready for a GK100, so they scrapped it.

Regardless the GK104 is the highest end Kepler nVidia has right now and that is a fact.
Yes, you're right, it's just my speculation, based on AnandTech's review comparing the GK104 with the GF104; I think they tried to compare it with the same-level chip from the previous generation.
Posted on Reply
#60
Benetanegia
NihilusSpeculating about "profit margins" based on die size is pointless to say the least. Fact is the market demands mid-range competition to see higher sales for both companies. PERIOD!!
The market might demand whatever it wants; it cannot get it regardless. Mid-range and low-end are demanded by the largest number of people and as such are usually the largest markets by revenue, especially mid-range, because a nice ASP meets high sales volumes. News flash: there's a wafer-start shortage!! They cannot sell a million cards of any type even if there is market demand for them. They are going to sell a limited number of cards whether those cost $500 or $100, and because of that they only offered the $500 card first. Business-wise, Nvidia did the best thing for Nvidia.

And I hope your comment about die size was not related to my post, because I didn't say any such thing; I said quite the opposite, in fact. The thing is, GK104 cards are way cheaper to produce across the board: simpler PCB, fewer memory modules, simpler PWM, fewer components... especially the 670. And of course die size. Whoever thinks that a sub-300 mm^2 chip (15% smaller than GF104!!!), on a cheap 256-bit card, was Nvidia's biggest contender this round is completely deluded.
Posted on Reply
#61
micropage7
why don't they make the GPU like this

I'm afraid of crushing the die if I use an aftermarket cooler
Posted on Reply
#62
Elmo
st.boneNo they are far from being screwed, coz they are screwing you the consumer even harder
Everyone has to make money.
Posted on Reply
#63
Naito
CasecutterThese are to be "sub-$300 parts? offering, 192-Bit with only 1.5 Gb?
st.bone1.5GB at this time 192bit shame on you Nvidia, why do you let people down?
Dunno why people are so concerned about only 1.5GB VRAM on a 192-bit bus. My GTX 470 has a 320-bit bus, and only achieves 136GB/s @ 850MHz. If nVidia were to clock the GTX 660's memory at at least 1250MHz on the 192-bit bus (maybe approaching 125-130GB/s with 1500MHz), I'm sure it will have more than enough bandwidth.

What I'm trying to get at is that, even with a higher memory clock providing more bandwidth, my frame rate didn't show much of an improvement; bandwidth ain't everything (well, at least it isn't for me, at 1920x1080). And as for the VRAM capacity, 1.5GB should serve the card just fine. My 1280MB does just fine, and I can run pretty much every game maxed out.
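The arithmetic behind those bandwidth figures, assuming the usual GDDR5 convention of four transfers per pin per command-clock cycle, works out like this:

```python
# GDDR5 transfers four bits per pin per command-clock cycle
# ("quad-pumped"), so:
#   bandwidth (GB/s) = bus_width_bits / 8 * command_clock_MHz * 4 / 1000

def gddr5_bandwidth_gbs(bus_bits: int, clock_mhz: float) -> float:
    """Peak theoretical GDDR5 bandwidth in GB/s."""
    return bus_bits / 8 * clock_mhz * 4 / 1000

# GTX 470: 320-bit @ 850 MHz -> the ~136 GB/s quoted above
print(gddr5_bandwidth_gbs(320, 850))   # 136.0
# Hypothetical GTX 660: 192-bit @ 1250 MHz
print(gddr5_bandwidth_gbs(192, 1250))  # 120.0
```

So a 192-bit bus at 1250MHz would land at 120GB/s peak, which is in the same ballpark as the GTX 470 despite the much narrower bus.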
Posted on Reply
#64
Hilux SSRG
I think the GTX 660 is arriving late due to an overabundance of unsold 560-series variants and nvidia's hope to clear the channel as much as possible.

I am eager to learn how it shapes up against the 7800 series.
Posted on Reply
#65
Benetanegia
videocardz.com/33874/geforce-gts-650-is-3-faster-than-gts-550-ti

I'm starting to believe that the 660 might be based on GK106 after all, but it's not what everyone thought it was. That, or GK106 was scrapped altogether, which makes very little sense to me.

My new little theory is that GK106 took longer because it was slightly redesigned when GK100 was scrapped and GK104 took the high-end card position, in order to be able to compete in the performance class*. Early rumors speculated about GK106 being a 768 SP part, basically 2 clusters where GK104 has 4, with 2 SMX each. And it was always rumored to be a 192-bit part. Basically following the tradition set with the Fermi cards; fair enough, it made sense.

But now I think that GK106 might use the same clusters as GK110, with 3 SMX each for a total of 6, which would turn it into a 1152 SP, 192 bit chip. Exactly what rumors say for GTX660.

Like I said, the alternative is that GK106 won't be released, and that makes no sense to me. But with GK107 + GDDR5 taking the GTS 650 name, it looks very, very unlikely for it to exist if the GTX 660 is based on GK104 too. Yeah, the non-Ti 660 could be GK106, but only one SKU based on the chip is unlikely too: the non-Ti 660 theory has always been supported by harvested GK106 chips being the 650.

Well at least August will be interesting I guess.

* You just cannot compete if your mid-range/performance GPU is half the size of your high-end. For half a decade the second chip has offered at least 2/3 the performance, often 3/4. Going straight to 1/2 is suicide. Of course GK104 was THAT chip, just like GF104 and G94 served that purpose, but since GK104 became the one to go into high-end cards, a new such chip is required. IMO a 6 SMX GK106 would be it. It's well balanced, it doesn't require a lot of reworking because the 3 SMX design was already tested for GK100/110, and packing them into just 2 clusters saves a lot of space. The only "sacrifice" is geometry/tessellation compared to GK104, but that's irrelevant for mid-range because of this:



It would still beat a GTX 580 on pure tessellation.
Posted on Reply
#67
gopal
Is it better than the GTX 560 Ti 448 or the HD 7850?

EDIT: Wait, I thought the GTX 660 Ti was going to release in mid-August, not the GTX 660.
Well, I think the GTX 660 is better than the HD 7850, and the GTX 660 Ti is better than the HD 7870 and somewhere below the HD 7950.
Posted on Reply
#68
MxPhenom 216
ASIC Engineer
micropage7why don't they make the GPU like this
benchmarkreviews.com/images/reviews/video_cards/NVIDIA-GTX-460/NVIDIA-GeForce-GTX-460-GF104-GPU.jpg
I'm afraid of crushing the die if I use an aftermarket cooler
Why would you want a heatspreader on a GPU? One of the worst ideas Nvidia has thought of. Without a heatspreader you get better heat transfer and direct cooling of the die. And if you're afraid of crushing a die, you should definitely look into laying off the roids.
Posted on Reply
#69
D007
I bet it's going to be a sweet card.. 1.5 gig is a nice place to be in memory for a "mid range" card..
Posted on Reply
#70
MxPhenom 216
ASIC Engineer
D007I bet it's going to be a sweet card.. 1.5 gig is a nice place to be in memory for a "mid range" card..
The GTX 680/670 should have 4GB, and then the GTX 660 Ti etc. should have 2GB. That sounds a bit better, I think.
Posted on Reply
#71
INSTG8R
Vanguard Beta Tester
nvidiaintelftwGTX680/670 should have 4GB and then the GTX660ti etc should have 2GB. that sounds a bit better i think
Agreed. AMD has no problem offering plenty of RAM.
Posted on Reply
#72
shaglocc
Hmm

Maybe Nvidia is going to wait until they drop prices on the 680 and 670 before they come out with the 660, so they can lessen the gap in the price range and sell the 660 at $200-250. A 560 Ti still costs $200+; they might drop the price of the 560 Ti, replace its price range with the 660, and that opens a gap to slap a 660 Ti in for $50-100 more. Just a theory.
Posted on Reply
#73
INSTG8R
Vanguard Beta Tester
BenetanegiaSnip...


Did you even look at that graph??

How in the hell am I expected to believe a 7870 has better tessellation than a 7970??:wtf:
Kinda calls the rest of that graph's accuracy into question...
Posted on Reply
#74
Benetanegia
INSTG8Rtechreport.com/r.x/geforce-gtx-680/tessmark-x64.gif

Did you even look at that graph??

How in the hell am I expected to believe a 7870 has better tessellation than a 7970??:wtf:
Kinda calls the rest of that graph's accuracy into question...
Because the HD 7870 has the exact same dual engine as the HD 7970 and runs nearly 100 MHz higher. The benchmark is accurate.
Posted on Reply
#75
INSTG8R
Vanguard Beta Tester
BenetanegiaBecause the HD7870 has the exact same dual engine as the HD7970 and runs nearly 100 mhz higher. The benchmark is accurate.
Fair enough, it's a GHz 7870 then?
Posted on Reply