Thursday, November 10th 2011

Everything You Need To Know About GeForce GTX 560 Ti 448 Cores

On the 29th of this month, NVIDIA will launch its newest graphics card SKU, the GeForce GTX 560 Ti 448 Cores. We got our first sniff of it last month. Today we present to you all the specifications that matter: clock speeds, voltages, device IDs, etc., but first a brief history. NVIDIA launched the original GeForce GTX 560 Ti back in January, based on its spanking new GF114 silicon. It packed 384 CUDA cores, a 256-bit wide GDDR5 memory interface, 1 GB of memory, high clock speeds, and fairly decent overclocking potential. AMD's Radeon HD 6870 was "pwned" (NVIDIA's words). But then, AMD worked closely with its partners to create a 1 GB version of its Radeon HD 6950 graphics card. Coupled with diligent component cost balancing, AMD was able to neuter the GTX 560 Ti to a good extent. With the winter shopping season approaching, NVIDIA does not want to take any chances with its competitiveness in the $250-ish "sweet spot" segment, and hence it had to redesign the GTX 560 Ti.

The new GeForce GTX 560 Ti will come with the "448 Cores" brand extension, and as the name suggests, the GPU now has 448 CUDA cores as opposed to 384 on the original. The new SKU will use the same silicon on which the GTX 570, GTX 580, and dual-GPU GTX 590 are based: GF110. The chip will carry the marking "GF110-270-A1". Apart from the 448 CUDA cores, the new SKU will have a memory bus width of 320-bit and a standard memory amount of 1280 MB, just like the GTX 570. The GTX 560 Ti 448 Cores has clock speeds of 732 MHz core, 1464 MHz CUDA cores (shaders), and 950 MHz actual (1900 MHz DDR, or 3.80 GHz GDDR5 effective) memory. So the only thing that sets the new GTX 560 Ti 448 Cores apart from the GTX 570 is the CUDA core count (448 vs. 480 on the GTX 570).
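A quick back-of-the-envelope check of the numbers quoted above (our arithmetic, not official NVIDIA figures): the 320-bit bus at 3.80 GHz effective works out to the same peak memory bandwidth as the GTX 570, and the core-count gap is smaller than the naming suggests.

```python
# Sketch of the spec arithmetic, assuming the clocks and bus width quoted above.

def gddr5_bandwidth_gbps(effective_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_clock_ghz * bus_width_bits / 8

bw = gddr5_bandwidth_gbps(3.80, 320)
print(f"Memory bandwidth: {bw:.0f} GB/s")  # 152 GB/s, identical to the GTX 570

core_deficit = 1 - 448 / 480
print(f"CUDA core deficit vs. GTX 570: {core_deficit:.1%}")  # roughly 6.7%
```

In other words, on paper this card gives up less than 7% of the GTX 570's shader power while keeping its memory subsystem intact.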
Moving on to core voltage (NVVDD), the GTX 560 Ti 448 Cores has a similar range to the GTX 570: 0.950 V to 1.100 V ± 1%. The card will draw power from two 6-pin connectors. It's not likely that there will be an NVIDIA-reference board design, but even if there is, it will use a cost-effective cooler, similar to EVGA's GTX 570 HD. As for price, SweClockers suggests that it would be exactly in the middle of the price points the GTX 560 Ti (384 cores) and the GTX 570 occupy. Source: SweClockers

Update: After our post on Nov 08, we realised our table could ruffle some feathers at NVIDIA. So we temporarily took the story down and waited for some other source to post news with information resembling ours (yet not citing us). We could count on our pals at SweClockers for something like that.
Add your own comment

38 Comments on Everything You Need To Know About GeForce GTX 560 Ti 448 Cores

#1
newtekie1
Semi-Retired Folder
devguy said:
This is the nVidia thing to do when the competition gets too fierce for a certain part.

1) 2900XT vs 8800GTS 640 - The 8800GTS was the better performer when the 2900XT launched, but better drivers from AMD eventually led to a match, and nVidia rereleased the 8800GTS with 112 shaders (but at least they told partners to label it the SSC edition)

2) 4870 vs GTX260 - The GTX260 launched first, and not expecting the powerful card that was the 4870 (which got even better with newer drivers), nVidia freaked out and rereleased the GTX260 with 216 shaders (no new name or suffix)

3) Now the same deal with the 1GB 6950 is freaking nVidia out as it is priced similarly to the GTX560 TI, and often is able to outperform it.

This tactic is certainly nice for people who purchase the card later in its lifetime, but it certainly is confusing, and I bet if I was an early adopter of these cards, I might be a little miffed by this.
1.) The SSC naming was an eVGA thing, not nVidia's. nVidia actually told its AIBs not to market the card as a 112SP version, but said the AIBs could use special-edition type naming. The 112 shader version kept the 8800GTS 640MB name officially; the only way to know the difference was to look at the part number. And this was just a way to give the 8800GTS a secret boost and to sell off stock before the 8800GTS 512MB hit 2 months later. It wasn't really a response to the 2900XT getting better, it was too late in the game to worry about that, and they had G92 to answer that calling.

2.) This was actually when nVidia used a new name, the GTX260 216 was the new name of the card, and was on every box that had a 216 shader GTX260.
Posted on Reply
#2
Completely Bonkers
I'm disappointed TBH. After nearly a year of the current designs, I would have liked to see improvements in performance per x, where x = watt, or shader, or $.

But it fails on all accounts, sadly. Look at those yesteryear power and performance figures! This thing has WORSE performance/$ than the 560Ti. What a flunker flop!



Bring on Moore's law! Bring on a new generation!
Posted on Reply
#3
EastCoasthandle
Casecutter said:
How do you sell crippled "Cuda Core" chips without a new SKU?

As to unlocking, I don't recall any previous chips (aka GTX260 Core 216) allowing that. If they were chips that had the full shader count available, Nvidia wouldn't go through the hassle of shutting them off, along with going through the trouble of a new release. They'd just dump them on the AIBs and let them go into "de-contented" boards, selling them for less but with more profit. As for this GTX 560 Ti 448 version, it sounds like that's what they'll do with the PCB and power section, as the article said "It's not likely that there will be a NVIDIA-reference board design". So I read that as a de-contented non-reference board and cooler; right there is an approximate $25 savings.
They are preparing to compete with AMD's 7000 series. Until they are ready with their next-gen video cards they will realign their SKUs to look more competitive. They wouldn't need to create a new SKU if they were already ready with their next-gen part. That's the point I'm making here. Besides, they aren't using a reference board for it but using what they already have. That's not a savings, that's finding a way to be more competitive with their competition during the later portion of this gen of cards.
Posted on Reply
#4
xtremesv
If Nvidia wanted to compete through the shopping season, wouldn't it be easier to just cut prices a little bit for the current 560 Ti and 570 instead of creating a confusing new SKU? I know there could be lots of defective GF110 GPUs ready to be used, but are there enough for a product like the 560 Ti that is supposed to sell like hot pancakes?
Posted on Reply
#5
Zubasa
_JP_ said:
Stop dreaming. :slap:
This is nVidia we're talking about. If my RAM serves me correctly, as of now, that technique is doable with 3 ATi models (9550 -> 9600; X800PRO -> X800XT and recently HD6950 -> HD6970). I can't recall any nVidia card that could do this.
You need better RAM than that :laugh:
Some of the GTX 465 did unlock to GTX 470.
Posted on Reply
#6
wolf
Performance Enthusiast
These specs put it sooo darn close to a GTX570 it's not even funny. It definitely seems like the better buy than a GTX570 at 1/6th less price, only lacking 1/15th of the CUDA cores while keeping the rest. I bet some cards straight unlock to GTX570 spec.

This is a good advancement in price/perf IMO for Nvidia, as this card is going to be spitting distance from a GTX480, which isn't bad at all given how far away next gen is.

There is a problem in the comparison chart posted however, and I'm sure it's not TPU's fault, but the default memory speed on a GTX570 is also 3800 MHz, not 3600 MHz. Just amazing the only difference will be 32 SPs! What a huge leg up from GF114.
Posted on Reply
#7
The Von Matrices
btarunr said:
The voices are telling me $270~$290.

They want to compete with the HD 6950 2GB cards that are in the $279~$299 range. It will be tough for AMD to drive the 2GB card's cost below that (due to memory costs and the basic VRM design required for that GPU).
That's ridiculous. I bought my GTX 470's in March of 2010 for $270 each, and a year and a half later a basically identically specced card costs the same? I guess you can't really lower prices when you have a similarly sized chip based on the same process.
Posted on Reply
#8
Syborfical
DarkOCean said:
Why don't they call this GTX 565? It would make a lot more sense.
Agreed :-) GTX 565Ti

Nvidia are great at stuffing up names.

Anyone ever use an 8600GT with DDR2? It shouldn't have been called a GT ....
Posted on Reply
#9
micropage7
EastCoasthandle said:
What this says to me is that Nvidia isn't ready to release a next gen this year. And they are countering AMD's next-gen cards.
every time they release new stuff they always say the same thing 'this card will perform xxx better than other cards'
Posted on Reply
#10
_JP_
Zubasa said:
You need better RAM than :laugh:
Some of the GTX 465 did unlock to GTX 470.
Oh right, right. :banghead:
But I wouldn't give it any hope still. I mean, nVidia wants to keep selling GTX570s, right?
Posted on Reply
#11
Casecutter
Completely Bonkers said:
I'm disappointed TBH. After nearly a year of the current designs, I would have liked to see improvements in performance per x. x=watt, or shader, or $:

But if fails on all accounts, sadly. Look at those yesteryear power and performance figures! This thing has WORSE performance/$ than the 560Ti. What a flunker flop!
Great point. The GTX480, with basically the same performance, was $233 just weeks ago, so this doesn't carry value. Per-watt you're probably looking at much improvement, but over the 480 that's not saying anything.

EastCoasthandle said:
They aren't using a reference board for it but using what they already have. That's not a savings, that finding a way to be more competitive with their competition during the later portion of this gen of cards.
Well, 570's at the least bring strong 8-phase (or 6+2) power sections; could we see them try 4-phase? There are plenty of places to cut, like memory and coolers. Being only a "GTX560", Nvidia and their AIBs can do that, and such cut-backs will limit OC'ing. Let's face it, they don't want to eat away at their more premium offerings in that price range.

xtremesv said:
wouldn't be easy to just cut prices a little bit for the current 560 Ti and 570 instead of creating a confusing new SKU. I know there could be lots of defective GF110 GPUs ready to be used but are they enough for a product like 560 Ti that is supposed to be sold like hot pancakes?
Because that erodes the price structure for when Kepler-based chips arrive. Nvidia doesn't want to cut pricing anymore (nor does AMD); they hope to bring better performance and sell a GTX660 Ti at an MSRP of say $279, and a GTX670 for $350. Neither wants pricing to drop much more at this time; they need to hold it up until those next-gens are released. This is just more of a diversionary spoiler, throwing the fanbase sloppy seconds, while hoping to garner some of AMD's momentum on the release of their 28 nm part.

Right now I'd grab a Gigabyte GV-N570OC-13I Rev2.0 for $310 at the Egg rather than wait for one of these at $275. What's an extra 13% for the real deal? Maybe at <$250 these "448 CUDA" cards start to have a place, but let's wait for W1zzard to do a shakedown.
Posted on Reply
#12
94_xj
I don't see how people are criticizing this other than the confusing naming scheme. I would have gone for 565 Ti as well, but perhaps they're worried about it being seen as the flop that was the 465 (of which I owned one and was very pleased with its overclocking, but did an EVGA step-up to a 470).

To me, this is like the 470 being re-released on the more power-efficient GF110 with higher clock speeds, offering a lot of performance at a decent price point. Why is this a bad thing? Being based on GF110, these cards should be tri-SLI capable, which will expand options even more for upper-mid level gamers. If I wasn't waiting for Kepler I'd consider swapping my 470s for these for the lower power consumption and, I'm guessing, better overclockability.

To me this is a win-win. We get a stout performing card at a decent price, nvidia gets to sell GPUs that may not have made the cut for 570s.
Posted on Reply
#13
burebista
I've found a nice green chart, but take it with a grain of salt cuz it's...well...green. :D



On the other hand, it looks like there will be a shortage of those cards, so here in Eastern Europe we won't see them in stores. At least not at the beginning. :(
Posted on Reply
Add your own comment