Sunday, February 10th 2013

No New GPUs from AMD for the Bulk of 2013

AMD's product manager for desktop graphics products, Devon Nekechuk, revealed in an interview with Japanese publication 4Gamer.net that his firm won't be launching any new Radeon GPUs in 2013, and that the company will instead play out the year on its current Radeon HD 7000 series, with price adjustments and possible performance gains through driver updates. In a slide released to 4Gamer.net, AMD pointed out that its Radeon HD 7900 series (high-end), HD 7800 series (performance), and HD 7700 series (mainstream) will carry the company's mantle "throughout 2013."

This announcement is an indication that GPU makers have decided to slow down from the streak of rapid new GPU launches that ran from around 2007 through 2012, a pace that can be heavily taxing in R&D costs for both companies. We know for certain that NVIDIA is clearing its backlog of consumer GPU development by releasing the GeForce GTX "Titan" graphics card in a couple of weeks' time, and we know from older reports that NVIDIA could launch a "refreshed" GeForce lineup that largely retains the existing Kepler silicon while topping it up with subtle changes (clock speeds, software features that don't involve redesigning the silicon, etc.), but AMD coming out in the open with this announcement could change everything. NVIDIA now has the opportunity to save a few coins by sticking to its current lineup (plus the upcoming GTX "Titan") and responding to competition from AMD with price adjustments and timely driver optimizations of its own.

Source: 4Gamer.net

233 Comments on No New GPUs from AMD for the Bulk of 2013

#1
sergionography
by: TRWOV
It already comes with one SMX module disabled, doesn't it? The full chip is supposed to have 15 SMX AFAIK, but the core count suggests 14 active.
The Tesla chip does; I doubt the same will be true for the consumer GK110.
I remember it was the same thing during the GTX 480 era, where the pro cards had disabled parts.
The reason being, pro cards have to be guaranteed for 24/7 operation.
#2
Xzibit
by: TRWOV
It already comes with one SMX module disabled, doesn't it? The full chip is supposed to have 15 SMX AFAIK, but the core count suggests 14 active.
I don't know why anyone thinks it's 14 or higher if it has 1 SMX disabled.

C'MON MAN

K20X
14 SMX

K20
13 SMX

So if it has 1 disabled, it's a K20 that didn't sell (or was binned, but we need more detail on the chip and its differences).

Heck, it could have 1 more disabled relative to the K20 and be a 12 SMX part. W1zzard did allude that clock speed wasn't the only surprise, so there could be another one or more.

SURPRISE!!!
TITAN
12 SMX

It would follow the GK104 model of the 680, 670, and 660 Ti (GK110 model: K20X, K20, TITAN), disabling 1 SMX along with other components. TITAN could be a cut-down K20 GK110, like the GK104 in the 660 Ti is to the 670.

K20 GK110s could be salvaged by disabling 1 SMX or other components, and you get your limited TITAN product.
#3
Protagonist
by: Xzibit
I don't know why anyone thinks it's 14 or higher if it has 1 SMX disabled.

C'MON MAN

K20X
14 SMX

K20
13 SMX

So if it has 1 disabled, it's a K20 that didn't sell (or was binned, but we need more detail on the chip and its differences).

Heck, it could have 1 more disabled relative to the K20 and be a 12 SMX part. W1zzard did allude that clock speed wasn't the only surprise, so there could be another one or more.

SURPRISE!!!
TITAN
12 SMX

It would follow the GK104 model of the 680, 670, and 660 Ti (GK110 model: K20X, K20, TITAN), disabling 1 SMX along with other components. TITAN could be a cut-down K20 GK110, like the GK104 in the 660 Ti is to the 670.

K20 GK110s could be salvaged by disabling 1 SMX or other components, and you get your limited TITAN product.
And they call it GTX 685.
#4
HumanSmoke
by: Xzibit
I don't know why anyone thinks it's 14 or higher if it has 1 SMX disabled.
Probably because:
1. The full GK110 has 15 SMX at 192 cores each, and
2. When was the last time a Tesla (or Quadro) part featured more shaders than the GeForce variant? (Hint: never.)
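To make that concrete, here's a quick sketch of the arithmetic (Python; 192 cores per SMX is Kepler's published layout, while the 2,688-core Titan figure is only the rumored spec being discussed here, not a confirmed number):

CORES_PER_SMX = 192  # Kepler GK110: 192 CUDA cores per SMX

def active_smx(core_count):
    # Infer how many SMX units are enabled from a reported CUDA core count.
    return core_count // CORES_PER_SMX

print(active_smx(15 * CORES_PER_SMX))  # full GK110 die: 2880 cores -> 15 SMX
print(active_smx(2688))                # rumored Titan core count -> 14 SMX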
by: Xzibit
So if it has 1 disabled, it's a K20 that didn't sell (or was binned, but we need more detail on the chip and its differences).
...or, more likely, a higher-leakage GPU.
by: Xzibit

SURPRISE!!!
TITAN
12 SMX
Extremely unlikely :laugh: Tesla K20X/K20 are A1-revision GPUs... and GPUs invariably yield better as the process matures.
More likely 14 SMX (or possibly 15, if a fully enabled die falls outside Tesla's 225-235 W power envelope, and assuming the 15th SMX isn't built-in redundancy to improve yield).
by: GC_PaNzerFIN
I can guarantee you there will be like a ton of not-good-enough-for-Titan GK110s. It's a massive chip with poor yields ;)
And you know this how? Do you work for Nvidia or TSMC? There have been precisely two SKUs based on the GK110... one has 93% functionality, and the other 86.7%. By your logic, Tahiti's yields must also be poor, since its split is 100%, 87.5%, and 75%.
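Those functionality figures follow straight from the enabled-unit counts; here's a minimal sketch of the arithmetic (Python; the Tahiti CU counts of 32/28/24 for the HD 7970, HD 7950, and HD 7870 XT are the published specs, stated here purely for illustration):

def functionality(enabled, total):
    # Percentage of the die's shader clusters left enabled in a given SKU.
    return 100.0 * enabled / total

# GK110 (15 SMX on the full die): K20X = 14 SMX, K20 = 13 SMX
for smx in (14, 13):
    print(f"GK110 {smx}/15 SMX -> {functionality(smx, 15):.1f}%")

# Tahiti (32 CUs on the full die): HD 7970 = 32, HD 7950 = 28, HD 7870 XT = 24
for cu in (32, 28, 24):
    print(f"Tahiti {cu}/32 CU -> {functionality(cu, 32):.1f}%")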
#5
qubit
Overclocked quantum bit
by: Xzibit
I don't know why anyone thinks it's 14 or higher if it has 1 SMX disabled.

C'MON MAN

K20X
14 SMX

K20
13 SMX

So if it has 1 disabled, it's a K20 that didn't sell (or was binned, but we need more detail on the chip and its differences).

Heck, it could have 1 more disabled relative to the K20 and be a 12 SMX part. W1zzard did allude that clock speed wasn't the only surprise, so there could be another one or more.

SURPRISE!!!
TITAN
12 SMX

It would follow the GK104 model of the 680, 670, and 660 Ti (GK110 model: K20X, K20, TITAN), disabling 1 SMX along with other components. TITAN could be a cut-down K20 GK110, like the GK104 in the 660 Ti is to the 670.

K20 GK110s could be salvaged by disabling 1 SMX or other components, and you get your limited TITAN product.
No, that's not right. The GK110 has 15 SMX units and is the chip used in the K20 card.

The apparently cut-down version used in the upcoming Titan has one SMX disabled, making 14 SMX units.

NVIDIA's official whitepaper, which shows this, can be downloaded from here.
#6
Aceman.au
Well, guess I'm buying a Titan then. Thanks for making it an easy choice (of course I'll look at the numbers against my 7970s before buying, though).
#7
qubit
Overclocked quantum bit
by: Aceman.au
Well, guess I'm buying a Titan then. Thanks for making it an easy choice (of course I'll look at the numbers against my 7970s before buying, though).
It's gonna cost like a GTX 690. Are you prepared to pay that kind of money?
#8
Aceman.au
by: qubit
It's gonna cost like a GTX 690. Are you prepared to pay that kind of money?
Yeah. The performance jump would have to be relatively high for me to buy it though.
#9
qubit
Overclocked quantum bit
Good for you. :toast:

I wish I could buy it - I'd seriously geek out with a card like this. :D
#10
leopr
I sincerely hope the Titan will outperform my CrossFire 7970s @ 1200/1700; if that's the case, now would be the moment to get rid of my CrossFire setup.
#11
NeoXF
^ OK people, enough with the GK110 talk; there are already three or more threads about it in the news section alone.

On to the subject at hand. I'm all for it... as long as it's not because AMD is in a bad rut. Otherwise I'm pretty sure nVidia will follow suit; do you guys really think they'd shoot themselves in the foot like that? Release the GeForce Titan for $900, then one or two quarters later release a GTX 780 for $500 that offers the same performance? In either case, both nVidia and AMD have to release something mighty amazing when the time comes (Q4 2013+).
#12
SIGSEGV
Well, some people have a lot of money to buy those Titans, and that's great :laugh:, but I have no reason to spend a ton of money on a Titan (benchmarks? I don't care about benchmark scores lol). Instead, I'd buy a next-gen console, especially the PS4, to replace my current console (PS3), for around $350-$400. :rockout:

I'd like to say thanks to AMD for no new GPUs in 2013. :rockout:
#13
buggalugs
Oh well, I guess I can save some money on GPUs this year.
#14
Axaion
One thing all the AMD guys keep forgetting when posting those higher-framerate benchmarks for AMD is

"FRAME LATENCY"

Let the massive excrement war begin; may the dank side be victorious.
#15
Eagleye
by: Axaion
One thing all the AMD guys keep forgetting when posting those higher-framerate benchmarks for AMD is

"FRAME LATENCY"

Let the massive excrement war begin; may the dank side be victorious.
Think you missed the memo where AMD fixed "FRAME LATENCY" :slap:

by: GC_PaNzerFIN
It is codenamed as a different generation than GK104. Unless you are claiming they will never release anything but the top model with that chip and will throw away all the not-quite-good-enough chips, I can't see how this would be any different. Sure, the price is higher this time, but the competition has moved on to playing with consoles, so...
From the info out there, the GK110 was made before the GK104.

by: HumanSmoke
And you know this how? Do you work for Nvidia or TSMC? There have been precisely two SKUs based on the GK110... one has 93% functionality, and the other 86.7%. By your logic, Tahiti's yields must also be poor, since its split is 100%, 87.5%, and 75%.
It took Nvidia roughly 6 months to supply Amazon with roughly 3,000 cards. :eek:

by: newtekie1
Actually, I think nVidia was very ready with the card, but waited for AMD to make the first move so they would know what they had to compete against. The weak showing by AMD this round is why nVidia ended up releasing what they had planned to be a mid-range card as the actual high-end card. They didn't need to release GK110, so they held it back to retaliate against whatever AMD had planned next. AMD's lame step of simply boosting clock speeds meant nVidia didn't really need to retaliate.
It took Nvidia 6-7 years to design and make the Kepler chip, so how they quickly put one together in 3-4 months after seeing the AMD 7-series is beyond me. :confused:
#16
EarlZ
This is bad news... it may mean that Nvidia will sell at higher prices since there is no competition :(
#17
Axaion
by: Eagleye
Think you missed the memo where AMD fixed "FRAME LATENCY" :slap:

From the info out there, the GK110 was made before the GK104.

It took Nvidia roughly 6 months to supply Amazon with roughly 3,000 cards. :eek:

It took Nvidia 6-7 years to design and make the Kepler chip, so how they quickly put one together in 3-4 months after seeing the AMD 7-series is beyond me. :confused:
Indeed I have; please provide a source, as I can't freaking find it.
#18
EarlZ
by: Eagleye
Think you missed the memo where AMD fixed "FRAME LATENCY" :slap:
Link to a credible source please
#19
okidna
by: EarlZ
Link to a credible source please
Since Catalyst 13.2 Beta 3: http://support.amd.com/us/kbarticles/Pages/AMDCatalyst132BetaDriver.aspx
AMD Catalyst 13.2 Beta 3 for Windows
  • Improves performance up to 15% in high MSAA cases for the Crysis 3 beta release
  • Future updates will be made to AMD Catalyst 13.2 Beta to further improve performance in Crysis 3 for both single GPU and CrossFire configurations
  • Significantly improves latency performance in Skyrim, Borderlands 2, and Guild Wars 2
  • Improves single GPU performance up to 50% for DmC Devil May Cry
  • Improves CrossFire performance up to 10% for Crysis 2
  • Resolves texture flickering seen in DirectX 9.0c applications.
TechReport did a review of this driver: http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times
#21
NeoXF
I keep thinking... who honestly thinks nVidia would offer anything new other than this Titan thing this year? I mean, hello, even if it is $900, that'd be shooting themselves in the foot; the theoretical GTX 780 would have to end up faster than "Titan"... Maybe even be based on Maxwell?

Either way, there's a chance of refreshes (within this "gen") from both sides.
#24
wowman
Possibly AMD is getting cold feet and is shaken by Titan's performance numbers.
#25
TheMailMan78
Big Member
I think AMD just showed the hand of the industry as a whole. Mobile and consoles are the future, and aggressively pursuing a shrinking market (desktop PCs) isn't a smart move. NVIDIA has to, though; they hold no real advantage at this point except in the dedicated GPU market.