Monday, February 4th 2013

It's Sony, Not AMD in GeForce Titan's Crosshair

When we first heard that NVIDIA would launch its GK110-based consumer graphics card as early as February, it took us by surprise. Intimidating naming (GeForce Titan 780?) aside, the card hopes to beat NVIDIA's current-generation flagship, the dual-GPU GeForce GTX 690, with a single GPU. But does the graphics card market really need NVIDIA to launch this card right now? Perhaps not, but the answer lies not with AMD and competition in the graphics card market, but with Sony, and competition between the PC and console platforms.

Over the weekend, it surfaced that Sony would introduce its next-generation PlayStation console (codenamed "Orbis") later this month, marking the beginning of the next generation of game consoles. The PlayStation 4 features an updated hardware feature-set, and promises to raise the bar on graphics detail, which the console industry has held down with an iron fist for the past half-decade. This presents a challenge not only for NVIDIA, but for PC gaming in general. Here's how.

It's no secret that PC graphics have always trumped consoles, but lost out on the "cost factor." Console advocates falsely compare the cost of an entire PC (approaching or crossing $1,000) with that of a $300 console. In our opinion, marketing honchos at both NVIDIA and AMD have failed to adequately make the argument that a graphics card, as a single component, costs about the same as a game console, and transforms the desktop computers average households already own into gaming PCs.

With the introduction of the next-generation PlayStation "Orbis," PC graphics companies such as NVIDIA need to launch new products to remind the masses that PC gaming looks, feels, and plays better than consoles, even the newest ones on the block. NVIDIA just happened to have the GK110 lying around.

The GeForce Kepler 110 (GK110) is NVIDIA's (and possibly the industry's) biggest GPU. Conceived when TSMC's 28-nanometer silicon fabrication process was relatively new and prone to yield problems, it was put on the back burner when NVIDIA realized its second-fastest chip, the GK104, stood a real chance against AMD's high-end "Tahiti" GPU. Even as 2013 approached, the most audacious speculators in the press were led to believe that NVIDIA would take its time launching the GK110 with its GTX 700 series, some time well after February. What changed? For one, Sony and Microsoft charted out their next-generation console launch schedules so that each company's products get maximum market exposure, and that is bad for the PC platform.

The GeForce "Titan" 780 GK110 card, then, is NVIDIA batting not only for its own GeForce brand (which already leads AMD's Radeon in the PC space), but for PC gaming in general. We don't expect to see crates full of these graphics cards making their way to stores just yet, but rather a textbook NVIDIA launch. Over the past decade, NVIDIA has learned that when it has limited initial inventory of a new product and yet wants to avoid the dunce cap of a "paper launch" (a launch that exists only on paper, with no public availability), it pools just enough units for the worldwide press (for launch-date reviews) and for limited launches in key markets such as the US and EU.

86 Comments on It's Sony, Not AMD in GeForce Titan's Crosshair

#1
theoneandonlymrk
I'm going with monetizing chips too; it makes sense, plus it actually uses the IP up to its revenue potential.
I see bta's point, but I think it's probably much simpler.

CASH. Damn phone.
Posted on Reply
#2
qubit
Overclocked quantum bit
by: TheMailMan78
Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price gouge due to market demand. Well, welcome to the realities of the real world. Market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1000 GPU's.
Yes, I guess you could call it "market forces," and unfortunately that's what happens when there's not enough competition, i.e. a lack of said forces. That's the real problem here.

And how DARE you suggest I'd do the same thing! My heart is pure and honest and of course I'd never think of price gouging my customers like this. Never!

Mods! Do something about mailman for such insolence! j/k :laugh:
Posted on Reply
#3
erocker
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
Posted on Reply
#4
Zubasa
by: erocker
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
Funny thing is, AMD is supposed to be in every new console coming up :p
Posted on Reply
#5
jihadjoe
by: TheMailMan78
Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price gouge due to market demand. Well, welcome to the realities of the real world. Market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1000 GPU's.
I think the current generation of GPUs proves that "competition" can work against the consumer.

AMD decided its 6970 replacement should be priced at $550 because it was faster than the GTX 580, never mind that it was replacing a $375 part. Nvidia then followed suit: when it found that the replacement for its $250 part was competitive with AMD's $550 offering, it decided to launch at $500 instead of properly replacing its previous midrange offering.

If AMD wasn't around during Kepler's launch, I imagine Nvidia would've acted like Intel and simply refreshed its existing products at their established price points: GK104 badged as a 560 Ti and priced at $250. Likewise, if the GTX 580 didn't exist when AMD was launching the 7970, it probably would've launched at exactly the same price as the 6970.
Posted on Reply
#6
jagd
I don't know about 240 FPS. Could it be for stereoscopic 3D? That has been talked about for the PS4, IIRC.

by: 3870x2
Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
Posted on Reply
#7
HumanSmoke
by: Rebel333
I think Nvidia will fail soon; their Tegra 4 is garbage and they lost the Xbox 720 and PS4. I actually do not mind, because they're the ones who destroyed 3dfx in a disgusting way...
Garbage. 3dfx killed 3dfx. It just goes to prove that if enough uninformed people say the same thing, it gets passed off as "truth".

3dfx's problems are actually pretty well documented.
- Kicking their board partners in the nuts by deciding to manufacture and sell their own cards when they bought STB. Taiwanese TSMC and UMC quality >>>>>>>> Juarez, Mexico quality.
- A management that burnt money for a hobby.
- Dumping 20% of their workforce, then (over)paying $186M for GigaPixel.
- Slow and overhyped hardware development. Massive delays with the Rampage chip, while competing cards quickly overhauled 3dfx's raw performance and added features like 32-bit colour, T&L, and support for textures larger than 256x256. 3dfx's Voodoo line was already being overhauled by Matrox's G400, Nvidia's TNT2/GF2 GTS, ATi's Radeon, and STMicro's Kyro.

You might also remember that not only was 3dfx not turning a profit when they sold their IP to Nvidia, they were effectively bankrupt, with the prospect of their 3.3V-only cards losing compatibility with newer AGP slots.

So: great cards for their day, the vanguard of PC gaming (at least the 3D part), with some awesome tech (AA, the T-Buffer, etc.). But 3dfx's problems stemmed from bad money/resource management, an over-reliance on performance artificially inflated by GLIDE, and a slower product cycle and slower feature-set additions than the competition.
Posted on Reply
#8
Crap Daddy
by: hardcore_gamer
http://www.nextpowerup.com/news/697/nvidia-in-trouble-with-tegra-4-chips.html
Don't believe all that you read:
http://seekingalpha.com/article/1152151-don-t-believe-this-nvidia-rumor?source=yahoo
by: Zubasa
Funny thing is, AMD is supposed to be in every new console coming up
Steambox, Shield, Ouya, and God knows what else will also be gaming consoles. For better or worse, it remains to be seen. Everybody nowadays is building a gaming console, but yes, Nvidia lost big with the established consoles.
Posted on Reply
#9
SIGSEGV
Why the heck doesn't NVIDIA fight fire with fire?
I agree with what erocker already said:
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
In my view, if they really intend to fight Sony in a console war, they need some fire to fight with. A console is very different from a PC in many aspects, and they serve different market segments. All in all, this would surely be good for us as PC gamers (not console gamers) if NVIDIA releases its Titan card to the market this February.
NVIDIA not only batting for its own GeForce brand
wrong move


by: Rahmat Sofyan
--snip--
meh..
Posted on Reply
#10
TRWOV
by: 3870x2
Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
I think he's referring to console games. Most PS3/360 games run at 30 FPS; a handful run at 60 FPS. PC games usually don't have an FPS cap.
Posted on Reply
#11
[H]@RD5TUFF
Good. $ony needs to be taken down several pegs with their nasty, consumer-screwing DRM.
Posted on Reply
#12
tastegw
Fun thread, wish we could all be in the same room talking about this, I'd bring lots of popcorn with me.

I'm still going with wishful thinking, bring on the Titan!
Posted on Reply
#13
sergionography
It's true gaming desktops are expensive, but they're for much more than just gaming. The idea of graphics cards is that you just buy one, throw it in the desktop you already have, and there you have a gaming computer. But with mobile taking over, and with the lack of any modular, upgradable designs in laptops, that's where the challenge comes in.

I think if AMD and Nvidia truly want to compete with consoles, they need to make a solution for external GPUs, so that it works on anything using USB 3.0/DP or whatever.
Posted on Reply
#17
xenocide
by: Zubasa
Funny thing is AMD is suppose to be in every new console coming up :p
It's not like that will make them a ton of money. There's a reason Nvidia doesn't care to get involved in consoles: they don't make a ton of money off of it. For Nvidia to make a noticeable profit off a game console, they have to sell the GPU above cost by a decent amount, and the console has to sell as many units as the Wii did at first, or as the PS2 did. Otherwise they are just breaking even after you factor in R&D and manufacturing/testing.
Posted on Reply
#18
TRWOV
When nVidia actually managed to make money on consoles (the first Xbox), MS gave them the boot the next time around. Basically, the contract stated a fixed price per GPU for the whole five-year period, so as time went on, nVidia's margins started to widen, but MS had to do price cuts in order to keep up with Nintendo and Sony. Since then, console manufacturers have wised up and added clauses for periodic price revisions and such.

AMD was present in both the Wii and the Xbox 360, but that hardly did them any good; those kinds of design wins only help pay the bills and keep the machine going.
Posted on Reply
#19
overpass
The console manufacturers just buy designs off AMD and source the chips themselves at fixed prices, I reckon. AMD may gain some traction from this: console game developers (Japanese developers especially) will be familiar with the architecture of AMD's graphics engines, making the porting/optimization job that much more convenient. Perhaps AMD releasing bundles left and right has something to do with this new relationship, or is a portent of things to come.

Nevertheless the Titan will definitely be a sight to behold. Maybe AMD should rename their next chips Saturn, or Olympia.
Posted on Reply
#20
Naito
by: qubit
Thus, we've paid top dollar for a mid range card, pushing up the price of the true top GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards nvidia for doing this, then you should.
I feel no resentment here. As long as I get the performance I paid for, I have no problem paying for a GPU that technically should have been high mid-range. My card sits exactly where I think it should, in terms of performance, for what I paid for it. Maybe the resentment you feel is misdirected and you are actually feeling it for AMD, who dropped the ball with the HD7000 series, causing nVidia to do such a thing.
Posted on Reply
#21
xenocide
by: TRWOV
When nVidia actually managed to make money on consoles (the first Xbox), MS gave them the boot the next time around. Basically, the contract stated a fixed price per GPU for the whole five-year period, so as time went on, nVidia's margins started to widen, but MS had to do price cuts in order to keep up with Nintendo and Sony. Since then, console manufacturers have wised up and added clauses for periodic price revisions and such.

AMD was present in both the Wii and the Xbox 360, but that hardly did them any good; those kinds of design wins only help pay the bills and keep the machine going.
That was what Nvidia had to do to actually make good money off consoles. The Xbox actually sold pretty well considering how short a time it was on the market that generation. I also remember there being something of a disagreement between Nvidia and Microsoft; apparently Nvidia gave them very short notice before discontinuing production of the GPUs used in the original Xbox. All of that, combined with ATi at the time offering more power-efficient GPUs at a better rate, led to most companies choosing ATi.

by: overpass
The console manufacturers just buy designs off AMD and source the chips themselves at fixed prices, I reckon.
This is how they do it now, because it's a lot more cost-efficient.

by: overpass
AMD may gain some traction from this: console game developers (Japanese developers especially) will be familiar with the architecture of AMD's graphics engines, making the porting/optimization job that much more convenient. Perhaps AMD releasing bundles left and right has something to do with this new relationship, or is a portent of things to come.
The biggest benefit would be if Microsoft and Sony went x86 for their new consoles. It would make porting games between PC and console almost seamless. I still have a suspicion that one of them will cling to PowerPC for the sake of backwards compatibility, but this time I actually hope they prove me wrong. The x86-64 architecture combined with DirectX would make porting games between consoles and PCs trivially easier.

by: Naito
I feel no resentment here. As long as I get the performance I paid for, I have no problem paying for a GPU that technically should have been high mid-range. My card sits exactly where I think it should, in terms of performance, for what I paid for it. Maybe the resentment you feel is misdirected and you are actually feeling it for AMD, who dropped the ball with the HD7000 series, causing nVidia to do such a thing.
Pretty much where I stand. I bought a GTX 670 knowing it would be an upper-mid-level card, and it has not let me down yet. I wish I had held out for a DCII or Windforce version, but the money was burning a hole in my pocket, and it's no louder than my HD 5850 was anyway. If Nvidia had released GK100 and it was as powerful as they made it out to be, it would have meant either $750-1,000 cards, or AMD getting crushed, with Nvidia then going unopposed or having to help keep AMD afloat to prevent anti-trust lawsuits.
Posted on Reply
#22
SIGSEGV
XBOX 720's GPU Specifications leaked online.

This GPU is rumoured to be able to do 2x, 4x, and even 8x MSAA. What a beast of a console.
I wonder what the GPU specification inside the PlayStation 4 is.

So now I understand why NVIDIA seems so desperate to release their Titan.
Posted on Reply
#23
Naito
by: SIGSEGV
XBOX 720's GPU Specifications leaked online.
Going off FLOPS only, that is theoretically ~5x more powerful than Xbox 360 Xenos.
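The ~5x figure is simple arithmetic on the leaked numbers: Xenos is commonly cited at ~240 GFLOPS, and the leak put the next-gen Xbox GPU at ~1.2 TFLOPS (the latter is a period rumor, not an official spec). A quick sanity-check of that ratio:

```python
# Back-of-envelope check of the "~5x Xenos" claim.
# Figures: Xbox 360 "Xenos" ~240 GFLOPS (commonly cited),
# rumored next-gen Xbox GPU ~1.2 TFLOPS (leaked, unofficial).
xenos_gflops = 240
rumored_gflops = 1200  # 1.2 TFLOPS expressed in GFLOPS

ratio = rumored_gflops / xenos_gflops
print(f"~{ratio:.0f}x the raw FLOPS of Xenos")  # prints "~5x the raw FLOPS of Xenos"
```

Raw FLOPS ignores architectural efficiency gains, so the real-world gap could be larger.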
Posted on Reply
#24
Zubasa
by: SIGSEGV
XBOX 720's GPU Specifications leaked online.

this gpu is rumoured able to do 2x, 4x and even 8x MSAA. what a beast console box.
i wonder what gpu specification inside the playstation 4 is.

so, i understand why nvidia seems so desperate releasing their titan.
I don't really see nVidia being desperate at all.
All they are doing is clearing out their scrap GK110 stock anyway: the chips that don't make the cut for Tesla, etc.
Posted on Reply
#25
buggalugs
Could be true. With 4K TVs going mainstream this year, we're going to need some powerful cards. Maybe Sony is going to market their new 4K TVs with the new PlayStation.
Posted on Reply