Monday, February 4th 2013

It's Sony, Not AMD in GeForce Titan's Crosshair

When we first heard that NVIDIA would launch its GK110-based consumer graphics card as early as February, it took us by surprise. Intimidating naming (GeForce Titan 780?) aside, the card hopes to better NVIDIA's current-generation flagship, the dual-GPU GeForce GTX 690, in a single-GPU package. But does the graphics card market really need NVIDIA to launch it right now? Perhaps not, but the answer lies not with AMD and competition in the graphics card market, but with Sony, and competition between the PC and console platforms.

Over the weekend, it surfaced that Sony would introduce its next-generation PlayStation console (codenamed "Orbis") later this month, marking the beginning of the next generation of game consoles. The PlayStation 4 features an updated hardware feature-set, and promises to raise the bar on graphics detail, a bar the console industry has held down with an iron fist for the past half-decade. This presents a challenge not only for NVIDIA, but for PC gaming in general. Here's how.

It's no news that PC graphics have always trumped consoles, but lost out on the "cost factor." Advocates of consoles falsely compare the cost of an entire PC (approaching or crossing $1,000) with a $300 console. In our opinion, marketing honchos at both NVIDIA and AMD have failed to adequately present the argument that a graphics card, as a single component, costs about the same as a game console, and transforms the desktop computers average households already own into gaming PCs.

With the introduction of the next-generation PlayStation "Orbis," PC graphics companies such as NVIDIA need to launch new products to remind the masses that PC gaming looks, feels, and plays better than consoles, even the newest ones on the block. NVIDIA just happened to have the GK110 lying around.

The GK110 is NVIDIA's (possibly the industry's) biggest GPU. Conceived when TSMC's 28-nanometer silicon fabrication process was relatively new and prone to yield problems, it was put on the back-burner once NVIDIA realized its second-fastest chip, the GK104, stood a real chance against AMD's "Tahiti" high-end GPU. Even as 2013 approached, even the most audacious speculators in the press were led to believe that NVIDIA would take its time launching the GK110 with its GTX 700 series, some time much later than February. What changed? For one, Sony and Microsoft agreed to chart out their next-generation console launch schedules so that each company's products get maximum market exposure, and that is bad for the PC platform.

The GeForce "Titan" 780 GK110 card, hence, is NVIDIA batting not only for its own GeForce brand (which already leads AMD's Radeon in the PC space), but for PC gaming in general. We don't expect to see crates full of these graphics cards making their way to stores just yet, but rather a textbook NVIDIA launch. Over the past decade, NVIDIA has learned that when it has limited initial inventory of a new product and wants to avoid the dunce cap of a "paper launch" (a launch that exists only on paper, with no public availability), it pools just enough units of the product for the worldwide press (for launch-date reviews) and limited launches in key markets such as the US and EU.

86 Comments on It's Sony, Not AMD in GeForce Titan's Crosshair

#1
Easy Rhino
Linux Advocate
consoles have the one thing that gamers want that desktops will never have: ease of use.

it takes a lot more time and effort to maintain a desktop where a console is automatic. less parts to fail, less to worry about. flick the switch on and you are gaming.
#2
RejZoR
Hopefully the prices won't be as idiotically high as they have been until now, because I wouldn't mind going with GeForce this time around...
#3
TheMailMan78
Banstick Dummy
by: Easy Rhino
consoles have the one thing that gamers want that desktops will never have: ease of use.

it takes a lot more time and effort to maintain a desktop where a console is automatic. less parts to fail, less to worry about. flick the switch on and you are gaming.
RROD victims would disagree.
#4
Slizzo
by: FordGT90Concept
Doesn't the PlayStation 4, Wii U, and next Xbox all have an AMD GPU? NVIDIA is literally only competing in the computer market so they have to hit hard. Makes sense why NVIDIA would launch a monster--their revenue stream is in danger of drying up.
Hardly drying up. They're killing it in the mobile market with Tegra, and a lot of the GPU development work feeds into that program as well. They're also still performing well in the desktop GPU market.

If anything, nVidia is in a much stronger position now than they've ever been in the past.
#5
Siskods9
I think the new consoles will have more in-game physics effects with all those CPU cores... So much for PhysX, NVIDIA :laugh:

With Sony and MS both aiming for 60 FPS @ 1080p (well, they should be!), and with consoles using hardware more similar to PCs than before, we should in theory see better-looking ports, but how they will play and perform on PC in terms of the quality of the ports remains to be seen.

As DX9 was the previous standard for consoles, does anyone know if the next gen console games will be developed using DX10 or DX11 etc?

As both new consoles will use Bluray, will this mark the end of DVD games as a format on PC too? (I know there are other factors like digital distribution too).
#6
brandonwh64
Addicted to Bacon and StarCrunches!!!
by: TheMailMan78
RROD victims would disagree.
Yeah, it kinda sucked for Xbox and PS3 users to get such hardware failures. I still have my original SNES and N64, and they've seen quadruple the game time in the years since the Xbox/PS3 were released. A factor that came into play with the Xbox/PS3 was the lead-free solder.
#7
Calin Banc
by: Filiprino
PC already has standards (OpenGL, Direct3D), but the quality of drivers is not up to par. Making tweaks for games shouldn't be necessary if their software was properly defined and their internals well done, but you know, they also want people to partner with AMD or NVIDIA, not both, so doing nasty things on purpose is also something to take into account.
I'm not talking about those. I'm talking about optimizations for multi-core CPUs, DX10/10.1/11/11.1, on every game those partner companies launch. I'm talking about working with other devs in order to implement more features for the PC crowd, to make a PC first and then port it to consoles. I'm talking about GPU accelerated physics and AI and so the list goes on. :)
#9
Easy Rhino
Linux Advocate
by: TheMailMan78
RROD victims would disagree.
you missed the point. and even so, PCs have many many more ways to fail and a lot more software layers to get through.
#10
TheMailMan78
Banstick Dummy
by: Easy Rhino
you missed the point. and even so, PCs have many many more ways to fail and a lot more software layers to get through.
They only fail when people think they know how to take care of an OS better than MS does, or fail to follow the manufacturer's directions.
#11
3870x2
Anyone else find it odd that an Analyst believes they are going to render games at 240 FPS, where current PC games render 30/60FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
#12
TheMailMan78
Banstick Dummy
by: 3870x2
Anyone else find it odd that an Analyst believes they are going to render games at 240 FPS, where current PC games render 30/60FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
I render BF3 well above 60 FPS. The problem is my monitor. If I had a 240 Hz monitor/TV, I could run Quake 3 all day at 240 FPS. :laugh: I think it all depends on the game and monitor, because even current APUs can do that. They are not saying the PS4 will run BF4 at 240 FPS. They are just saying it can run SOMETHING at 240 FPS.

FYI Carmack already said next gen consoles are staying at the 30FPS cap in most circumstances so that "analyst" is playing with semantics......or is just a retard.
#13
Rebel333
I think NVIDIA will fail soon; their Tegra 4 is garbage and they lost the Xbox 720 and PS4. I actually do not mind, because they're the ones who destroyed 3dfx in a disgusting way...
#14
NeoXF
by: Slizzo
They're killing in the mobile market with Tegra.
Tegra is a POS, thank you for playing.

Only breakthrough they made there is because of the Nexus 7, and all the OS/software optimizations and tweaks that came with.


Well, I remember the rumored pricing on AMD's HD 8800s cards, and the speculation that they would outperform any of the current gen (pre-Titan) single GPUs. And I'm pretty sure a pair of R8870s would demolish the GK110... for what? $560?
#15
3870x2
by: TheMailMan78
....or is just a retard.
Pretty much what I was pointing out. Even high end PCs can't play games from 6 years ago at a solid 240 FPS. (getting pretty damned close though)
#16
McSteel
You're all forgetting that games written for consoles are done in assembly as much as possible. The one strength of a console is its uniformity. There's only one possible choice of hardware, so you don't go through drivers and unknowns, but code directly for well-known hardware, shedding some pretty significant overhead and inefficiency.

A well-executed APU with shared GPU and CPU cache could be a rather potent tool in the hands of a skilled coder. Having an engine that never fetches GPU instructions and mesh data from RAM, only using it for immediately needed variables for final rendering, and only looking in VRAM for raw texture data would mean it executes an order of magnitude faster than a typical Direct3D title. Add a bit of driverless low-level access to all registers and shaders, and you have a machine that could very well render at 120 FPS constant. 240, I'm not sure about, but I suppose it's doable...
#17
TheMailMan78
Banstick Dummy
by: McSteel
You're all forgetting that games written for consoles are done in assembly as much as possible. The one strength of a console is its uniformity. There's only one possible choice of hardware, so you don't go through drivers and unknowns, but code directly for well-known hardware, shedding some pretty significant overhead and inefficiency.

A well-executed APU with shared GPU and CPU cache could be a rather potent tool in the hands of a skilled coder. Having an engine that never fetches GPU instructions and mesh data from RAM, only using it for immediately needed variables for final rendering, and only looking in VRAM for raw texture data would mean it executes an order of magnitude faster than a typical Direct3D title. Add a bit of driverless low-level access to all registers and shaders, and you have a machine that could very well render at 120 FPS constant. 240, I'm not sure about, but I suppose it's doable...
Doesn't matter how optimized the code is if the hardware isn't there to push it. APUs are damn nice, but they are still APUs and cannot hold a candle to a current mid-range dedicated GPU. You cannot substitute horsepower with skill.
#18
qubit
Overclocked quantum bit
An editorial! We need more of these please. :)

What gets me is the sleight of hand that nvidia did to increase the price of graphics cards by pitching the mid range chip of the next generation architecture (Kepler) as a top end product (GTX 680) instead of something like GTX 660 where it belongs, simply because it beat the GTX 580. In contrast, the previous generation Fermi GPU in the GTX 580 is a true top end chip.

Thus, we've paid top dollar for a mid range card, pushing up the price of the true top GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards nvidia for doing this, then you should.
#19
TheMailMan78
Banstick Dummy
by: qubit
An editorial! We need more of these please. :)

What gets me is the sleight of hand that nvidia did to increase the price of graphics cards by pitching the mid range chip of the next generation architecture (Kepler) as a top end product (GTX 680) instead of something like GTX 660 where it belongs, simply because it beat the GTX 580. In contrast, the previous generation Fermi GPU in the GTX 580 is a true top end chip.

Thus, we've paid top dollar for a mid range card, pushing up the price of the true top GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards nvidia for doing this, then you should.
Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price gouge due to market demand. Well, welcome to the realities of the real world. Market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1000 GPU's.
#20
McSteel
by: TheMailMan78
Doesn't matter how optimized the code is if the hardware isn't there to push it. APUs are damn nice, but they are still APUs and cannot hold a candle to a current mid-range dedicated GPU. You cannot substitute horsepower with skill.
Oh? Go tell that to the 1964 Morris Mini Cooper S :p
#21
TheMailMan78
Banstick Dummy
by: McSteel
Oh? Go tell that to the 1964 Morris Mini Cooper S :p
We are not talking about crappy European economy car racing. You can substitute the car with a pickle.
#22
RejZoR
by: Easy Rhino
consoles have the one thing that gamers want that desktops will never have: ease of use.

it takes a lot more time and effort to maintain a desktop where a console is automatic. less parts to fail, less to worry about. flick the switch on and you are gaming.
Yes and no. If you mostly buy games on Steam, half of the job has already been done. So in the end, all you need to do is update your graphics drivers here and there: amd.com and nvidia.com. Not exactly a complicated thing to do.

You also have to understand that while the PC may be more complicated, it's not a locked-down platform. I can play games that were designed for PCs 20 years ago on a modern system.

Or even patch them yourself. Take Need for Speed 3, released in 1998: I patched it myself, and you can play it on a brand-new 2013 PC pretty much without any hassle. You just slam in the CD, run my patch (which copies the game files and updates them), and voila. Try doing that with a PS2 game on a PS3, or an Xbox game on a 360.

Developers don't give a toss, even though some of us would buy refreshed games (like we did with the Serious Sam HD series). But on the PC you at least have community patches, like my NFS3 patch and hundreds of others. On consoles you can only stick a finger up your bottom, because the developers don't care and there is no community.
#23
bpgt64
So, was there any news as to whether it's going to be a dual GPU based card or...single?
#24
Recus
by: Rebel333
I think Nvidia will fail soon, their Tegra 4 is garbage and they lost Xbox 720, PS4. I actually do not mind, because they the one who destroyed 3dfx disgusting way...


by: NeoXF
Tegra is a POS, thank you for playing.

Only breakthrough they made there is because of the Nexus 7, and all the OS/software optimizations and tweaks that came with.


Well, I remember the rumored pricing on AMD's HD 8800s cards, and the speculation that they would outperform any of the current gen (pre-Titan) single GPUs. And I'm pretty sure a pair of R8870s would demolish the GK110... for what? $560?
Keep dreaming.
#25
xorbe
by: btarunr
With the introduction of the next-generation PlayStation "Orbis," PC graphics companies such as NVIDIA need to launch new products to remind the masses that PC gaming looks, feels, and plays better than consoles
I think this is where the editorial went off the rails. One more video card isn't going to deter the console market, and new consoles aren't going to deter the PC gaming market. I think people are looking for a non-existent deeper meaning, when it's most likely, "How the hell can we monetize these chips that didn't cut it for Tesla products?"