
Editorial: It's Sony, Not AMD in GeForce Titan's Crosshair

With the introduction of the next-generation PlayStation "Orbis," PC graphics companies such as NVIDIA need to launch new products to remind the masses that PC gaming looks, feels, and plays better than consoles

I think this is where the editorial went off the rails. One more video card isn't going to deter the console market. And new consoles aren't going to deter the PC gaming market. I think people are looking for a non-existent deeper meaning, when it's most likely, "How the hell can we monetize these chips that didn't cut it for Tesla products?"
 
I'm going with monetizing chips too; it makes sense, plus it actually uses the IP up to its revenue potential.
I see bta's point, but I think it's probably much simpler.

CASH.
 
Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price gouge due to market demand. Well, welcome to the realities of the real world. Market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1,000 GPUs.

Yes, I guess you could call it "market forces," and unfortunately that's what happens when there's not enough competition, i.e. a lack of said forces. That's the real problem here.

And how DARE you suggest I'd do the same thing! My heart is pure and honest and of course I'd never think of price gouging my customers like this. Never!

Mods! Do something about mailman for such insolence! j/k :laugh:
 
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
 
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
Funny thing is AMD is supposed to be in every new console coming up :p
 
Why? It's called the free market. I remember when people were saying we didn't need AMD or ATI because Intel and NVIDIA would never price gouge due to market demand. Well, welcome to the realities of the real world. Market demand is partly driven by competition. This is what happens, and bashing them for it is BS. You would do the SAME THING.

I say bring on the $1,000 GPUs.

I think the current generation of GPUs proves that "competition" can work against the consumer.

AMD decided their 6970 replacement should be priced at $550 because it was faster than the GTX 580, never mind that it was replacing a $375 part. Nvidia then followed suit: when it found that the replacement for its $250 part was competitive with AMD's $550 offering, it decided to launch at $500 instead of properly replacing its previous midrange offering.

If AMD hadn't been around during Kepler's launch, I imagine Nvidia would've acted like Intel and simply refreshed their existing products at their established price points: GK104 badged as a 560 Ti and priced at $250. Likewise, if the GTX 580 hadn't existed when AMD was launching the 7970, it probably would've launched at exactly the same price as the 6970.
 
I don't know about 240 FPS. Could it be for stereoscopic 3D? That has been talked about for the PS4, IIRC.

Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.
 
I think Nvidia will fail soon; their Tegra 4 is garbage and they lost the Xbox 720 and PS4. I actually do not mind, because they are the ones who destroyed 3dfx in a disgusting way...
Garbage. 3dfx killed 3dfx. Just goes to prove that if enough uninformed people say the same thing, it ends up getting passed off as "truth".

3dfx's problems are actually pretty well documented.
- Kicking their board partners in the nuts by deciding to manufacture and sell their own cards when they bought STB. Taiwanese TSMC and UMC quality >>>>>>>> Juarez, Mexico quality.
- A management that burnt money for a hobby.
- Dumping 20% of their workforce, then (over)paying $186m for GigaPixel.
- Slow and overhyped hardware development. Massive delays with the Rampage chip, and competing cards that quickly overhauled 3dfx's raw performance while adding features like 32-bit colour, T&L, and support for textures larger than 256x256. 3dfx's Voodoo line was already being overhauled by Matrox's G400, Nvidia's TNT2/GF2 GTS, ATi's Radeon, and STMicro's Kyro.

You might also remember that not only were 3dfx not turning a profit when they sold their IP to Nvidia, they were effectively bankrupt, with the prospect of their cards losing compatibility as AGP moved beyond 3.3V.

So: great cards for their day, the vanguard of PC gaming (at least the 3D part), and they implemented some awesome tech (AA, the T-Buffer, etc.). But 3dfx's problems stemmed from bad money/resource management, an over-reliance on performance artificially inflated by GLIDE, and a slower product cycle and slower feature additions than the competition.
 

Don't believe all that you read:
http://seekingalpha.com/article/1152151-don-t-believe-this-nvidia-rumor?source=yahoo

Funny thing is AMD is supposed to be in every new console coming up

Steambox, Shield, Ouya, and God knows what else will also be gaming consoles. Whether for better or worse remains to be seen. Everybody nowadays is building a gaming console, but yes, Nvidia lost big with the established consoles.
 
Why the heck doesn't Nvidia fight fire with fire?
I agree with what erocker has already said.
If Nvidia wants to compete with Sony they need a console. Shield is a start, I guess.
In my view, if they really intend to fight Sony in a console war, they need some fire to fight fire with. A console is very different from a PC in many aspects, and they serve different market segments. All in all, it would surely be good for us as PC gamers (not console gamers) if Nvidia released their Titan card this February.

NVIDIA not only batting for its own GeForce brand

wrong move



meh..
 
Anyone else find it odd that an analyst believes they are going to render games at 240 FPS, when current PC games render at 30/60 FPS? All that with an APU.

I don't think this analyst is worth his weight in horse manure.

I think he's referring to console games. Most PS3/360 games run at 30 fps; a handful run at 60 fps. PC games usually don't have an FPS cap.
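Just to put rough numbers on why 240 FPS on an APU sounds far-fetched, here's a quick back-of-envelope frame-time calculation (my own arithmetic in Python, nothing from the analyst's note):

[code]
# Frame-time budget at the frame rates discussed above (simple arithmetic).
for fps in (30, 60, 240):
    budget_ms = 1000.0 / fps  # milliseconds available to render one frame
    print(f"{fps:>3} FPS -> {budget_ms:5.2f} ms per frame")

# Output:
#  30 FPS -> 33.33 ms per frame
#  60 FPS -> 16.67 ms per frame
# 240 FPS ->  4.17 ms per frame
[/code]

So a 240 FPS target leaves roughly 4 ms per frame, an eighth of the budget most current console games work with.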
 
Good, $ony needs to be taken down several pegs with their nasty consumer-screwing DRM.
 
Fun thread, wish we could all be in the same room talking about this, I'd bring lots of popcorn with me.

I'm still going with wishful thinking, bring on the Titan!
 
It's true that gaming desktops are expensive, but they're for much more than just gaming. The idea of graphics cards is that you just buy one, throw it in the desktop you already have, and there you have a gaming computer. But with mobile taking over, and with the lack of any modular, upgradable designs on laptops, that's where the challenge comes in.

I think if AMD and Nvidia truly want to compete with consoles, they need to make a solution for external GPUs, so that it works on anything using USB 3.0/DisplayPort or whatever.
 
Funny thing is AMD is supposed to be in every new console coming up :p

It's not like that will make them a ton of money. There's a reason Nvidia doesn't care to get involved in consoles: they don't make a ton of money off it. In order for Nvidia to make a noticeable profit off a game console, they have to sell the GPU above cost by a decent amount, and the console has to sell as many units as the Wii did at first, or like the PS2 did. Otherwise they are just breaking even after you factor in R&D and manufacturing/testing.
 
When nVidia actually managed to make money on consoles (the first Xbox), MS gave them the boot the next time around. Basically, the contract stated a fixed price per GPU for the whole five-year period, so as time went on nVidia's margins started to widen while MS had to do price cuts in order to keep up with Nintendo and Sony. Since then, console manufacturers have wised up and add clauses for periodic price revisions and such.

AMD was present in both the Wii and Xbox 360, but that hardly did them any good; those kinds of design wins only help pay the bills and keep the machine going.
 
The console manufacturers just buy designs off AMD and source the chips themselves at fixed prices, I reckon. AMD may gain some traction from this: just maybe, console game developers (Japanese developers especially) will become familiar with the architecture of AMD's graphics engines and find the port/optimization job that much more convenient. Perhaps AMD releasing bundles left and right has something to do with this new relationship, or is a portent of things to come.

Nevertheless, the Titan will definitely be a sight to behold. Maybe AMD should rename their next chips Saturn or Olympia.
 
Thus, we've paid top dollar for a mid range card, pushing up the price of the true top GPU to stratospheric levels.

And that really sucks for us. If you don't feel resentment towards nvidia for doing this, then you should.

I feel no resentment here. As long as I get the performance I paid for, I have no problem paying for a GPU that technically should have been high mid-range. My card sits exactly where I think it should, in terms of performance, for what I paid for it. Maybe the resentment you feel is misdirected and you are actually feeling it for AMD, who dropped the ball with the HD7000 series, causing nVidia to do such a thing.
 
When nVidia actually managed to make money on consoles (the first Xbox), MS gave them the boot the next time around. Basically, the contract stated a fixed price per GPU for the whole five-year period, so as time went on nVidia's margins started to widen while MS had to do price cuts in order to keep up with Nintendo and Sony. Since then, console manufacturers have wised up and add clauses for periodic price revisions and such.

AMD was present in both the Wii and Xbox 360, but that hardly did them any good; those kinds of design wins only help pay the bills and keep the machine going.

That was what Nvidia had to do to actually make good money off consoles. The Xbox actually sold pretty well considering how short a time it was on the market that generation. I also remember there being something of a disagreement between Nvidia and Microsoft; apparently Nvidia gave them very short notice before discontinuing production of the GPUs they used in the original Xbox. All of that, combined with ATi at the time offering more power-efficient GPUs at a better rate, led to most companies choosing ATi.

The console manufacturers just buy designs off AMD and source the chips themselves at fixed prices, I reckon.

This is how they do it now, because it's a lot more cost-efficient.

AMD may gain some traction from this: just maybe, console game developers (Japanese developers especially) will become familiar with the architecture of AMD's graphics engines and find the port/optimization job that much more convenient. Perhaps AMD releasing bundles left and right has something to do with this new relationship, or is a portent of things to come.

The biggest benefit would be if Microsoft and Sony went x86 for their new consoles. It would make porting games between PC and console almost seamless. I still have a suspicion that one of them will cling to PowerPC for the sake of backwards compatibility, but this time I actually hope they prove me wrong. x86-64 architecture combined with DirectX would make porting games between consoles and PCs trivially easier.

I feel no resentment here. As long as I get the performance I paid for, I have no problem paying for a GPU that technically should have been high mid-range. My card sits exactly where I think it should, in terms of performance, for what I paid for it. Maybe the resentment you feel is misdirected and you are actually feeling it for AMD, who dropped the ball with the HD7000 series, causing nVidia to do such a thing.

Pretty much how I stand. I bought a GTX 670 knowing it would be an upper-mid-level card, and it has not let me down yet. I wish I had held out for a DCII or Windforce version, but I had the money burning a hole in my pocket, and it's no louder than my HD 5850 was anyway. If Nvidia had released GK100 and it was as powerful as they made it out to be, it would have either been a $750-1000 card, or AMD would have been crushed and Nvidia would have had to either go unopposed or help keep AMD afloat to prevent antitrust lawsuits.
 
XBOX 720's GPU Specifications leaked online.

This GPU is rumoured to be able to do 2x, 4x, and even 8x MSAA. What a beast of a console box.
I wonder what GPU specification is inside the PlayStation 4.

So, I understand why Nvidia seems so desperate to release their Titan.
 
XBOX 720's GPU Specifications leaked online.

Going off FLOPS only, that is theoretically ~5x more powerful than the Xbox 360's Xenos.
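For the curious, the rough math behind that ~5x (a sketch using the commonly cited ~240 GFLOPS peak for Xenos and the ~1.2 TFLOPS figure attributed to the leak; both are approximate, so treat this as ballpark only):

[code]
# Back-of-envelope FLOPS comparison; both figures are commonly cited
# approximations, not official specs.
xenos_gflops = 240.0   # Xbox 360 "Xenos" peak shader throughput, ~240 GFLOPS
leak_gflops = 1200.0   # figure attributed to the "Xbox 720" GPU leak, ~1.2 TFLOPS
print(f"~{leak_gflops / xenos_gflops:.0f}x the raw FLOPS of Xenos")  # prints "~5x"
[/code]

Raw FLOPS says nothing about memory bandwidth, architecture efficiency, or what developers actually target, so the multiplier is illustrative at best.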
 
XBOX 720's GPU Specifications leaked online.

This GPU is rumoured to be able to do 2x, 4x, and even 8x MSAA. What a beast of a console box.
I wonder what GPU specification is inside the PlayStation 4.

So, I understand why Nvidia seems so desperate to release their Titan.
I don't really see nVidia being desperate at all.
All they are doing is clearing out their scrap GK110 stock anyway.
The ones that don't make the cut for Tesla, etc.
 