Discussion in 'News' started by btarunr, Feb 5, 2013.
899 bucks? I'd rather save my money and buy both the PS4 and Xbox 720.
I can see the marketing campaign now!
GEFORCE TITAN 780... Easily Satisfying the Demands of All Your Console Ports!
And the amount of memory that is placed on the PCB and the firmware - yes. There are no architectural differences in the GPU itself. The most you'd see is some disabled CUDA clusters here and there, but that doesn't change the underlying architecture.
Sure if you want to stay in 1080p.
I'm rather interested in the remaining pieces of the 700 series. Possibly GTX 760 = current GTX 670?
why are you guys worrying about 1080p... I would hope that anyone who spends $900 on a card has a 1600p or 3D Surround setup
$900 USD, lol.
Too expensive, who needs this to play console ports... blah blah blah.
This place never gets old.
$900 and still no improvement in perf/USD, too bad. Yawn. That's not how you popularize PC gaming.
Tough times for Nvidia: Tesla has a problem with Xeon Phi, Tegra 4 apparently sucks, next-gen consoles use AMD hardware (although that's a mistake IMHO, both in absolute performance and perf/Watt), and GeForce might be next in trouble (I certainly hope so, because of their pricing policies, PhysX restrictions, and the 1/24 double-precision performance of the GTX 600 series; they practically invented and then killed GPGPU for consumers)...
It used to be that PC gaming was clearly superior. Graphics were light years ahead, controls were... more than 4 buttons lol. And games could be extensive and lengthy.
Now that difference is much smaller while the gap in price remains huge. A $250 console is ready to game with any TV and works as soon as you turn it on, without much hassle. PC gaming needs over $1000 and comes with a lot more problems.
From my experience, Xbox Live seems to have far fewer server problems than any PC title. There also tend to be fewer hackers and cheaters on Xbox Live. However, the big downside is that there are a lot of screechy 10-year-old boys who will curse at anything unfavorable to them.
The answer is clear for the vast majority of people who want to game. It's cheaper, it's simpler, it works much much more often, it's cheaper, it's much more reliable, it eats less juice, it's cheaper, and... it requires no upgrades, so therefore it's cheaper.
This card will allow you to go from 100fps in console ports to over 9000fps!
which is exactly what makes me think the Titan is a dual-GPU card. Plus the memory amount: 2x 3 GB makes more sense than a single-GPU 6 GB card. Then I'd expect the single-GPU variant to be in the $450-500 range with 3 GB while offering 10-20% more performance than the 680.
Well, you see, it's not as easy to maintain a 64-player server as it is a 24-player server.
Under that logic, all mainstream/performance laptops would be considered the peak of economic efficiency.
PCs don't "require" upgrades, advanced game engines do. If you are going to stall that, well....
The graphics bit might be true now, but historically it's not. Look at the old SNES JRPGs for instance (for extensive and lengthy), or the original Tomb Raider and Goldeneye for the N64 (for graphics).
And in certain genres the console has always been superior (fighting and sports come to mind).
Point to note: what's known for sure is that the Titan will sport a GK110, which is faster than the GK104 used in the GTX 690.
Now, it's difficult to believe that Nvidia would retail a card based on two state-of-the-art/flagship GPUs for less than another card with two lesser GPUs.
How did this become a PC vs Console thread?
There's no doubt the PC has its own advantages and consoles have theirs.
One thing is for sure: you can't put the same passion into a console that you put into your PC.
On topic: if this is a single GPU and has unlocked voltage, then count me in for one. I've been longing for a worthy 6990 upgrade.
but in that case the GTX 690 will be a better deal than this one at 1080p.
Yup... because the target market for $900 graphics cards uses 1920x1080.
To those bitching and moaning about the graphical detail of "console ports": I suggest that software development generally follows hardware development. I'd be more concerned with the content (originality) of a game than with its textures' ability to saturate the framebuffer, or a looping post-process compute pass whose end result yields minimal image-quality improvement while heavily decreasing framerate.
yes, software development generally follows hardware development, but it also follows the money. in case you haven't noticed, for the past 6 years developers have been heavily focused on console titles. the hardware doesn't change, so you get a nice static API to work with, which means more efficient developers, which leads to higher profit margins. this is not going to change.
also note that yes, PC sales have been up the past couple of years, but that is mainly because the games being developed are not pushing the hardware. developers are not bothering!
It probably has to. Integrated graphics are evolving faster than discrete, and that benefits the one company that couldn't give a shit about game dev relations. With screen resolutions and APIs (D3D, OGL) raising graphics horsepower requirements only fractionally, and with rasterization likely to remain the only form of gaming rendering, the ball is in Nvidia's, and increasingly AMD's (with their console hardware and Gaming Evolved program), court.
On a related note, here's a graph I made (numbers averaged between JPR and Mercury Research where both were available) for an article (still under consideration at another site) tracing the history of graphics from the 1950s military simulators to the present day. The trend is pretty self-explanatory.
With a market share like that, Intel should make a GPU that is decent to game on, and they would skyrocket.
Don't forget that not all console games come to PC. E.g. I would have loved to see MGS4: Guns of the Patriots come to PC, but it was a PS3 exclusive, so I had to get a PS3 just to play it.
As someone wrote, I'd rather save the $899 and buy both the PS4 & Xbox 720 just so I can play the exclusive titles that I love, though I wish the exclusives were multi-platform so I could just use the PC, but it's not like that.
the point is they don't have to. it's probably more profitable not to build a discrete GPU.
if history is any indicator, the GPU card is going the way of the sound card, ethernet card and RAID card. unless you need the absolute best, you simply don't need one.
Am I missing something here? They will get thousands of complaints... all those 32-bit systems with no RAM left to play the game with... keen gamers are not necessarily tech enthusiasts.
/not sure if you're playing the ironic card
Just a wild guess on my part, but there could be an outside possibility that someone spending $900 on graphics is using a 64-bit OS. And I'm not sure that "thousands" of prospective Titan owners are still tied to a 32-bit operating system... if indeed thousands of Titan cards are actually produced.
Maybe Crysis 3 or The Witcher 3 to max out the VRAM on 3 monitors with 8xFSAA and 16xAF?