Monday, February 18th 2013
NVIDIA GeForce GTX Titan Graphics Card Pictured in Full
Here it is, folks: the first pictures of NVIDIA's newest pixel-crunching dreadnought, the GeForce GTX Titan. Pictures leaked by various sources east of the Greenwich Meridian reveal a reference board design that's similar in many ways to that of the GeForce GTX 690, thanks to the magnesium alloy cooler shroud, a clear acrylic window letting you peep into the aluminum fin stack, and a large lateral blower. The card features a glowy "GeForce GTX" logo much like the GTX 690, draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features two SLI bridge fingers, letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.

The GeForce GTX Titan PCB reveals that NVIDIA isn't using a full-coverage IHS on the GK110 ASIC, but rather just a support brace, which allows enthusiasts to apply TIM directly onto the chip's die. The GPU is wired to a total of twenty-four 2 Gbit GDDR5 memory chips, twelve on each side of the PCB. The card's VRM appears to be a 6+2 phase design that uses tantalum capacitors, slimline chokes, and driver-MOSFETs. The PCB features a 4-pin PWM fan power output, and a software-controllable 2-pin LED logo power output.
Sources: Egypt Hardware, VideoCardz
Given the rumored specifications of the GTX Titan, the card could be overkill even for 2560 x 1600, and as such could be designed for 3DVision Surround (three-display) setups. Display outputs include two dual-link DVI, an HDMI, and a DisplayPort.
According to most sources, the card's specifications look something like this:
- 28 nm GK110-based ASIC
- 2,688 CUDA cores ("Kepler" micro-architecture)
- 224 TMUs, 48 ROPs
- 384-bit GDDR5 memory interface
- 6 GB memory
- Clocks:
  - 837 MHz core
  - 878 MHz maximum GPU Boost
  - 6008 MHz memory
- 250W board power
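As a quick sanity check, the memory size and peak bandwidth implied by those rumored specs can be worked out directly. This is a back-of-the-envelope sketch, assuming the 6008 MHz figure is the effective GDDR5 data rate:

```python
# Sanity-check the rumored GTX Titan memory specs.

chips = 24                # 2 Gbit GDDR5 chips, twelve per PCB side
chip_gbit = 2
total_gb = chips * chip_gbit / 8          # Gbit -> GB
print(total_gb)                           # 6.0, matching the rumored 6 GB

bus_bits = 384            # memory interface width
data_rate_mhz = 6008      # effective GDDR5 data rate
bandwidth_gbs = bus_bits / 8 * data_rate_mhz / 1000
print(round(bandwidth_gbs, 1))            # 288.4 GB/s peak bandwidth
```

So the twenty-four chips spotted on the PCB line up exactly with the rumored 6 GB, and the 384-bit bus puts peak memory bandwidth just under 290 GB/s.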
118 Comments on NVIDIA GeForce GTX Titan Graphics Card Pictured in Full
Nvidia Launch Prices:
GTX280 - $650
GTX480 - $500
GTX580 - $500
GTX680 - $499
Granted, not a huge drop towards the end, but I think $500 for the pinnacle of GPU performance is somewhat of a sweet spot.

Nope. The GK110 should be a revision of the GK100, which was the successor to the GTX 580. That means all that compute functionality that was stripped away from the GTX 680 is back, and presumably better than ever.

If I recall, didn't the HD 7970 launch with clocks similar to what is reported for Titan, around 800-850 MHz? Who's to say the Titan won't overclock at least as well? GK104 and Tahiti have had no problem being stretched upwards of 30% on retail cards...
Average % OCs are a bit less for GTX 670/680 in comparison.
I am curious to see how this Titan card OCs. I'm glad they're supposedly restoring real voltage control to the card, but with the chip being so large (7.1B transistors!!!), I kind of doubt that it's going to OC much more than 10-15%.
The big average overclocks seen from the 7950 and 7970 are somewhat uncommon for high end GPUs.
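For context on what those percentages would mean in absolute terms, here is a quick sketch assuming the rumored 837 MHz base clock from the spec list above:

```python
# Back-of-the-envelope: what various overclock percentages would mean
# for the rumored 837 MHz base clock.

base_mhz = 837
for pct in (10, 15, 30):
    oc_mhz = round(base_mhz * (1 + pct / 100))
    print(f"{pct}% OC -> {oc_mhz} MHz")   # 10% -> 921, 15% -> 963, 30% -> 1088
```

A 10-15% overclock would land the core in the 920-965 MHz range, while a Tahiti-style 30% stretch would mean nearly 1.1 GHz on a 7.1-billion-transistor die.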
And for the record, when I saw ocholic.ch's "review" of it, I was absolutely ecstatic about maybe getting one during the summer... even though they smelled like complete bullshit... Thanks NVIDIA for confirming that... no GPUs to look forward to anytime soon, then...
Actually, I'm fairly certain this is still the case. Converting a gaming GPU to a "workstation" GPU through a BIOS flash will make it appear to be a workstation GPU, but that doesn't mean it will be as free of computation errors...
Boobs :)
If I have 1 girl I get 2 or X-Fire/SLI.
If I have a pair of girls I have 4 or Quad-Fire/Quad-SLI
Pro cards like the Tesla have a much more rigorous validation procedure that goes far beyond crafting UMDs (User Mode Drivers) and in-place ongoing test evaluation. So while the GPUs are binned for voltage and usable logic blocks, you'd find that a finer-grained binning is also used to test the integrity of the logic blocks that are functional. I don't think the binning of pro GPUs differs a great deal from that of pro CPUs like Xeon and Opteron in that respect.