
6 GB Standard Memory Amount for GeForce Titan

btarunr

Editor & Senior Moderator
NVIDIA's next high-end graphics card, the GeForce "Titan" 780, is shaping up to be a dreadnought of sorts. It reportedly ships with 6 GB of GDDR5 memory as its standard amount. It's known from GK110 block diagrams released alongside the Tesla K20X GPU compute accelerator that the chip features a 384-bit wide memory interface. With 4 Gbit memory chips still eluding the mainstream, it's quite likely that NVIDIA could cram twenty-four 2 Gbit chips onto the card to total 6,144 MB. The chips would hence be spread across both sides of the PCB, and the back-plate could make a comeback on NVIDIA's single-GPU lineup.
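The chip-count arithmetic above can be sanity-checked with a quick sketch. The bus width and chip density come from the article; the per-chip I/O width and the two-sided (two-rank) layout are assumptions based on typical GDDR5 configurations:

```python
# Back-of-the-envelope check of the rumored memory configuration.
# BUS_WIDTH_BITS and CHIP_DENSITY_GBIT are from the article; the 32-bit
# per-chip I/O width and the two-rank layout are assumptions.

BUS_WIDTH_BITS = 384        # GK110 memory interface width
CHIP_IO_BITS = 32           # typical I/O width of a GDDR5 chip (assumed)
CHIP_DENSITY_GBIT = 2       # 2 Gbit chips, since 4 Gbit isn't mainstream yet

chips_per_rank = BUS_WIDTH_BITS // CHIP_IO_BITS      # chips needed to fill the bus once
ranks = 2                                            # chips on both sides of the PCB
total_chips = chips_per_rank * ranks
total_mb = total_chips * CHIP_DENSITY_GBIT * 1024 // 8   # Gbit -> MB

print(chips_per_rank, total_chips, total_mb)  # 12 24 6144
```

Twelve chips fill the 384-bit bus once, so reaching 6,144 MB with 2 Gbit parts indeed forces a second rank on the back of the PCB.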

On its Radeon HD 7900 series single-GPU graphics cards based on the "Tahiti" silicon (which features the same memory bus width), AMD used 3 GB as the standard amount, while 2 GB is standard for the GeForce GTX 680; non-reference 4 GB GTX 680 and 6 GB HD 7970 variants, however, are quite common. SweClockers also learned that NVIDIA is preparing to price the new card in the neighborhood of $899.



View at TechPowerUp Main Site
 
A titan out there... and coming...

What will be Mt. Olympus's champion in the coming battle?
 
Oh, my head hurts ever since the news of the GeForce Titan, and I'm feeling dizzy too... I think I should take a vacation till it launches in the real world.
 
When they release it and you buy it, they'll be announcing a 12 GB next-gen GPU.

I couldn't care less about the 6 GB of memory. We barely use 1 GB of memory in most games at the popular resolutions. I was speaking about everything else: 7.1 billion transistors.

I will also be keeping my 7970 for, I hope, the next two gens. It's a great card and more than I need right now, even before overclocking.
 
$899 USD, yikes. I wonder how long until we hear of a dual-GK110 GPU in the near-$2K price range? Lol. As long as the performance is there... NVIDIA will charge the earth, and tbh, why wouldn't they?
 
I was interested until I saw the $899 price tag. If they offered it for $300, I'd be tempted to have another go at NVIDIA.
 
$899 for it, but if my calculations are right it should consume around 300 W, which I wonder how they're gonna cool. Most likely the clocks will be pulled down to 800 MHz so that the TDP drops, but then performance should be about 39% greater than that of a GTX 680, at which point the 80% price increase is stupid. If it has unlocked voltage I would buy it, but most likely this will end the same way as the 600 series, with Boost and locked volts.
So I will stick with AMD.
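The price/performance complaint above can be worked through in a short sketch. The $899 figure and the +39% performance estimate come from the thread; the GTX 680's $499 launch price is assumed as the baseline:

```python
# Rough sketch of the poster's price/performance math.
# GTX_680_PRICE is an assumed baseline (the GTX 680's launch MSRP);
# the $899 rumor and the +39% estimate are from the thread itself.

GTX_680_PRICE = 499.0
TITAN_PRICE = 899.0
PERF_GAIN = 0.39                 # estimated performance over a GTX 680

price_ratio = TITAN_PRICE / GTX_680_PRICE
price_increase = price_ratio - 1                    # ~0.80, the "80%" figure
perf_per_dollar = (1 + PERF_GAIN) / price_ratio     # relative value vs. GTX 680

print(f"price increase: {price_increase:.0%}")          # 80%
print(f"relative perf per dollar: {perf_per_dollar:.2f}")  # 0.77
```

In other words, under these assumptions the card would deliver only about 77% of a GTX 680's performance per dollar, which is the poster's objection.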
 
We barely use 1 GB of memory in most games.

I beg to differ. If this truly is a high-end card, then it needs all the memory it can get. This card is expected to be used by people with 4K and greater resolutions. When I run Skyrim at 5760x1200 with high-resolution textures, I can easily use 2.5 GiB of graphics memory. I only expect that to become the norm with new-generation consoles and therefore more detailed games coming out.
 
I beg to differ. If this truly is a high-end card, then it needs all the memory it can get. This card is expected to be used by people with 4K and greater resolutions. When I run Skyrim at 5760x1200 with high-resolution textures, I can easily use 2.5 GiB of graphics memory. I only expect that to become the norm with new-generation consoles and therefore more detailed games coming out.

You are right; I was thinking of 1080p. If one has a card like this, I hope they have higher than 1080p.
 
On another note, I think this would be the most memory chips ever on a reference single-GPU graphics card, at 24. The GTX 295 still holds the reference-card crown at 28, and the ASUS ARES III would hold the overall record at 32 had it been released.
 
$899, let the raping begin... I'm not paying $899... lol.
 
$899 for it, but if my calculations are right it should consume around 300 W, which I wonder how they're gonna cool. Most likely the clocks will be pulled down to 800 MHz so that the TDP drops, but then performance should be about 39% greater than that of a GTX 680, at which point the 80% price increase is stupid. If it has unlocked voltage I would buy it, but most likely this will end the same way as the 600 series, with Boost and locked volts.
So I will stick with AMD.

300 W of heat isn't such a problem IMO; look at all the dual-GPU cards that are rated at 375 W (and go way beyond that when overclocked).
 
Don't forget that the GTX 690 is now roughly $899-950. The price for Titan seems quite normal to me.
 
SweClockers also learned that NVIDIA is preparing to price the new card in the neighborhood of $899.

The PlayStation 4 "Orbis" is going to debut later this year (it's being announced on February 20, 2013) at price points from $350 to $400.

:cool:
 
Still more than happy with my GTX 680 at this point, even if this turns out to be a monster. I'm only gaming at 1080p anyway, so all that RAM would be useless to me.

The last card I had was a GTX 560 Ti, and before that an HD 5870, i.e. gaming cards. GTX 680 = gaming card. GK110 just doesn't make sense to me as a gaming card. Like the GTX 480/580, it would obviously consume much more power. For HPC, yes, these GPUs will do the job, but for gaming I don't think it's the best/optimum solution.

Yes, I understand people want the best of the best, but at that alleged price point it's ridiculous. That's not to say I'm not excited about this, though. Unless AMD does miracles with GCN, the performance crown would almost certainly, without argument, go to NVIDIA. However, if AMD's pricing is very competitive, combined with great gaming bundles, then who knows how the market will turn out. YAY, GPU wars.
 
I wonder how it would be for WCG
 
I smell dual GPU here....
 
$900 is not much if you consider the size and cost of such a huge die. They won't make a profit with this pricing.
 