Monday, May 5th 2014

GeForce GTX 880 ES Intercepted En Route to Testing Lab, Features 8 GB Memory?

An engineering sample (ES) of the GeForce GTX 880 was intercepted on its way from a factory in China to NVIDIA's development center in India, where it will probably undergo testing and further development. The shipping manifest of the courier ferrying NVIDIA's precious package was sniffed out by the Chinese press. NVIDIA was rather descriptive about the ES in its shipping declaration. Buzzwords include "GM204" and "8 GB GDDR5," hinting at what could be two of the most important items on its specs sheet. GM204 is the successor to GK104, and is rumored to feature 3,200 CUDA cores, among other things, including a 256-bit wide memory bus. If NVIDIA is cramming 8 GB onto the card, it must be using some very high density memory chips. The manifest also declares the card's market value at around 47,000 Indian Rupees. That converts to about US $780, but once taxes and local markups are added, 47,000 INR is usually where $500-ish graphics cards end up in the Indian market. The R9 290X, for example, is going for that much.
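For a sense of why "very high density" follows from the manifest's two numbers, here is a back-of-the-envelope sketch in Python. The GDDR5 packaging figures (32-bit chip interfaces, 16-bit "clamshell" mode) are standard, but applying them here, along with the ~60 INR/USD rate of mid-2014, is our own assumption rather than anything stated in the manifest:

```python
# Rough arithmetic behind the article's claims; the GDDR5 packaging
# figures and exchange rate are assumptions, not data from the manifest.

BUS_WIDTH_BITS = 256   # rumored GM204 memory bus
CHIP_IO_BITS = 32      # a GDDR5 chip exposes a 32-bit interface
TOTAL_GBIT = 8 * 8     # 8 GB of memory, expressed in gigabits

# Normal mode: one chip per 32 bits of bus. Clamshell mode: two chips
# share each 32-bit channel, doubling capacity without widening the bus.
for clamshell in (False, True):
    chips = (BUS_WIDTH_BITS // CHIP_IO_BITS) * (2 if clamshell else 1)
    density_gbit = TOTAL_GBIT // chips
    print(f"clamshell={clamshell}: {chips} chips x {density_gbit} Gbit")

# Sanity check on the declared customs value, at ~60 INR/USD (mid-2014).
print(f"47,000 INR ~ ${47_000 / 60:.0f}")
```

Even in clamshell mode that works out to 4 Gbit per chip, the densest GDDR5 available in 2014, which supports the article's point that 8 GB on a 256-bit bus needs top-end memory chips.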
Sources: ChipHell, VideoCardz

66 Comments on GeForce GTX 880 ES Intercepted En Route to Testing Lab, Features 8 GB Memory?

#26
W1zzard
It makes perfect sense to build an early ES board with more than enough memory. So, using a single card, you can gather performance data for 1 GB, 2 GB, 3 GB, 4 GB, 6 GB, 8 GB configurations. Data that will later be used to decide what memory size the final shipping cards will have.

It also helps to uncover hardware and software design flaws. Imagine you build a new GPU, then a year later decide you'll make a 6 GB Titan card, and oops, the memory controller in the GPU doesn't work with that much memory.
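As an illustration of the sweep W1zzard describes, here is a minimal Python sketch: one over-provisioned 8 GB board stands in for every planned configuration. The set_framebuffer_cap_gb() and run_benchmark() hooks are hypothetical stand-ins for whatever internal tooling is actually used, not a real API:

```python
# Hypothetical harness: sweep candidate memory sizes on a single
# over-provisioned engineering sample. Both hooks below are stand-ins.

CANDIDATE_SIZES_GB = [1, 2, 3, 4, 6, 8]

def set_framebuffer_cap_gb(gb: int) -> None:
    """Pretend to limit the visible framebuffer on the 8 GB ES board."""
    print(f"capping framebuffer at {gb} GB")

def run_benchmark(suite: str) -> float:
    """Pretend to run a benchmark suite and return average FPS."""
    return 0.0  # placeholder result

results = {}
for size_gb in CANDIDATE_SIZES_GB:
    set_framebuffer_cap_gb(size_gb)
    results[size_gb] = run_benchmark("4k_game_suite")

# The resulting FPS-vs-capacity curve shows where extra memory stops
# paying off, which is the data that drives the final SKU decision.
print(results)
```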
#27
Hilux SSRG
Fluffmeister: It was more than enough to compete; not sure why they should sell it for any cheaper. Its performance fit. Maybe AMD should have charged less for their high-end chip that apparently only performed like a mid-range one.

/shrug
Nvidia saw AMD's weakness and ran with it, to the detriment of consumers. It's just shocking how much they overcharged. I wonder if their failed Tegra 3, with its lack of LTE support, caused Nvidia to seek higher profits in graphics cards?
#28
Fluffmeister
Hilux SSRG: Nvidia saw AMD's weakness and ran with it, to the detriment of consumers. It's just shocking how much they overcharged. I wonder if their failed Tegra 3, with its lack of LTE support, caused Nvidia to seek higher profits in graphics cards?
Welcome to the world of multi-billion dollar corps.

As always, people are free to vote with their wallets. Nvidia has a stronger brand and people are clearly happy to pay. Again, I'm sorry, but that is just how it is.
#29
haswrong
Since the cheapest workforce is in India and China, it is very likely NVIDIA is finally going to price these cards very cheaply (like $300 for the best model) and will finally focus on affordability for end customers, which has been severely lacking for the last decade.
#30
D3LTA09
This doesn't make sense to me. How can an 880 be the successor to a GK104 board? Wouldn't it have to be the successor to GK110?
#31
OneCool
Who gives a damn about the video card... I want the Water Cooled Plunger!
#32
64K
D3LTA09: This doesn't make sense to me. How can an 880 be the successor to a GK104 board? Wouldn't it have to be the successor to GK110?
When Nvidia launched the GTX 680, there were yield problems with the GK110 at TSMC, and AMD's flagship performance was such that Nvidia could label the GK104 a GTX 680 when in the past this would have been a midrange GPU. It was a success, and they will probably repeat this with the Maxwell series.
#33
john_
A 256-bit mid-range card hoping to cost ***ONLY*** $500.
How nice.

Ah! Yes... 8 GB of RAM, for example.
#34
KainXS
It looks like a dual GPU to me. Am I the only one thinking that?
#35
NC37
Oh, that's not a good sign if an x04 chip is taking the high end. This is Kepler all over again, where we went from GF104 in Fermi being the mid-high chip to GK104 being dumped in the high end, with the original high-end 110 coming later.

It wrecked prices. Fermi had the 104s in a beautiful pricing segment, and then Kepler saw them jump because suddenly 104s were used for all the high-end cards. NV then stuffed 106s into the mid-high bracket when 106s were more mid-low. So they literally found a way to make people pay more for weaker GPUs.
#36
matar
So now NVidia will give us the GTX 880 in two flavors, 4 GB and 8 GB, but still based on 28 nm.
Then we will still have the same-but-refined architecture for the GTX 990 on 20 nm. It seems to me NVidia and Intel (like Sandy Bridge to Ivy Bridge) are both going the same route: make more money and don't give us the latest technology right away when it's available. Remember, we've seen this with the 8800 to the 9800 (same but refined architecture), then 400 to 500, then 600 to 700, and now we will get the 800 series, and only in the 900 series will we get the true, full Maxwell chip. That's what NVidia used to stand for: "NVidia, the way it's meant to be played." But I guess those days are going away. Don't get me wrong, I would still only buy NVidia GPUs.
#37
GhostRyder
Nvidia needs to put more RAM on their cards at better prices (at least on the gaming GPUs). It's annoying to have to buy the EVGA, Asus, or MSI cards that add more RAM to make up for the detriment to high-resolution gaming. I love the fact there's an 8 GB version coming, because that will be sweet (so long as the price is right).
#38
Relayer
cadaveca: Two words:

Dual GPU.

Everything in that list hints at it.

:lovetpu: I dunno WTF I am talking about.
You actually make sense though. Either that or I'm Tweedle Dee to your Tweedle Dumb. :toast:
#39
Relayer
haswrong: Since the cheapest workforce is in India and China, it is very likely NVIDIA is finally going to price these cards very cheaply (like $300 for the best model) and will finally focus on affordability for end customers, which has been severely lacking for the last decade.
I have a bridge I'd like to sell you. ;)
#40
HalfAHertz
I don't really see the 680 as a purely mid-range card. I think that Nvidia was indeed ahead of AMD with the design of the so-called high-end GPU, but only by a bit, and their CEO (like every good CEO should) tried to misrepresent the scale of the situation to put Nvidia in a better light.

I think the timeline went something along these lines:

- TSMC's 28 nm process had problems, as usual.
- There weren't that many 7970s at first.
- The 680 came in a bit later than the 7970.
- GK104 had some supply issues as well.
- The 7970 GHz Ed. came out.
- Non-competitive, high pricing from both AMD and Nvidia.
- The 780 came out some time later.
- The 290X came out a few months after that.

Frankly, I think that the 680 was simply the best that Nvidia could put out from TSMC at that time. It was a new architecture with a complicated design on a new manufacturing node. The last time they tried this, they came up with the FX 5000 series, which was a bit of a flop. So they made the calculated choice not to rush things like last time, and to work through the issues to deliver a more solid solution.

Unsurprisingly, AMD was in a similar position. They were getting sub-par results that delivered good performance, but in a higher power envelope. They made the choice not to optimize power consumption as much and to rush the design, so that they could deliver before Nvidia and conquer the market in the meantime. AMD as a whole was in a financially fragile situation at that time and was really in need of a few months of good sales. Not long after the initial release, AMD tried to remedy their unoptimized design with the refined 7970 GHz Edition, somewhat successfully.

Both products had high pricing for a long time due to the difficulty of manufacturing them at sufficiently high rates to satisfy demand.

Once Nvidia sorted things out with the manufacturing process, they could finally bring out the Fermi successor they envisioned from the start, and the 780 came out.

The 290X came out a few months later and managed to deliver comparable performance, but had the same high power usage issues as the 7970. The conclusion is that the 290 had the same silicon substrate as the 7970, and the two were designed in tandem from the start. Most likely AMD tried to solve the high power consumption but failed, and decided not to delay the release any further to prevent further loss of sales to the 780.
#41
xenocide
NC37: Oh, that's not a good sign if an x04 chip is taking the high end. This is Kepler all over again, where we went from GF104 in Fermi being the mid-high chip to GK104 being dumped in the high end, with the original high-end 110 coming later.

It wrecked prices. Fermi had the 104s in a beautiful pricing segment, and then Kepler saw them jump because suddenly 104s were used for all the high-end cards. NV then stuffed 106s into the mid-high bracket when 106s were more mid-low. So they literally found a way to make people pay more for weaker GPUs.
Alright, let's get this over with. Stop acting like the product code immediately determines the value. You wanna know why the GTX 680 was the GK104? Because it was better than AMD's best offering at the time, substantially better than the GTX 580, and absolutely destroyed the GTX 560. Everyone keeps acting like just because it says GK104 instead of GK100 or GK110 it's suddenly crap--it's not. If the GTX 880 comes out as a GM204 part on 28 nm, and offers performance that exceeds the GTX 780 Ti while using less power for ~$500, you cannot tell me you would consider it crap. Being 20 nm or GM100/110 doesn't matter as long as the performance is there. Don't like Nvidia's system? Then encourage AMD to put out a GPU that can compete with Nvidia at the high end without being a miniature heater with a leafblower attached to it.

The GTX 680 came out and did screw with prices, because it was so good. AMD had to drop the price of their HD 7970 to compete with Nvidia's smaller, cooler, more energy-efficient mid-range part that they were getting better profit margins on. The people who control the upper bracket of performance set the price bar. AMD didn't hesitate to throw the HD 7970 out there at $550 when the HD 6970 launched at under $400, just like they didn't hesitate to throw the FX Series Socket 939 CPUs out there for $1,000+ when they were substantially better than Intel's offerings. It's capitalism. If they are offering a superior product, they are going to charge a premium. Corvettes are more expensive than Camaros; it doesn't mean Camaros are a rip-off or crappy products.
GhostRyder: Nvidia needs to put more RAM on their cards at better prices (at least on the gaming GPUs). It's annoying to have to buy the EVGA, Asus, or MSI cards that add more RAM to make up for the detriment to high-resolution gaming. I love the fact there's an 8 GB version coming, because that will be sweet (so long as the price is right).
Show me a single benchmark where a 4 GB variant of a normally 2 GB card significantly outperforms at higher resolutions. I'll wait. I haven't found a single one.
#42
Relayer
xenocide: Alright, let's get this over with. Stop acting like the product code immediately determines the value. You wanna know why the GTX 680 was the GK104? Because it was better than AMD's best offering at the time, substantially better than the GTX 580, and absolutely destroyed the GTX 560. Everyone keeps acting like just because it says GK104 instead of GK100 or GK110 it's suddenly crap--it's not. If the GTX 880 comes out as a GM204 part on 28 nm, and offers performance that exceeds the GTX 780 Ti while using less power for ~$500, you cannot tell me you would consider it crap. Being 20 nm or GM100/110 doesn't matter as long as the performance is there. Don't like Nvidia's system? Then encourage AMD to put out a GPU that can compete with Nvidia at the high end without being a miniature heater with a leafblower attached to it.

The GTX 680 came out and did screw with prices, because it was so good. AMD had to drop the price of their HD 7970 to compete with Nvidia's smaller, cooler, more energy-efficient mid-range part that they were getting better profit margins on. The people who control the upper bracket of performance set the price bar. AMD didn't hesitate to throw the HD 7970 out there at $550 when the HD 6970 launched at under $400, just like they didn't hesitate to throw the FX Series Socket 939 CPUs out there for $1,000+ when they were substantially better than Intel's offerings. It's capitalism. If they are offering a superior product, they are going to charge a premium. Corvettes are more expensive than Camaros; it doesn't mean Camaros are a rip-off or crappy products.

Show me a single benchmark where a 4 GB variant of a normally 2 GB card significantly outperforms at higher resolutions. I'll wait. I haven't found a single one.
GK104 was better than Tahiti in perf/W in gaming, and that's it (once both are overclocked there is no perf advantage; Tahiti is typically a little faster). Tahiti is a much better chip by any compute metric (excluding CUDA, of course): stronger in DP, and it destroys GK104 in OpenCL and scrypt (what mining uses). For some reason the Titan and (especially) the Titan Z are worth huge premiums because of their DP prowess. Nobody takes that into account, though, when comparing GK104 to Tahiti.
#43
buggalugs
Relayer: GK104 was better than Tahiti in perf/W in gaming, and that's it (once both are overclocked there is no perf advantage; Tahiti is typically a little faster). Tahiti is a much better chip by any compute metric (excluding CUDA, of course): stronger in DP, and it destroys GK104 in OpenCL and scrypt (what mining uses). For some reason the Titan and (especially) the Titan Z are worth huge premiums because of their DP prowess. Nobody takes that into account, though, when comparing GK104 to Tahiti.
I agree, the 680 wasn't that great. It only just beat the 7970 (by under 10%, and the 7970 won plenty of games), and that was with a highly overclocked, boost-enabled 680. The GHz Edition changed things again, then Nvidia rushed out the 780. Then they had the cheek to release the Titan for $1,000, knowing the 780 was just weeks away and would demolish the Titan for much less money.

I look at it very negatively in that Nvidia decided to screw consumers and milk them for another upgrade.
#44
HumanSmoke
xenocide: Show me a single benchmark where a 4 GB variant of a normally 2 GB card significantly outperforms at higher resolutions. I'll wait. I haven't found a single one.
Very true. It's almost as if some people believe that adding memory makes a cumulative difference, when the fact is that the chip architects ACTUALLY know what they are doing in making a balanced design. For a 4 GB card to show a marked improvement over a 2 GB version, it must mean that the original design wasn't balanced in the first place - and that simply doesn't happen in modern GPU design. The other scenario would have to be that the lower framebuffer was allied with a less functional die.
xenocide: Alright, let's get this over with. Stop acting like the product code immediately determines the value. You wanna know why the GTX 680 was the GK104? Because it was better than AMD's best offering at the time, substantially better than the GTX 580, and absolutely destroyed the GTX 560. Everyone keeps acting like just because it says GK104 instead of GK100 or GK110 it's suddenly crap--it's not.
Actually, probably a case of both vendors looking over the fence. AMD's die size has been steadily increasing as they see the value of compute - traditionally (at least since G80) something that Nvidia incorporated wholesale. Nvidia noted how not every GPU needed to check every feature box. AMD's Barts GPU sold very well despite having a complete lack of double precision, but Nvidia also commands the lion's share of the pro market and needs compute - hence the bifurcation of the product lines. The top part gets all the bells and whistles; the second-tier parts and down are stripped down to the bare minimum to save die space and power.
xenocide: If the GTX 880 comes out as a GM204 part on 28 nm, and offers performance that exceeds the GTX 780 Ti while using less power for ~$500, you cannot tell me you would consider it crap.
I don't think you can judge performance and price in a vacuum. How the card is viewed will be as much about how it stands against its competition as about its own features. R600 might have been judged a pretty fair GPU had the G80 and G92 not bracketed its release.
AMD have a record of playing their GPU cards close to their chest (something the CPU division could learn from), so I'll reserve judgement for the time being. I wouldn't be at all surprised to see a re-run of the previous few generations with the GM204 vs. Pirate Islands episode. When people are arguing about superiority where the difference is a few percentage points, and an optimized game bench or two can swing the result one way or the other, it seems that neither side will put daylight between their product and the competition.
#45
64K
I think it's fair to judge the GTX 680 as upper midrange. I certainly had no complaints about performance when I got mine, nor do I have any complaints about performance now.

www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/32.html

When you compare the true Kepler flagship, the GTX 780 Ti, to the GTX 680, it's clear where the GTX 680 fits on the Kepler scale.
#47
NationsAnarchy
Aditya: Yeah, I wonder the same. Didn't the old all-powerful GTX 295 have an 896-bit memory interface? Maybe the higher clock speeds are compensating for the reduced width.
Well, if you pay attention to the topic, a lot of reasons have been given, lol.
#48
cadaveca
My name is Dave
Relayer: You actually make sense though. Either that or I'm Tweedle Dee to your Tweedle Dumb. :toast:
Just because it makes sense doesn't make it right, though. The listed memory and shader counts don't make sense to me, and to me they point to a dual GPU... W1zz is probably bang-on as to why there might be an 8 GB listing, however. I hadn't considered that it might have to do with memory controller testing, and that makes even more sense to me. 3,000 shaders tops the 780 Ti, though.
#49
rtwjunkie
PC Gaming Enthusiast
I totally agree with W1zz, in that it makes sense to send an ES with all kinds of goodies on it. It allows you to test many different variations on one card. Plus, it keeps all of us in a guessing frenzy as to what the specs will be on release!
#50
arbiter
Fluffmeister: Just because GK104 was mid-range in the Kepler hierarchy doesn't mean it warranted mid-range prices.

People seem to be quick to forget it launched being both cheaper and faster than the competition:
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/1.html
The GTX 680 was the top-of-the-line card for that series, and it was a GK104 chip. It wasn't midrange at the time it was released.


Don't see why everyone assumes it's a 256-bit memory bus, as it would be a bit of a downgrade performance-wise if they did that. 256-bit was nothing but a rumor, so best to leave it as one till proper specs are released.