ICYMI, NVIDIA GeForce RTX 4080 12GB Uses 192-bit Memory Bus

There is no excuse for a 192bit VRAM bus on a TOP TIER CARD. It's stupid, plain and simple.
They said the same thing when bus width was being reduced, starting with the GTX 280 at 512-bit, then down to 384-bit with the GTX 480 and 580, and finally down to the usual 256-bit with the GTX 680. While 192-bit does seem slim for a high-end card, I'm more concerned with there being two variants of the 4080 with no clear distinction other than the memory size, with the bus width itself only listed if one digs into the specs of the particular card being viewed. It seems like it should be a 4070 to me. Anyway, the reviews will tell the tale and show whether or not the card really does suffer from a narrower bus.
 
It's just marketing. Last gen, Nvidia introduced a new tier, the x90, in order to be able to upsell their customers. Now they are calling the 4070 Ti a 4080. Marketing aside, all that matters is price/performance, and that we'll have to see.

The price is very stiff, already, for a world in recession. Let's hope performance makes up for it, at least partly.
 
They said the same thing when bus width was being reduced, starting with the GTX 280 at 512-bit, then down to 384-bit with the GTX 480 and 580
True, and that was a bit of a thing at the time, but things worked out. And if they work out for this gen in a positive way, then OK.
I'm more concerned with there being two variants of the 4080 with no clear distinction
Agreed. This is not good.
Anyway, the reviews will tell the tale and show whether or not the card really does suffer from a narrower bus.
Also agreed. The proof is in the pudding, as they say...
 
12GB over 192-bit is fine, although that puts the 4080 in the league of the RX 7700 XT, if that one keeps the RX 6700 XT memory/bus pattern

well, at least 192-bit does not hinder the RX 6700 XT :oops:

edit: the 4080 12GB is technically the 4070 o_O i wonder how the 4070 will look... because if it has better specs on memory and is close to the 4080 12GB in CUDA, the 4080 12GB is basically DOA
 
12GB over 196-bit is fine, although that puts the 4080 in the league of the RX 7700 XT, if that one keeps the RX 6700 XT memory/bus pattern

well, at least 196-bit does not hinder the RX 6700 XT :oops:
Those 4 extra bits do make all the difference! :cool:
 
This is what many people are saying. This was the original 4070 that Nvidia decided to name "4080 12GB" and is now trying to sell for $200 more than what they were probably planning.
 
It's the GTX 970 moment all over again, but amplified, and with the x80 chip (what was once the flagship single-chip card of the generation). For shame.
Waiting for a 3dfx-like dramatic deflation of Nvidia and a shameful slip into oblivion. :D
 
12GB and a 192-bit memory bus for $899?! :kookoo: The crazy cryptocurrency-era prices have been kept. Nvidia wants to sell roughly an RTX 4060 12GB as an RTX 4080 :nutkick:
The RTX 20XX series was a 5-10% performance increase over the GTX 10XX. RT on the first RTX series cards is useless at 4K (2K). RT on RTX 30XX series cards is not practically usable for playing without DLSS.
The prices of graphics cards didn't just rise, they became detached from the hardware market. These are prices from the cryptocurrency market, completely detached from reality.
In addition, the performance without artificial enhancers was tragic: the RTX 3090 Ti offered 30-60 FPS in 4K with RT for $2,000 :banghead: Now the new RTX with the new DLSS will probably not be compatible with the two previous RTX series.
First: it could then be exactly the same with each series; you either buy the new series or lose support and playability, like the loss of support when a new Android or iOS version is released.
Second: the prices are so high that you can buy an OLED TV and a console, and the most expensive card, just the GPU, still costs more. The prices of PC components have gone from basic home entertainment to insane, like some premium goods.
 
192-bit is the bare minimum I'd ever go for, and only in a high mid-ranger; I've just never seen one in a high-end card before... and they charge bundles for it, lol. I can see this card becoming a discount king and possibly devaluing the 4070s. Although, this does raise the question of whether Nvidia is going to water down the rest of the 4000 series below the 4080. It does not bode well. I can see them doing just that and charging big bucks for the 4070s and under with 192-bit, 4060s going back to 128-bit, and so on.

AMD might just win this next gen if they can get their act together. The 7000 series was their best era in the past.
 
Lol.

The 4080 12GB is £950 (USD to GBP at the current exchange rate, plus 20% VAT)

The equivalent previous-gen card, the 3060 Ti (the same x04-class silicon), cost just £350 on the same basis at launch

This is almost three times the price.
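
For anyone checking the math, a minimal sketch of that conversion (the exchange rates here are assumptions, as they move daily; MSRPs are US pre-tax launch prices):

```python
# Rough sketch of the price comparison above. Exchange rates are
# assumptions (they move daily); MSRPs are US pre-tax launch prices.
UK_VAT = 0.20

def uk_price(usd_msrp: float, usd_to_gbp: float) -> float:
    """US pre-tax MSRP -> UK price including 20% VAT."""
    return usd_msrp * usd_to_gbp * (1 + UK_VAT)

rtx_4080_12gb = uk_price(899, usd_to_gbp=0.88)  # assumed late-2022 rate
rtx_3060_ti = uk_price(399, usd_to_gbp=0.74)    # assumed late-2020 rate

print(f"4080 12GB ~£{rtx_4080_12gb:.0f}, 3060 Ti ~£{rtx_3060_ti:.0f}, "
      f"ratio {rtx_4080_12gb / rtx_3060_ti:.1f}x")
# -> 4080 12GB ~£949, 3060 Ti ~£354, ratio 2.7x
```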

I hope some of the people ranting about the 6500 XT will give this turd the proper slating it deserves
 
GALAX confirms AD102-300, AD103-300 and AD104-400 GPUs for GeForce RTX 4090/4080 series

[Image: GALAX RTX 40-series GPU spec chart]


It was expected, of course, based on the final specs.
So $900 for a 12GB, 192-bit, sub-300mm² chip!
The leakers are saying only Navi31 this year, and the maximum cut-down Navi31 shouldn't be less than 160 RBs / 8960 SP / 320-bit bus / 20GB in the worst-case scenario, with at least 4080 16GB raster performance (and logically a lot more, depending on frequency and how far cut down it is).
So Nvidia will start this year at $899; I wonder what SRP AMD will give the cut-down Navi31 if that's all it has for this year!
 
If those prices are correct, good luck to them selling them. The 4080 12GB looks like it'll be gimped in the only use-case scenarios where it would make sense to buy one.
 
Of course, memory bandwidth is no way to compare the RTX 40-series with its predecessors; there are a dozen other factors that weigh into performance, and what matters is that you're getting generationally larger memory amounts with the RTX 4080-series.
Something Nvidia did in A100 was to add 40MB of L2 cache, compared to 4MB or 6MB in previous x100 iterations. Ampere graphics GPUs still had only a couple megs of L2 cache, with GA102 topping out at 6MB. RTX 4080 has 48MB and RTX 4090 has 96MB of the stuff. This sounds like an Infinitely familiar change for some reason :D
 
The reason why the AD10x chips can get away with lower memory bandwidth is the much-increased L2 cache. It's the same reason the 6900 XT with a 256-bit bus was able to compete with the 3080 (320-bit) and 3090 (384-bit).
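
As a back-of-the-envelope illustration of the mechanism (the hit rates below are made-up assumptions, not measured figures):

```python
# Traffic served from cache never touches DRAM, so the bus only carries
# the misses; a big L2 can thus offset a narrow bus. Hit rates here are
# illustrative assumptions, not measurements.

def effective_bandwidth(dram_gb_s: float, l2_hit_rate: float) -> float:
    """Effective bandwidth ~= raw DRAM bandwidth / (1 - hit rate)."""
    return dram_gb_s / (1 - l2_hit_rate)

# 4080 12GB: 192-bit @ 21 Gbps GDDR6X -> 504 GB/s raw
# 3080:      320-bit @ 19 Gbps GDDR6X -> 760 GB/s raw
print(effective_bandwidth(504, l2_hit_rate=0.50))  # ~1008 GB/s (assumed 50% hits, 48MB L2)
print(effective_bandwidth(760, l2_hit_rate=0.20))  # ~950 GB/s  (assumed 20% hits, 6MB L2)
```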
 
12GB over 192-bit is fine, although that puts the 4080 in the league of the RX 7700 XT, if that one keeps the RX 6700 XT memory/bus pattern

well, at least 192-bit does not hinder the RX 6700 XT :oops:

edit: the 4080 12GB is technically the 4070 o_O i wonder how the 4070 will look... because if it has better specs on memory and is close to the 4080 12GB in CUDA, the 4080 12GB is basically DOA
7700 XT 16GB: N32 GCD + 4 MCDs = 256-bit. 7700 12GB: N32 GCD + 3 MCDs = 192-bit. 7600 XT 8GB: N33 monolithic die = 128-bit. All rumored.
 
Something Nvidia did in A100 was to add 40MB of L2 cache, compared to 4MB or 6MB in previous x100 iterations. Ampere graphics GPUs still had only a couple megs of L2 cache, with GA102 topping out at 6MB. RTX 4080 has 48MB and RTX 4090 has 96MB of the stuff. This sounds like an Infinitely familiar change for some reason :D

The higher the resolution, the less that cache helps. This is the reason why the Radeons were always slower at 3840x2160 (4K).
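
A rough sense of the working-set numbers behind that claim (bytes per pixel and the render-target count are illustrative assumptions):

```python
# Rough working-set arithmetic behind the "cache helps less at 4K" claim.
# Bytes per pixel and the number of live render targets are assumptions.

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one RGBA8 render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    one = render_target_mb(w, h)
    # a deferred renderer may keep several G-buffer targets live at once
    print(f"{w}x{h}: {one:.0f} MB per target, ~{4 * one:.0f} MB for an assumed 4-target G-buffer")
# 1080p (~32 MB total) fits comfortably in a 96 MB L2; 4K (~127 MB) already overflows it.
```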
 
Ada's 96MB L2 will logically have a lot higher throughput than a 96MB Infinity Cache (L3), but it will also have a higher die-size cost (I don't know if my assumption is correct, but I suspect around 83-92mm² for 96MB on N4; I may be completely off, we will see).
 
The 40xx prices are absolutely insane. It is, however, what I was expecting. Also, 12GB for a 4080 while my 1080 Ti from 6 years ago has 11GB. WTF...

Btw doesn't 1200 euros mean 1450-1500 with taxes?
 
I think saying that in as many words doesn't account for the performance and all the other spec nuances though; let's say it equals or bests a 3090 Ti and costs significantly less, is it stupid then?
Agreed, but I think the 16GB flips it back to ridiculous, personally.
It's a bad play IMHO.
 
The higher the resolution, the less that cache helps. This is the reason why the Radeons were always slower at 3840x2160 (4K).
4K does not increase memory dependence all that much. I would argue that Ampere was simply slow at lower resolutions, particularly due to fill-rate concerns: too few units at a comparatively low frequency. At high resolutions, shading power became the limiting factor, and Ampere had more of that than RDNA2. In Ada, Nvidia has basically doubled the ROP count.

Btw doesn't 1200 euros mean 1450-1500 with taxes?
EU prices include taxes.
 
To be fair, if the 4080 is 192-bit, what will the 4070 or 4060 be? 128-bit and 64-bit?

They have typically capped lower-tier cards at 128-bit; that doesn't mean several different chips don't use the same memory bus (see Maxwell: the 750 Ti and the 960 both shipped with the exact same width!)

I expect the 4070 to be 192-bit (but only GDDR6 at 18Gbps, to save cost and power),
the 4060 Ti and maybe the 4060 at 160-bit, and the 4050 at 128-bit!

We just went through a two-year transition to 2GB-density chips, so I'm not expecting anything less than 128-bit in any of these cards! It's going to be half a decade before we see 8GB-on-64-bit video cards!
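
For reference, a minimal sketch of how bus width pins down capacity (assuming one chip per 32-bit channel, no clamshell mode, and today's common 2GB/16Gbit GDDR6 parts):

```python
# Why bus width pins down VRAM size: each GDDR6/GDDR6X package sits on its
# own 32-bit channel, so capacity = (bus width / 32) * per-chip density.
# Assumes one chip per channel (no clamshell) and 2GB (16Gbit) chips.

def vram_gb(bus_width_bits: int, chip_density_gb: int = 2) -> int:
    return (bus_width_bits // 32) * chip_density_gb

for bus in (64, 128, 160, 192, 256, 384):
    print(f"{bus}-bit -> {vram_gb(bus)} GB")
# 192-bit -> 12 GB (exactly how a "4080 12GB" lands on 12 GB), and 8 GB on
# a 64-bit bus would need 4GB chips that don't exist yet.
```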
 
The 4080 12GB is just a rebranded 4070 to justify wringing more money out of customers. The branded 4070 is going to look more like a 4060 spec-wise. This way they sell what was going to be a mid-tier card as if it were a high-end card.

It's to be expected given what Jensen was saying with regard to the stock price and profits at the shareholder meeting: market manipulation to inflate profits. Nvidia is doing this by misbranding their products to justify a higher price than what customers would normally bear for a similar-tier product.

I won't be surprised if more AIBs start dumping Nvidia partnerships after this generation as Nvidia is going to do whatever they can to take up more market share with the FE cards.
 
The 4080 16GB looks ultra stupid. 9,728 shaders at 2.51 GHz makes it ~1.4x the performance of the 3080 based on a TFLOPS calculation, coupled with a ~1.4x MSRP increase.
Where's the price-to-performance improvement from generation to generation that we used to get? :banghead:
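
For the curious, the paper-TFLOPS arithmetic behind that kind of comparison (FP32 rate = 2 FLOPs per shader per clock, i.e. one FMA; the exact multiple depends on which boost clocks and which baseline card you assume):

```python
# Paper shader-TFLOPS: FP32 rate = 2 FLOPs (one FMA) per shader per clock.
# Boost clocks are the official figures; the gen-on-gen multiple shifts
# with whichever clocks and baseline card you plug in.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000

print(f"RTX 4080 16GB: {fp32_tflops(9728, 2.51):.1f} TFLOPS")  # ~48.8
print(f"RTX 3080 10GB: {fp32_tflops(8704, 1.71):.1f} TFLOPS")  # ~29.8
```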
 