
NVIDIA GeForce RTX 3080 Ti GA102-225 GPU Pictured and Detailed

I'm going to buy one of these when they become available and reasonably priced. So 3-4 years from now. :laugh:
I was really lucky (actually, I have a good supplier I buy nearly all my expensive IT stuff from, so he treats me well) and managed to get a Strix 3090 for almost MSRP (1,850 Euros) when they first came out. Now I see them being sold for 3,000 Euros (!!!), and that is if you can find one in stock!!!

Still have my old Strix 2080 Ti gathering dust in the cupboard, but I ain't selling that either, despite them currently going for 1,000 Euros on the second-hand market (almost what I paid for it NEW)! I suspect that if something goes wrong with a 3000-series card (knock on wood), with the current supply issues they would rather give you your money back than a replacement card, so at least I have something to fall back on.
 
Availability:
[screenshot]
 
This could be a good thing for supply, if a shortage of GDDR6X is what prevents them from shipping more 3090s. Instead of selling more 3080s, they may have calculated they can make an extra $300 per card with a new SKU that has half the RAM of the 3090, while simultaneously shipping more cards in total. So, optimistically, this could be a win-win, assuming one is in the market for a >$1000 card.
 
Maybe we should take out ads on Fox News to advertise crypto to the crowds of easily swayed old folks. It'd be the Chinpokomon effect, and then it won't be cool or profitable to mine anymore! The monopoly-money scheme will fall flat for what it is, the miners will tire of grandma wanting tech support for her "nintendo money box-a-machine", and then we'll have a market back to normal.

EDIT: call it.....Matlock coin!
 

YES. And have Tom Selleck hawk some reverse mortgage scheme involving crypto. They all trust Selleck, every last one of 'em
 
Good thing we'll still be able to buy a picture of a graphics card on eBay.
 
Great.......another freaking $1k gpu...:roll:
 

Yep, even if this pandemic weren't around GPUs would still be overpriced regardless.

AMD and Nvidia are less competing and more pricing around each other without disrupting profit margins.
 
So this will perform almost exactly like an RTX 3090, considering it has nearly the same core count and the same 384-bit bus, just with half the memory.
 
Great.......another freaking $1k gpu...:roll:
It's $1k on paper. Consider that the RTX 3080 is $699 on paper, but the actual price currently is probably around $900 or more, depending on the model. So I am not surprised this will take over the RTX 3090's price bracket in the current situation.
 
Where is it $900 USD for a 3080? I've seen about $1,500-$2,200 USD for a 3080.

Unless you mean the MSRP....
 
The problem is that two 6800 XTs draw a lot more power than one of these.
This... and also 2 x 6800 XT is definitely not anywhere near the $1,000 range :D
Also, looking at it that way, the 60-70 MH/s that the guy stated for the 6800 XT doesn't look so good anymore either, when the RTX 3060 does 50 MH/s and the RTX 3060 Ti crosses 60 MH/s
 
118 MH/s doesn't seem all that great imo for a 3080 Ti-level card. I mean, I think I read 6800 XTs get like 60 or 70... and they are like 1/3 to 1/4 the price of what this will be scalped for.
Ampere easily beats RDNA2 in mining, especially when tweaked - Go look up calculators if you want proof
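For anyone weighing these numbers, a quick back-of-the-envelope sketch in Python puts the cards on a common price-per-hashrate footing. The hashrates are the rough figures quoted in this thread; the street prices are hypothetical placeholders I picked for illustration, not real quotes:

```python
# Back-of-the-envelope: dollars per MH/s for the cards mentioned above.
# Hashrates are the approximate figures quoted in the thread; the prices
# are made-up placeholders standing in for scalped street prices.
cards = {
    "RTX 3080 Ti": {"mhs": 118, "price": 2000},
    "RX 6800 XT":  {"mhs": 65,  "price": 1300},
    "RTX 3060 Ti": {"mhs": 60,  "price": 900},
}

for name, c in cards.items():
    usd_per_mhs = c["price"] / c["mhs"]
    print(f"{name}: {usd_per_mhs:.1f} $/MH/s")
```

Lower $/MH/s means a miner recoups the card faster, which is why raw hashrate alone doesn't settle the argument either way.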

So this will perform almost exactly like an RTX 3090, considering it has nearly the same core count and the same 384-bit bus, just with half the memory.

Probably since 24GB VRAM is overkill for gaming

The problem with the 3080 Ti is that its price will be higher than the 3090's was on release, so it's pointless. The 3090 came out 8 months ago, and I bet it will be 2-4 more months until a 3080 Ti can be bought, so people will have pretty much waited a full extra year to buy a lesser version of the 3090 for a higher price.

By Sep 2022 the 4000 series will launch, and hopefully GPU availability will be back to normal
 
They will show up on sites like Ebay for between $2,500 and $3,000 and some people will buy them anyway.
 
Probably since 24GB VRAM is overkill for gaming

Directly, yes, but not if you are anything like me, with a PC running 24/7 and tens of Firefox windows and tabs open simultaneously (yes, those people do exist lol) with GPU hardware acceleration enabled.

In such a case (especially if you spend a lot of time on Youtube) VRAM usage from the browser alone can reach as high as 8 GB - on a 12 GB card that would leave only 4 GB available for games. Definitely not enough if you are running at 4K.

Of course, you could always exit Firefox before running a game to reclaim all that memory, but with 24GB VRAM you really don't need to - convenience wins!
 
I don't think there are enough facepalms left for this... Let's have a $2,500 GPU so we don't need to close them tabs... but still go on and express our anxiety about VRAM never being quite enough.
 
That is actually only ONE of the reasons, obviously. But facepalm away! :)
 
...and facepalm away I will... oh, don't forget, you can splash another $2,500 on a second 3090, SLI them, and kill your anxiety once and for all!
 
Directly, yes, but not if you are anything like me, with a PC running 24/7 and tens of Firefox windows and tabs open simultaneously (yes, those people do exist lol) with GPU hardware acceleration enabled.

In such a case (especially if you spend a lot of time on Youtube) VRAM usage from the browser alone can reach as high as 8 GB - on a 12 GB card that would leave only 4 GB available for games. Definitely not enough if you are running at 4K.

Of course, you could always exit Firefox before running a game to reclaim all that memory, but with 24GB VRAM you really don't need to - convenience wins!

I use Chrome and have 25+, often 50+, and sometimes 100+ tabs open with GPU hw accel enabled, and I'm not using anywhere near that. It makes no sense; your VRAM usage is allocation, not the required amount.

More VRAM, more usage (higher allocation). This is nothing new; you would be able to do the exact same thing on an RTX 2060 with 6GB.
 

This is common, especially among gamers. As you said, VRAM needed isn't the same as VRAM allocated.

Gamers will definitely know whether or not they actually need more VRAM, because the GPU will start using system RAM, which is much slower than VRAM.
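As a side note, one quick way to see how much VRAM is allocated overall is `nvidia-smi`, which ships with the NVIDIA driver on both Windows and Linux. A minimal sketch (the helper names are my own; note the tool reports *allocated* memory, not what applications strictly need):

```python
import subprocess

def parse_vram(csv_line: str) -> tuple[int, int]:
    """Split one 'used, total' CSV line from nvidia-smi into two MiB integers."""
    used, total = (int(x.strip()) for x in csv_line.split(","))
    return used, total

def query_vram() -> tuple[int, int]:
    """Ask nvidia-smi for allocated vs total VRAM on the first GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"], text=True)
    return parse_vram(out.splitlines()[0])

if __name__ == "__main__":
    used, total = query_vram()
    print(f"{used} MiB allocated of {total} MiB")
```

On a machine without an NVIDIA GPU the subprocess call will simply fail; the parsing helper is separate so it can be checked on its own.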
 
I use Chrome and have 25+, often 50+, and sometimes 100+ tabs open with GPU hw accel enabled, and I'm not using anywhere near that. It makes no sense; your VRAM usage is allocation, not the required amount.

Tell that to Mozilla then. :)

Actually, I think the problem is related to a combination of Firefox and YouTube: it looks like a memory leak of some kind, but in VRAM. It's not just the number of tabs you have open, but also the amount of time you have Firefox running while browsing the internet and continuously watching tons of YouTube videos.

As I mentioned, my PC is on 24/7 and I only tend to reboot it whenever installing an update - so that can mean a month of runtime or more, for instance. Leaks eventually pile up - current VRAM usage is 4455 MB but I have seen it eventually go as high as 8 GB.

First time I noticed the issue was when playing Wolfenstein II The New Colossus (a game that likes to load a ton of textures onto VRAM, IIRC it could actually use up to 8-9GB?) on my old 2080 Ti (with 'only' 11GB VRAM): I launched the game and it was running slow as molasses, but actual GPU usage as measured by Afterburner on my secondary monitor was very low. That's when I noticed that *PCIe bus usage* was peaking to 100% (normally it's negligible) and that VRAM usage was maxed out. Basically the game could not load all the textures it needed into VRAM and so it was swapping them out 'on the fly'. Exiting the game showed why: Firefox was using nearly all of the available video memory.

To prevent other users from hurting themselves while facepalming (eheh), the MAJOR reason I upgraded from a 2080 Ti to a 3090 was actually the HDMI 2.1 support. Together with a 48" LG CX OLED, this meant I could FINALLY experience my games at 4K 120Hz HDR with full chroma sampling, all at the same time. I had been waiting for HDMI 2.1 support for a very long time, as until then I was limited to 60 Hz on my LG 43" 4K non-HDR IPS monitor.

The 3090 was also faster than a 3080 (I do like to run games with ray tracing enabled, when available) and the huge amount of VRAM was (to me) a big bonus given the above. The fact that I got my 3090 for basically MSRP, thus much less than people are paying for a 3080 these days, makes this a win-win, sorry. :)

More VRAM, more usage (higher allocation). This is nothing new; you would be able to do the exact same thing on an RTX 2060 with 6GB.
Not sure what led you to say something like this? Games won't normally allocate more than they actually require, and Firefox usage will increase over time because this is likely a memory leak (not sure if memory fragmentation can occur on VRAM, but it's also a possibility).
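If you suspect a slow VRAM leak like the Firefox one described above, a crude way to confirm it is to log allocated VRAM over time and see whether it only ever creeps upward. A rough sketch, assuming `nvidia-smi` is on the PATH (the helper names and the one-minute interval are arbitrary choices of mine):

```python
import subprocess
import time

def parse_used_mib(output: str) -> int:
    """Parse the first line of nvidia-smi's 'memory.used' output into MiB."""
    return int(output.strip().splitlines()[0])

def vram_used_mib() -> int:
    """Query allocated VRAM on the first GPU, in MiB."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return parse_used_mib(out)

if __name__ == "__main__":
    # Log a timestamped reading once a minute; a line that keeps rising
    # and never falls back after closing tabs suggests a leak.
    while True:
        print(time.strftime("%Y-%m-%d %H:%M"), vram_used_mib(), "MiB")
        time.sleep(60)
```

Redirect the output to a file and leave it running for a day or two; the trend is more telling than any single reading.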
 
This is common, especially among gamers. As you said, VRAM needed isn't the same as VRAM allocated.

Gamers will definitely know whether or not they actually need more VRAM, because the GPU will start using system RAM, which is much slower than VRAM.

For sure, when you run out of VRAM you will know. Very noticeable stutter will occur: very low fps dips, often to 0 and back up.

Last time I personally experienced this, was in Bad Company 2 with a GTX 570 I think it was. 1.25GB VRAM maxed out.

Not sure what led you to say something like this? Games won't normally allocate more than they actually require, and Firefox usage will increase over time because this is likely a memory leak (not sure if memory fragmentation can occur on VRAM, but it's also a possibility).

Tons of game engines allocate all VRAM (or 80-90%). COD games usually do, for example.

Generally you can't really trust the VRAM Usage, it does not tell you much. If you are not stuttering, you have enough
 
Tons of game engines allocate all VRAM (or 80-90%). COD games usually do, for example.

Generally you can't really trust the VRAM Usage, it does not tell you much. If you are not stuttering, you have enough

Ah, sure, VRAM allocated by a game is not the same as actual VRAM usage, just the 'maximum' the game thinks it might need; I understand that. That's not what is happening with Firefox though, as it allocates VRAM on an 'as needed' basis.

So far I haven't seen any game max out VRAM usage on my 3090, though, even with Firefox gobbling up tons of it. :) Games simply cannot allocate ALL of the VRAM to themselves, as the Windows DWM also uses it for desktop composition (plus modern browsers), etc.
 