
ASUS TUF Gaming GeForce RTX 3090 Ti Box Pictured

[Attached image: 20201212_112059.jpg]


That 3090 Ti box looks just as sexy as mine :D
 
450 W, haha, sorry, had to laugh.
 
I'm actually quite happy I'm not buying an Ampere GPU, but that's what I've been saying since it launched. The whole generation is an utter mess: TDPs through the roof, VRAM is hit or miss, and RT is still early-adopter nonsense. And it's not TSMC 7 nm but Samsung's shakier 8 nm node, which ties directly into the stack's other issues.

Then again, I'm also not buying an RX GPU :D But that's just an availability issue. The generation itself is solid: a normal product stack, proper balance, and, as per AMD's usual mojo, slightly behind on feature set.

All things considered, it's not a huge issue that stuff's hardly available, as long as you have a working GPU.

Yeah, it's like the people who made those stupid lineup decisions have very low intelligence.
RTX 3080 10 GB and RTX 3090 24 GB is absolute nonsense.

In the worst case, it should have been:
RTX 3080 12 GB and RTX 3090 16 GB,
or RTX 3080 16 GB and RTX 3090 20 GB.
 
It still doesn't make sense either way, because the 12 GB RTX 3080 Ti exists.
 
Judging from the box length, this is some super-big card. Can't wait for Jensen to come out with BS statements again about power efficiency with this 450 W card. Also, this box is as much as I will see of this card. :laugh:
 
Next up: AMD with a 32-bit GPU backed by like 4 MB of Infinity Cache for a paltry $250.
 
Does this come with the external foot heater extension?
 
A picture of a box: pretty much the only thing anyone will see of this card.
 
Three high-end GPUs... for a three percent difference in performance.

Almost all of those suggested memory combinations are impossible.
 
Micron made 8 Gbit (1 GB) GDDR6X chips, which obviously wasn't enough. I wonder: if they couldn't make 2 GB chips, why not compromise and make 12 Gbit (1.5 GB) GDDR6X chips?

The RTX 3080 especially would have been pretty sweet with 15 GB of VRAM (10 chips × 1.5 GB per chip).
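Just to show the arithmetic: a quick sketch of what that hypothetical 12 Gbit (1.5 GB) chip would have yielded on the cards' actual bus widths (the 1.5 GB density is my speculation; the 320/384-bit buses are the real 3080/3090 ones):

```python
# Hypothetical: VRAM capacities with 1.5 GB GDDR6X chips.
# Each GDDR6X chip sits on a 32-bit channel, so chip count = bus width / 32.
CHIP_GB = 1.5  # hypothetical 12 Gbit chip; Micron only shipped 8 Gbit at launch

for name, bus_bits in [("RTX 3080", 320), ("RTX 3090", 384)]:
    chips = bus_bits // 32
    print(f"{name}: {bus_bits}-bit bus -> {chips} chips x {CHIP_GB} GB "
          f"= {chips * CHIP_GB} GB")
# RTX 3080: 320-bit bus -> 10 chips x 1.5 GB = 15.0 GB
# RTX 3090: 384-bit bus -> 12 chips x 1.5 GB = 18.0 GB
```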
 
Well, for a typical build, a 4-slot card would be okay, as not many people have a sound card or other add-in cards anymore.

And does Halo Infinite also have unlimited FPS in the menu, or what? I totally missed that.
True, but in my case it would stop me from testing another vendor's card on the same PC. Testing can lead to weird things: a few days ago I had to test an issue with Flatpak and video decoding where, if you had an Nvidia GPU accompanied by any other dedicated card, decoding would fail and make the app crash. A 4-slot card would block me from testing that.

Halo's menu seems to be like New World's: a GPU toaster.
 
For the incompetently stupid Nvidia engineers? :D Yes!
No, it's impossible, for a simple reason:
every memory chip is bound to a 32-bit bus,
and you have 8 Gbit (1 GB) chips.

With a 320-bit memory bus you can use 10 or 20 memory chips, and nothing in between (without making the card a complete mess that's barely functional as soon as the "odd" memory gets used, like with the 970).
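A minimal sketch of that constraint, assuming the standard GDDR6X layout: one chip per 32-bit channel, optionally doubled up in clamshell mode (which is how the 3090 fits 24 chips on a 384-bit bus):

```python
# Valid VRAM capacities for a given bus width and chip density.
# Each chip occupies one 32-bit channel; clamshell mode mounts two chips
# per channel (front and back of the PCB), doubling capacity only.
def valid_capacities(bus_bits: int, chip_gb: float = 1.0) -> list[float]:
    channels = bus_bits // 32
    return [channels * chip_gb,       # one chip per channel
            channels * chip_gb * 2]   # clamshell: two chips per channel

print(valid_capacities(320))  # [10.0, 20.0] -> the 3080's only clean options
print(valid_capacities(384))  # [12.0, 24.0] -> the 3090 uses the clamshell one
```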
 
10-15% more performance at 4K from the increased memory bandwidth plus 2 extra SMs, best case.
 

There are two workarounds to that constraint, the second being what the infamous 3.5/4 GB 970 used. But you are wrong here, because the 970 has a 256-bit memory interface (MI), so in theory it should function normally with its full 4 GB. It has only 3.5 GB at full speed for another reason: a partially disabled L2/ROP partition left the last 0.5 GB on a slow path.
And yes, Nvidia has already designed other models with decoupled MI-bandwidth-to-VRAM-capacity ratios :D

And first, you have to design the MI with the memory capacity in mind, not the other way round.

If you want 16 GB, then give the card a 256-bit MI.
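Following that logic, a small sketch (same assumptions as above, 1 GB chips): pick the target capacity first, then solve for a bus width that reaches it cleanly; a 256-bit MI gets to 16 GB with 1 GB chips in clamshell.

```python
# Sketch: solve for bus widths that hit a target capacity cleanly,
# given a chip density (1 GB here) and optional clamshell mounting.
def bus_for_capacity(target_gb: float, chip_gb: float = 1.0) -> list[int]:
    options = []
    for chips_per_channel in (1, 2):          # normal vs. clamshell
        channels = (target_gb / chip_gb) / chips_per_channel
        if channels.is_integer():
            options.append(int(channels) * 32)
    return options

print(bus_for_capacity(16))  # [512, 256]: 16 channels straight, or 8 clamshell
```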
 
So you want to reduce the card's performance just to get a couple of gigs more VRAM, for no reason?
 
450 W, and people whine at ADL :laugh:

How long before American power sockets can't handle the load of both a CPU and a GPU on the same fuse?
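Back of the envelope, assuming a standard US 15 A / 120 V branch circuit and the usual 80% continuous-load rule (the component wattages below are illustrative, not measured):

```python
# Rough headroom check for a 450 W GPU on a US 15 A / 120 V circuit.
# The 80% continuous-load rule leaves 0.8 * 15 A * 120 V = 1440 W usable.
circuit_w = 0.8 * 15 * 120          # 1440 W continuous budget

# Illustrative system draw (assumed figures):
system_w = 450 + 250 + 150          # GPU + CPU + rest of the system
wall_w = system_w / 0.9             # a ~90%-efficient PSU pulls more at the wall

print(f"Wall draw ~{wall_w:.0f} W of {circuit_w:.0f} W budget "
      f"({wall_w / circuit_w:.0%})")  # ~944 W of 1440 W (66%)
```

So a single 450 W card still fits, but stack a beefy CPU, a second GPU, and a monitor on the same circuit and the margin shrinks fast.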
 