
ASUS GeForce RTX 3090 STRIX OC

If you trick out RTX at native 1080p with shadows, reflections, and global illumination, I wouldn't be surprised if this card could only handle it at 60 fps. Mind you, that's still just 1 bounce per pixel, with denoising to smooth out the light gaps. DLSS isn't free either -- it at least costs training time to ensure no artifacts for a given game, and I'm sure NVIDIA charges for that training. Full-fidelity path-traced graphics isn't as cheap as it's been made out to be.
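Rough back-of-the-envelope math on what "1 bounce per pixel" means for the ray budget at 1080p60 (the rays-per-sample figure is my own assumption, and this ignores shading, shadow rays, and the denoising pass):

Code:
# Illustrative ray-budget estimate; every figure here is an assumption.
width, height = 1920, 1080
fps = 60
samples_per_pixel = 1        # "1 bounce per pixel" before denoising
rays_per_sample = 2          # assume one primary ray + one bounce ray

pixels = width * height
rays_per_frame = pixels * samples_per_pixel * rays_per_sample
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame:,} rays per frame")                 # ~4.1 million
print(f"{rays_per_second / 1e9:.2f} gigarays per second")   # ~0.25, before shading/denoising cost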

For those who want 1080p at 360 Hz, isn't it better to just lower the details on a mid-range, high-frequency GPU, since they only care about the competitive advantage? Having RTX and all sorts of other detail in the scene just makes it harder to discern the opponent anyway.
That's the problem. In modern games, especially BRs with large open environments and a lot of structures, the FPS drops (fluctuates) well below 240 fps even at minimum graphics settings at 1080p. Forget 360 fps and fully utilizing 360 Hz, even on a 3090.

However, 360 Hz obviously offers better input lag reduction even at lower FPS due to the faster scanout, which is why I'm still buying one.
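A quick sketch of the scanout part of that latency chain, as I understand it (simplified; it ignores display processing and pixel response time):

Code:
# The panel is refreshed top-to-bottom once per refresh interval, so a new frame
# takes roughly one refresh period to finish reaching the bottom of the screen,
# independent of the game's frame rate. Simplified model for illustration.
for refresh_hz in (144, 240, 360):
    scanout_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:>3} Hz -> ~{scanout_ms:.2f} ms per scanout")
# 144 Hz -> ~6.94 ms, 240 Hz -> ~4.17 ms, 360 Hz -> ~2.78 ms,
# so the 360 Hz panel shaves a couple of ms off every frame even below 360 fps.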
 
I would absolutely love to get my hands on this card, but with it being $1800, it's going to be really hard for me to nab it up. Might have to wait for the possible 3080 20GB (Ti?) to come out and see the performance.

Also,
We also did a test run with the power limit maximized to the 480 W setting that ASUS provides. 480 W is much higher than anything else availalable on any other RTX 3090, so I wondered how much more performance can we get. At 4K resolution it's another 2%, which isn't that much, but it depends on the game, too. Only games that hit the power limit very early due to their rendering design can benefit from the addded power headroom.

Got a typo there in the conclusion.

Also, I'd love to see WoW added back into the reviews. :)
 


Ouuufffff! Might be worth getting just to see what it does to legacy benchmarks like 3DMark 01 and 03, combined with, say, a 10900K in the 5.3-5.4 GHz range on AIO. A non-exotic way to grab some HWBOT records.

Oh yeah, new games and all that other stuff in the review. Awesome tech, I guess... Not gonna lie, I skipped all those game tests since I literally do NOT own any of the new stuff. (Does No Man's Sky count, circa 2016? :D)

But yeah, crazy fillrate! A definitive HWBOT record taker as far as the benchmarking and overclocking community is concerned. If you are buying this to play Fortnite/PUBG or something, a $60 eBay RX 580 8GB can do that, and has been able to for about 2-3 years now.

...
..
.
 
I doubt Apex Legends is the only MP game that has large FPS fluctuations @1080p even at medium to low settings, and upcoming games will generally be more demanding.

Not to mention 360Hz will only become more common.
All games have significant fluctuations. MP can be worse... but that isn't really dependent on the GPU. That's the nature of MP gaming, man.

360 Hz will become more common, indeed. By that time, the next-gen cards will be out. Like 4K, any decent 360 Hz monitor is quite pricey and overkill for anyone who doesn't have F U money or play competitively where that stuff matters.

You're fighting for the <1%. :)
 
Whether a Titan would benefit really depends on the use case.

Everyone, including Nvidia, is saying it's an 8K gaming card. I agree it will be useful to some, but it isn't a Titan.
Surely you agree with the below; I don't mind power-hungry cards, but that's a lot of heat thrown into any room.
480 W is insane.

I can get close to that with some effort when benching (not doing that anymore, though), but I think it's a tad high for 24/7.
 
This card (just like I predicted) is a lot more power efficient than the RTX 3080:

[chart: performance per watt, 3840x2160]


So your criticism is not entirely sincere, as the RTX 3080 must be an even worse card in your opinion. Also, NVIDIA has a higher performance-per-watt ratio this gen versus the previous-gen cards, so your comparison to Intel isn't valid.

As for me personally, I never buy GPUs with a TDP above 150 W (or CPUs above 95 W) because I simply don't want that much heat under my desk, so I guess I will skip the entire GeForce 30 series, since the RTX 3060 is rumored to have a TDP around 180-200 W.

"Good lord" is misplaced. Again, this is a Titan class card. Either use it appropriately or forget about it - it does not justify its cost purely from a gaming perspective.

Are you kidding me? It's a "whole" 7% more efficient than the 3080, which is very little, and even if it were 20% more efficient, with the ridiculous price difference it'd take a lifetime to make back the initial price gap between these cards, which is huge no matter how you slice it. Even if it is a Titan, I'd be interested in seeing how much better it performs in business applications, etc.; I doubt it's even worth it in that scenario.
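To put a number on "a lifetime", here's a rough break-even calculation; the price delta, usage hours, electricity rate, and power saving are all my own assumptions, not figures from the review:

Code:
# Rough break-even estimate; every input is an assumption for illustration only.
price_delta_usd = 1100.0          # e.g. ~$1800 Strix 3090 vs ~$700 3080
power_saved_w = 40.0              # generous guess for the efficiency edge at equal work
hours_per_day = 4.0
electricity_usd_per_kwh = 0.15

kwh_saved_per_year = power_saved_w / 1000.0 * hours_per_day * 365
usd_saved_per_year = kwh_saved_per_year * electricity_usd_per_kwh
years_to_break_even = price_delta_usd / usd_saved_per_year

print(f"~${usd_saved_per_year:.2f} saved per year")        # ~$8.76
print(f"~{years_to_break_even:.0f} years to break even")   # ~126 years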
 
All games have significant fluctuations. MP can be worse... but that isn't really dependent on the GPU. That's the nature of MP gaming, man.

360 Hz will become more common, indeed. By that time, the next-gen cards will be out. Like 4K, any decent 360 Hz monitor is quite pricey and overkill for anyone who doesn't have F U money or play competitively where that stuff matters.

You're fighting for the <1%. :)
Nah, I'm just saying people shouldn't make blanket statements without acknowledging benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.
360 Hz is supposed to be available starting at the end of this month. Any serious competitive gamer will want one.
 
Sorry, this is not a 1080p card, not by a long shot. If you're here to argue about this card's merits at this low resolution, I don't know what else to say, because I don't have any polite words in my lexicon.

What's next, you're going to argue that an excavator is not a good way of planting seeds? Or maybe you need to play your favourite game at 600fps? Why?

Well, no need to get upset, it's just a video card; let's look at the facts and not made-up stats, shall we?
Nvidia themselves pushed for the new 360 Hz screens, so high refresh rate / frame rate gaming is definitely something they are going for. Now let's look at this review, shall we?

Average FPS at 1080p with this card is about 200 fps... so yeah, we are not even close to touching that newly pushed 360 Hz standard, and not even remotely close to your made-up 600 fps...

sooo yeah.
 
Solid card, too bad it's a "scalper" edition also.
 
Well, no need to get upset, it's just a video card; let's look at the facts and not made-up stats, shall we?
Nvidia themselves pushed for the new 360 Hz screens, so high refresh rate / frame rate gaming is definitely something they are going for. Now let's look at this review, shall we?

Average FPS at 1080p with this card is about 200 fps... so yeah, we are not even close to touching that newly pushed 360 Hz standard, and not even remotely close to your made-up 600 fps...

sooo yeah.

So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance. AMD fans never fail to disappoint with the utmost disrespect towards intelligence and logic.

And according to you games under RDNA 2.0 will magically scale better. LMAO.

sooo yeah.

Here's some harsh reality for you:

[charts: Anno 1800, Borderlands 3, Project CARS 3 at 1920x1080]


Tell me exactly how NVIDIA is supposed to fix this suckfest.
 
There are diminishing returns, and then there's this:
[attached image]

Truly the champ of wasted money.
 
I've ordered one and hopefully I'll receive it before the Cyberpunk 2077 release (I am from Italy).
What can I say, it will probably end up being the fastest 3090, together with the Aorus Xtreme and the Zotac AMP Extreme. The 3080 is for sure better in $/perf, but you're paying for the fact that this card is the fastest GPU on Earth.
 
Nah, I'm just saying people shouldn't make blanket statements without acknowledging benefits that most people don't realize can actually be very noticeable. We all acknowledge the price/performance value of a 3080 over a 3090. However, I do not agree with anyone saying the 3090 makes no noticeable difference at 1080p.
I get you... I don't like blanket statements either, but when everything is covered except for the toes.... :D

Think of it this way too... you're going to be heavily CPU limited at low res trying to feed in all of those frames. In TPU reviews we see much larger, more significant gains at the higher resolutions. To use this on anything less than 4K does it an injustice, IMO. You're so CPU limited, especially at 1080p, that the bottleneck shifts significantly to the CPU with this monster (and the 3080).

Watch the FPS go up with CPU clock increases at that low of a res. Look how poorly AMD does with its lack of clock speed. I'd start off at a static 4.5, then go 4.7, 4.9, 5.1, 5.3, etc... hell, I'd love to see some single-stage results at 5.5+ just to see if it keeps going, so we know how high it scales. The lows aren't due to the GPU at that point (1080p).
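A toy frame-time model of what "CPU limited" means here; the per-frame millisecond costs are invented purely to illustrate the shape of the problem, not measured on any card:

Code:
# Each frame costs some CPU time (game logic, draw calls) and some GPU time
# (shading pixels); whichever is slower sets the frame rate. Figures are invented.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.0                  # assumed CPU cost per frame, roughly resolution-independent
gpu_ms_per_mpixel = 2.0       # assumed GPU cost per million pixels

for name, mpixels in (("1080p", 2.07), ("1440p", 3.69), ("4K", 8.29)):
    gpu_ms = gpu_ms_per_mpixel * mpixels
    limiter = "CPU" if cpu_ms >= gpu_ms else "GPU"
    print(f"{name}: {fps(cpu_ms, gpu_ms):6.1f} fps, {limiter} limited")

# At 1080p the GPU is done in ~4 ms but the CPU still needs 5 ms, so a faster GPU
# buys nothing; at 4K the GPU term dominates and the big card finally pays off.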

These overclocked versions like the Strix are awesome, but unless you are at 4K+, the 'value' (and I use that term loosely, like hot-dog-down-a-hallway loosely) of this card for gaming is garbage. Similar to the Titan it 'replaces'. You bought the Titan because it was a bridge between gaming and prosumer, offering perks for the latter. This is the same thing going on here, sans the Titan name (so far). Everything about this card screams Titan... except for the GeForce naming... thanks, Nvidia. :p

Average FPS at 1080p with this card is about 200 fps... so yeah, we are not even close to touching that newly pushed 360 Hz standard, and not even remotely close to your made-up 600 fps...
CPU limited at such a low resolution, man. It may be the game devs, or it may be that current CPUs are too slow for this beast of a GPU at 1080p. Look at how little the card gains over a 3080 at 1080p compared to 4K.

Only 5 years ago, flagship graphics cards were only 500 dollars; today you need 2000 dollars. What a rip-off.
The flagship is the 3080. This is a Titan replacement.

Also, the first Titan was released in 2013... IIRC, it was $1,000. So, seven years later, it costs 50% more (referring to the FE). And the 780 Ti was $700... the 780 was $500. ;)
 
Only 5 years ago, flagship graphics cards were only 500 dollars; today you need 2000 dollars. What a rip-off.
 
The BOM has steadily increased too.
According to one guy on here, these coolers cost £20 (can't recall who, luckily for him, or his name would be here). The silicon cost has gone up, so has Nvidia's die size, and so has the cost of all the other parts, which makes the 3080 and 3070 good buys. Personally, I think the markup on these 3090s is a bit much; they're worth more than a 3080, but not this much more.

@EarthDog, I think they (Nvidia) gimped the pro use of the 3090 specifically because pros were just buying Titans; their 6000 Quadros are due and I think they want their money for them.
 
I get you... I don't like blanket statements either, but when everything is covered except for the toes.... :D

Think of it this way too... you're going to be heavily CPU limited at low res trying to feed in all of those frames. In TPU reviews we see much larger, more significant gains at the higher resolutions. To use this on anything less than 4K does it an injustice, IMO. You're so CPU limited, especially at 1080p, that the bottleneck shifts significantly to the CPU with this monster (and the 3080).

Watch the FPS go up with CPU clock increases at that low of a res. Look how poorly AMD does with its lack of clock speed. I'd start off at a static 4.5, then go 4.7, 4.9, 5.1, 5.3, etc... hell, I'd love to see some single-stage results at 5.5+ just to see if it keeps going, so we know how high it scales. The lows aren't due to the GPU at that point...

These overclocked versions like the Strix are awesome, but unless you are at 4K+, the 'value' (and I use that term loosely, like hot-dog-down-a-hallway loosely) of this card for gaming is garbage. Similar to the Titan it 'replaces'.
Yet the 3090 is geared primarily towards the "toes" or 1%.

Here are my PC's main specs and real-world results:
8700k 5.03GHz, 0 AVX offset
2080 Ti - Asus Strix OC - Constant 1890core, 1900mem
32GB DDR4 3216, 13-14-14-28-350
240Hz - PG258Q

When the fps drops to ~125 fps in heavy scenes in Apex Legends at 1680x1050 (even lower than 1080p), GPU load reads 80%+ in GPU-Z. As most of us should know, 80% GPU load does not mean it isn't occasionally hitting full or near-full capacity. It's only a rough guideline, but it's quite high.
When the FPS jumps to around 187 fps in typical scenes, GPU load is around 50% (lower).
CPU usage stays relatively constant, with total/average CPU usage at ~25% and the highest total usage on any physical core at ~40%.

Clearly the CPU is not the bottleneck in this case, whereas the GPU is.

Of course we'll see better gains for high-end GPUs at higher resolutions on average, because the GPU will be the bottleneck more often for obvious reasons, but that does not mean you can't be GPU bound at lower resolutions, as scenes vary drastically. It simply depends on the complexity/load of the scene vs. the capacity of the graphics card.

Again, you keep arguing value while I am not. I'm pointing out the performance differences for those that want it and are willing to pay for it.

I've ordered one and hopefully I'll receive it before the Cyberpunk 2077 release (I am from Italy).
What can I say, it will probably end up being the fastest 3090, together with the Aorus Xtreme and the Zotac AMP Extreme. The 3080 is for sure better in $/perf, but you're paying for the fact that this card is the fastest GPU on Earth.
Lucky bastard. I'm assuming you're saying you were able to order the Asus 3090 Strix O24G.

Where did you find it?

I've only seen it listed as unavailable on BestBuy and Newegg.
 
Shame on NV for marketing it as an 8K gaming GPU. From that perspective, it is a pathetically performing product. How can it even be seen as a positive that it is "19% faster" than the 3080? And that's only against the reference 3080; when you compare an AIB 3090 to an AIB 3080, the difference is more like 15% or below, and comparing the same AIB models, the difference is closer to 10%. This is a shame. A big shame. As for efficiency, so far both the 3080 and the 3090 sit in a position not much better than the 2080 and 2080 Ti. The only positive thing so far in the 3000 series is the performance uplift of the 3080, but it's still short of the 1080's performance gain, not to speak of having only a third of its efficiency gain. And from what we have seen of the 3070 so far (Galax slides showing it well below the 2080 Ti, while NV's slides claimed it was faster than the 2080 Ti), it doesn't look like it will be a bombshell either. From NV, my last hope is the 3060.
 
So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance. AMD fans never fail to disappoint with the utmost disrespect towards intelligence and logic.

And according to you games under RDNA 2.0 will magically scale better. LMAO.

sooo yeah.

Here's some harsh reality for you:

Tell me exactly how NVIDIA is supposed to fix this suckfest.

I literally did not mention a thing about AMD... the fact that you felt the need to bring that company up says a lot about your mindset in this little discussion, I'm afraid.

Let's also not forget that it's the Nvidia users who did not jump from the 10 series to the 20 series, and that the presentation was very much about convincing them that "it's time to upgrade"...

And are you really saying that the stack of games TPU uses for their reviews are bad choices? That they somehow don't represent the reality out there? Honestly, I'm kind of taken aback by your viewpoint here.

And on intelligence and logic: as I already said, man, 50% more performance for 50% more power consumption is not some impressive leap in tech. 50% more performance while using no more power would be; 50% more performance for 20% less power, now that would be something to cheer for.
 
80% GPU load is a problem. They should be at 99% when they aren't held back. ;)
That's a common misconception. GPU load or utilization is only a certain kind of metric over a sample period of time. Specifically:

utilization.gpu: Percent of time over the past sample period during which one or more kernels was executing on the GPU.
The sample period may be between 1 second and 1/6 second depending on the product.

It does not indicate exactly the "capacity" used. There are many other parts of the entire graphics system/pipeline that can be a bottleneck yet result in <99% "utilization".
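If anyone wants to log that counter themselves, here's a minimal sketch, assuming nvidia-smi is installed and on the PATH (it uses nvidia-smi's standard utilization.gpu and power.draw query fields):

Code:
# Polls the same kind of "busy" counter GPU-Z and nvidia-smi expose: percent of
# time the GPU had any kernel running during the last sample window, not percent
# of the chip's total capacity.
import subprocess
import time

def gpu_sample():
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    util, power = (field.strip() for field in out.splitlines()[0].split(","))
    return int(util), float(power)

for _ in range(5):
    util_pct, power_w = gpu_sample()
    print(f"utilization.gpu = {util_pct}%  power.draw = {power_w} W")
    time.sleep(1)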
 
I like that Nvidia built this BFGPU and put it out there. It just showcases their top tier product and the potential of their technology.

Kind of like cars, M3 vs. 3-series. You pay for those infinitesimal improvements.
You are funny, really. :)

This card (just like I predicted) is a lot more power efficient than the RTX 3080:

[chart: performance per watt, 3840x2160]


So your criticism is not entirely sincere, as the RTX 3080 must be an even worse card in your opinion.
LOL, a lot more power efficient than the 3080? For one, this chart compares an AIB 3090 to a reference 3080; if you take the ASUS TUF 3080, for example, it's 2% better in perf/W than the reference 3080. And if you check back-to-back generations, the 2080 Ti was 22% more efficient than the 1080 Ti (which is not a big leap at all), while the 3080 Ti is also only around that 22% uplift from the 2080 Ti (assuming a 96-97% reference in the chart). When you check the 980 Ti to 1080 Ti switch, there is a whopping 65% efficiency increase, three times more than the 1080 Ti-to-2080 Ti or the 2080 Ti-to-3080 Ti switch.
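To spell the percentage math out: the gain between any two cards on an indexed perf/W chart is just the ratio of their index values. The readings below are placeholders for illustration, not TPU's exact chart numbers:

Code:
# Relative efficiency gain from indexed perf/W chart values.
# Index readings below are placeholders, not the reviews' exact figures.
def gain(new_index, old_index):
    return new_index / old_index - 1.0

# e.g. reference 3080 at ~96.5% vs 3090 Strix at 103.5% on the same chart:
print(f"3080 -> 3090 Strix: {gain(103.5, 96.5):+.1%}")   # roughly +7%

# Same arithmetic generation to generation (hypothetical index values):
print(f"1080 Ti -> 2080 Ti: {gain(122.0, 100.0):+.1%}")  # ~+22%
print(f"980 Ti  -> 1080 Ti: {gain(165.0, 100.0):+.1%}")  # ~+65%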

However, if you check the results of the other three 3090s, all are much worse in terms of efficiency than the ASUS Strix, as they all consume more power and perform worse. Something about that bothers me.
[charts: performance per watt, 3840x2160, from the other RTX 3090 reviews]

If you check the power consumption charts, the ASUS card consumes 20 W less in games while performing 9% better, yet in Furmark the ASUS card consumes 50 W more. For me, this points to some kind of measurement anomaly in the gaming section.

So, NVIDIA is to be blamed for extremely poorly optimized games which are often limited by the CPU performance.

Please don't cry. :(
 
> no-holes-barred

no holds barred
 