
NVIDIA GeForce RTX 3090 Looks Huge When Installed

Joined
Dec 18, 2018
Messages
24 (0.01/day)
System Name godzilla
Processor Intel Core i7-920
Cooling Air
Memory 12GB
Video Card(s) Nvidia Geforce 970
Storage Samsung 970 EVO
Display(s) LG OLED55C9
Audio Device(s) Sennheiser HD 800 S
Mouse Logitech G Pro Wireless
Keyboard Logitech G19
They did not even try to make an unobtrusive cable adapter.
It's not a Titan; AIBs have got you covered with their 3-fan RGB clown editions xD
 
Joined
May 31, 2016
Messages
4,325 (1.50/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan Xp and Titan X crap NV pulled off in the 1080 Ti era).
 

bug

Joined
May 22, 2015
Messages
13,226 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan Xp and Titan X crap NV pulled off in the 1080 Ti era).
Well, it was confusing with Turing, too, with the 2080 Ti being so similar to the Titan (save for the price, but even then, they were both in "crazy" territory).

Neither card serves any purpose, except letting Nvidia tell AMD: see? whatever you can do, we can build two more tiers on top. Halo products through and through.
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing, and second, the Titans have always had the full chip enabled and this one doesn't (well, most of them; there was that Titan Xp and Titan X crap NV pulled off in the 1080 Ti era).
We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may yet be a Titan, but NV literally talked about Titan class and in the same breath he took the 3090 out of his fancy ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why would it, considering what the 3090 is) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 a flagship? What is the 3090 then?
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I wasn't talking to OP specifically, as I don't know OP's situation.
How's posting your opinion any different than me posting mine? :)

If someone makes minimum wage and buys a 3090, I just hope they're not going into debt to do so. It's just so not worth it after experiencing the pain of paying down debts myself.
The fact that you became so defensive about this, though, shows that it must hit home. I was raised in a blue-collar family and took on debt to get my MSc in ECE. I understand, from a first-hand perspective, what it's like... And I also understand the freedom of being disciplined enough to live well below your means in order to live a better life tomorrow. I didn't mean to get into a big philosophical or financial discussion. I've just seen so many posts saying "can't wait to pick up a 3090", and I'm just surprised is all. It's not really targeted at gamers, and it seems like most enthusiasts are still on 1440p or 1440p ultrawide anyway, so why buy all that VRAM when there's likely a 3080 (Ti?) 20GB coming next year? Sorry to derail the convo!

Back to tech -- I think the 3090 is an overpriced 3080 Ti, plus I personally wouldn't use all that VRAM. I'm looking forward to tinkering with the 3080, tho! :)

It's a tech forum.

Tech: you get enthusiasts here, so a much higher percentage wanting the biggest hardware they can find. Useful? Dude, it's big. It's the same reason people pimp cars, ride motorcycles etc. - for fun.
Forum: you get 90% lies and BS fed to you. The better half saying they'll buy every next product is just that.
We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may yet be a Titan, but NV literally talked about Titan class and in the same breath he took the 3090 out of his fancy ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why would it, considering what the 3090 is) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 a flagship? What is the 3090 then?

Simple. A graphics card

/thread ;)
 
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
There should be a wall of shame for all the plebs who bought a 2080ti and for the ones considering a 3090. Bonkers monkey regardless of how flush you are.

Last time I checked, people are allowed to spend their hard-earned money the way they want. Jealousy makes you nasty.

We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may yet be a Titan, but NV literally talked about Titan class and in the same breath he took the 3090 out of his fancy ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me - why would it, considering what the 3090 is) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 a flagship? What is the 3090 then?

Why does it matter what it's called, FFS?
 
Joined
Apr 16, 2019
Messages
632 (0.34/day)
I actually agree with most of what's been said in that video and I remember well when many of those things happened, and I still don't like it one bit.

But in this case, I think there's no need to call out a conspiracy yet; wait for the Gamers Nexus review of how that design affects thermals in different case setups. I'm pretty sure for many cases the outcome will be positive (improving the airflow in the case, although outputting more heat).
If anyone should be displeased, it would have to be Intel owners, because let's face it - that's what 95%+ of buyers of this bad boy are going to pair it with; the same as with the 2080 Ti, but obviously even more so due to the much increased performance - you don't want a (notable in many cases, massive in some) CPU bottleneck in every other game...

 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
If anyone should be displeased, it would have to be Intel owners, because let's face it - that's what 95%+ of buyers of this bad boy are going to pair it with; the same as with the 2080 Ti, but obviously even more so due to the much increased performance - you don't want a (notable in many cases, massive in some) CPU bottleneck in every other game...

You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p and 4K are where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.
 
Joined
Apr 16, 2019
Messages
632 (0.34/day)
Joke's on you, bud - the first of the above graphs is 1440p... and where there are already notable differences at that resolution with a top Turing card, there are now going to be differences even at 4K with the Ampere champion. The only case where those wouldn't matter much is if you have a 4K 60 Hz display AND don't intend to upgrade to a higher refresh rate anytime soon, but I suspect most future 3090 owners will...
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Joke's on you, bud - the first of the above graphs is 1440p... and where there are already notable differences at that resolution with a top Turing card, there are now going to be differences even at 4K with the Ampere champion. The only case where those wouldn't matter much is if you have a 4K 60 Hz display AND don't intend to upgrade to a higher refresh rate anytime soon, but I suspect most future 3090 owners will...
Yes, the niche will indeed have a niche; they are no more than a tiny slice of the 1% of enthusiasts.

4K120 is appealing, tbf, but expensive - do you believe lots of people have a few thousand to spend on a GPU and monitor?

Hardware Unboxed tested the Radeon VII, 1080 Ti and 2080 Ti with present drivers recently; go check it out. Yes, the 2080 Ti wins all, but there are instances where it gets beaten by the 1080 Ti.

It simply wasn't as good as some hoped, and certainly wasn't anything like the value some wanted.

We all want our games to run and look as good as possible, but that is measured against cost for nearly everyone; the next guy saying £1,500 is nothing should note that 98% of readers of that opinion disagree.
 

Toxicscream

New Member
Joined
Sep 10, 2020
Messages
1 (0.00/day)
Location
Texas
I will put it inside a Corsair 1000D and it will look normal size.
 
Joined
Nov 13, 2007
Messages
10,233 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
If the fan is meant to pull air through that card, then they mounted the blades backwards. That is a fan oriented to push air through... if it spins the other way to pull air, then it's definitely mounted backwards.
 
Joined
Oct 22, 2014
Messages
13,210 (3.80/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
If anyone should be displeased, it would have to be Intel owners, because let's face it - that's what 95%+ of buyers of this bad boy are going to pair it with...
I'm calling Troll post.
 
Joined
Apr 16, 2019
Messages
632 (0.34/day)
To give you just one example (that you can quickly start checking out) - out of the several dozen users on this forum who have a 2080 Ti listed as their GPU (or one of their GPUs), I think there is only one who has it paired with a Ryzen of any kind. Now, the 3090 will be at least around 50% faster, and trust me, people who dish out $1.5-1.8k for a top-of-the-line GPU don't want to see it being bottlenecked by an inferior CPU, especially not one that is just a couple of bucks cheaper than the competing, faster option.
 
Joined
Mar 18, 2015
Messages
2,960 (0.89/day)
Location
Long Island
You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p and 4K are where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in going 4K until it can do what 1440p does with ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%
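For anyone who wants to sanity-check those numbers, here's a minimal Python sketch that just tallies the shares listed above (the bucket grouping is my own; the percentages are copied straight from the list):

```python
# Tally of the Steam survey shares listed above (percent of users).
shares = {
    "1024 x 768": 0.35, "1280 x 800": 0.55, "1280 x 1024": 1.12,
    "1280 x 720": 0.35, "1360 x 768": 1.51, "1366 x 768": 9.53,
    "1440 x 900": 3.05, "1600 x 900": 2.46, "1680 x 1050": 1.83,
    "1920 x 1080": 65.55, "1920 x 1200": 0.77, "2560 x 1440": 6.59,
    "2560 x 1080": 1.14, "3440 x 1440": 0.90, "3840 x 2160": 2.24,
    "Other": 2.07,
}

at_4k = shares["3840 x 2160"]                               # 2.24%
at_1440p = shares["2560 x 1440"] + shares["3440 x 1440"]    # 7.49% (6.59% at 16:9)
at_1080p = shares["1920 x 1080"]                            # 65.55% -- roughly 2 out of 3

print(f"4K: {at_4k:.2f}%, 1440p-class: {at_1440p:.2f}%, 1080p: {at_1080p:.2f}%")
```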

The thing to consider, though, is the same thing we have always had to consider... relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:


A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)
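Those percentages are just the ratios of the fps figures quoted above; a minimal Python sketch of the arithmetic, using the numbers exactly as given:

```python
# The quoted MS Flight Simulator numbers (1440p Ultra, both CPUs with a 2080 Ti).
def pct_faster(a: float, b: float) -> float:
    """How much faster a is than b, in percent."""
    return (a / b - 1) * 100

avg_9900k, avg_3900x = 49.1, 43.9
p99_9900k, p99_3900x = 41.0, 34.5

print(f"average fps: {pct_faster(avg_9900k, avg_3900x):.0f}% faster")        # ~12%
print(f"99th percentile: {pct_faster(p99_9900k, p99_3900x):.0f}% faster")    # ~19%
```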

The point here being that, while average fps deserves its place as the "go to" yardstick for relative performance between cards (the average-to-minimum ratio will most times scale accordingly), the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in the TPU reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.
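For anyone wondering what a 99th-percentile number actually represents: it's the fps during the slowest 1% of frames. A minimal sketch of how it could be pulled out of a frame-time log (my own illustration with made-up frame times, not TPU's or anyone else's actual methodology):

```python
# Nearest-rank 99th-percentile fps from a frame-time log.
# Frame times are in milliseconds; the sample values below are made up.
import math
import statistics

def percentile_frame_time(frame_times_ms: list[float], pct: float = 99.0) -> float:
    """Nearest-rank percentile of the frame-time distribution."""
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(len(ordered) * pct / 100.0))
    return ordered[rank - 1]

frame_times = [16.7, 17.1, 16.9, 18.2, 16.8, 33.5, 17.0, 16.6, 17.3, 16.9]

avg_fps = 1000.0 / statistics.mean(frame_times)
p99_fps = 1000.0 / percentile_frame_time(frame_times)   # fps during the slowest 1% of frames

print(f"average fps: {avg_fps:.1f}")
print(f"99th-percentile fps: {p99_fps:.1f}")
```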
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in going 4K until it can do what 1440p does with ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%

The thing to consider, though, is the same thing we have always had to consider... relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:


A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)

The point here being that, while average fps deserves its place as the "go to" yardstick for relative performance between cards (the average-to-minimum ratio will most times scale accordingly), the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in the TPU reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.
Do you have templates?
I'm not overplaying 4K, but you are underplaying it; I have enjoyed it and 1080p 144 Hz, but 1080p less so.
 
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum

bug

Joined
May 22, 2015
Messages
13,226 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Nothing is big in a Tower 900. I fear not.



Wtf is other? 640x480?
Probably custom resolutions.
But hey, I was playing Test Drive in CGA at 320x200 ;)
 

aQi

Joined
Jan 23, 2016
Messages
645 (0.21/day)
Feeling nostalgia...

Courtesy of Nvidia when they released the 8800 Ultra
 
Joined
Sep 17, 2014
Messages
20,944 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in going 4K until it can do what 1440p does with ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%

The thing to consider, though, is the same thing we have always had to consider... relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that that in most cases it didn't matter and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:


A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)

The point here being that, while average fps deserves its place as the "go to" yardstick for relative performance between cards (the average-to-minimum ratio will most times scale accordingly), the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in the TPU reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.

I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, we still have roughly 50% more people on 1366x768 than we have on 1440p.

CPU matters. There is no question, and it's apparently still the last bastion for Intel if you want high refresh rates. I mean, we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next "1080p mainstream station"... You can safely ignore 4K for gaming even with a 3090 out, and the tiny subset that uses the resolution can still drop to something lower (monitor sales won't tell the whole story), but it never happens the other way around (1080p owners won't DSR 4K - what's the point?). As GPUs get faster, the importance of the fastest possible CPU increases again. AMD is going to have to keep pushing hard on that to keep pace, and it's still a meaningful difference compared with Intel in many places.

Will the consoles drive that much-desired push to 4K content? I strongly doubt it. It's a picture on a TV in the end, no different from 720, 1080 or 1440p. Sit back a bit and you won't even notice the difference.

1080p is as relevant as it's ever been and will remain so for the foreseeable future. Resolution upgrades past this point ALWAYS involve a monitor diagonal upgrade to go with them. A large number of gamers just don't have the desire, the space or the need for anything bigger. And the advantages of sticking to this 'low' res are clear: superb performance at the highest possible detail levels, with relatively cheap GPUs, and an easy path to a fixed 120 fps, which is absolutely glorious. I'll take a fast monitor over a slow-as-molasses TV with shit color accuracy any day of the week, regardless of diagonals. Once OLED can be transplanted to a monitor with good endurance, I'll start thinking differently. Until then, we're being sold ancient crap with lots of marketing sauce, mostly, and it's all a choice of evils.

You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p and 4K are where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.
Yes, the niche will indeed have a niche; they are no more than a tiny slice of the 1% of enthusiasts.

4K120 is appealing, tbf, but expensive - do you believe lots of people have a few thousand to spend on a GPU and monitor?

Hardware Unboxed tested the Radeon VII, 1080 Ti and 2080 Ti with present drivers recently; go check it out. Yes, the 2080 Ti wins all, but there are instances where it gets beaten by the 1080 Ti.

It simply wasn't as good as some hoped, and certainly wasn't anything like the value some wanted.

We all want our games to run and look as good as possible, but that is measured against cost for nearly everyone; the next guy saying £1,500 is nothing should note that 98% of readers of that opinion disagree.

Eh? I don't follow. First, 1080p wins mean nothing, and then 1440p wins on a 2080 Ti don't say anything because they only apply to a tiny niche, while 4K120 is too expensive? Those 1440p wins aren't any different at a lower res either when it comes to the limitations of the CPU. The gap will remain, even with much weaker GPUs. But even more importantly, that 2080 Ti performance will soon be available to midrange buyers at 500 bucks.

Are you being honest with yourself here?
 
Joined
Dec 30, 2010
Messages
2,099 (0.43/day)
I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1200 for it, Nvidia had the clever idea to call it a Titan replacement and charge $300 more, knowing that people who were willing to spend $1200 on the previous top card will spend an extra $300 on this one.

So the RTX 70 segment costs the same $499 again. The RTX 80 segment costs $699 again. But the top segment is now +$300, with performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.

It's called buying the full / binned chip. They can ask a premium for it since there's no competition yet.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, we still have roughly 50% more people on 1366x768 than we have on 1440p.

CPU matters. There is no question, and it's apparently still the last bastion for Intel if you want high refresh rates. I mean, we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next "1080p mainstream station"... You can safely ignore 4K for gaming even with a 3090 out, and the tiny subset that uses the resolution can still drop to something lower (monitor sales won't tell the whole story), but it never happens the other way around (1080p owners won't DSR 4K - what's the point?). As GPUs get faster, the importance of the fastest possible CPU increases again. AMD is going to have to keep pushing hard on that to keep pace, and it's still a meaningful difference compared with Intel in many places.

Will the consoles drive that much-desired push to 4K content? I strongly doubt it. It's a picture on a TV in the end, no different from 720, 1080 or 1440p. Sit back a bit and you won't even notice the difference.

1080p is as relevant as it's ever been and will remain so for the foreseeable future. Resolution upgrades past this point ALWAYS involve a monitor diagonal upgrade to go with them. A large number of gamers just don't have the desire, the space or the need for anything bigger. And the advantages of sticking to this 'low' res are clear: superb performance at the highest possible detail levels, with relatively cheap GPUs, and an easy path to a fixed 120 fps, which is absolutely glorious. I'll take a fast monitor over a slow-as-molasses TV with shit color accuracy any day of the week, regardless of diagonals. Once OLED can be transplanted to a monitor with good endurance, I'll start thinking differently. Until then, we're being sold ancient crap with lots of marketing sauce, mostly, and it's all a choice of evils.




Eh? I don't follow. First, 1080p wins mean nothing, and then 1440p wins on a 2080 Ti don't say anything because they only apply to a tiny niche, while 4K120 is too expensive? Those 1440p wins aren't any different at a lower res either when it comes to the limitations of the CPU. The gap will remain, even with much weaker GPUs. But even more importantly, that 2080 Ti performance will soon be available to midrange buyers at 500 bucks.

Are you being honest with yourself here?
It's a stretch to say the 2080 Ti was a success - that was one point.
And, for perspective, that Intel CPUs are not THAT much better for gaming than AMD was another point, made in reply to the Intel-only comment, and that's the main point I was getting at; it's hard with a tangential argument passed back and then an unquoted, sneaky reply.
 