Thursday, September 10th 2020

NVIDIA GeForce RTX 3090 Looks Huge When Installed

Here's the first picture of an NVIDIA GeForce RTX 3090 Founders Edition card installed in a tower case. The triple-slot card measures 31.3 cm in length and is 13.8 cm tall. Its design is essentially an upscaled version of the RTX 3080's. The card still pulls power from a single 12-pin power connector; an adapter that converts two 8-pin connectors to the 12-pin is included. The typical board power of the card is rated at 350 W. This particular card in the leak, posted on the ChipHell forums, is pre-production, as VideoCardz notes, given that some parts of its metal superstructure lack the chrome finish of the card NVIDIA CEO Jen-Hsun Huang unveiled on September 1. The RTX 3090 launches on September 24.
Sources: VideoCardz, ChipHell Forums

122 Comments on NVIDIA GeForce RTX 3090 Looks Huge When Installed

#101
ratirt
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing; and second, Titans have always had the full chip enabled, and this one doesn't (well, most of them; there was that Titan X/XP stuff NV pulled in the 1080 Ti era).
#102
bug
ratirt
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing; and second, Titans have always had the full chip enabled, and this one doesn't (well, most of them; there was that Titan X/XP stuff NV pulled in the 1080 Ti era).
Well, it was confusing with Turing, too, with the 2080 Ti being so similar to the Titan (save for the price, but even then, both were in "crazy" territory).

Neither card serves any purpose, except letting Nvidia tell AMD: see? whatever you can do, we can build two more tiers on top. Halo products through and through.
#103
EarthDog
ratirt
Not sure why people think this card is a Titan. It's called the 3090, not Titan, for one thing; and second, Titans have always had the full chip enabled, and this one doesn't (well, most of them; there was that Titan X/XP stuff NV pulled in the 1080 Ti era).
We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may still be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me; why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090, then?
#104
Vayra86
Another-Stin
I wasn't talking to OP specifically, as I don't know OP's situation.
How's posting your opinion any different than me posting mine? :)

If someone makes minimum wage and buys a 3090, I just hope they're not going into debt doing so. It's just so not worth it, after experiencing the pain of paying down debts myself.
The fact that you became so defensive about this, though, shows that it must hit home. I was raised in a blue-collar family and took on debt to get my MSc in ECE. I understand, from a first-hand perspective, what it's like... and I also understand the freedom of being disciplined enough to live well below your means in order to live a better life tomorrow. I didn't mean to get into a big philosophical or financial discussion. I've just seen so many posts of "can't wait to pick up a 3090", and I'm just surprised, is all. It's not really targeted at gamers, and it seems like most enthusiasts are still on 1440p or 1440p ultrawide anyway, so why buy all that VRAM when there's likely a 3080 (Ti?) 20 GB coming next year? Sorry to derail the convo!

Back to tech -- I think the 3090 is an overpriced 3080 Ti, plus I personally wouldn't use all that VRAM. I'm looking forward to tinkering with the 3080, tho! :)
It's a tech forum.

Tech: you get enthusiasts here, so a much higher percentage wanting the biggest hardware they can find. Useful? Dude, it's big. It's the reason people pimp cars, ride motorcycles, etc. for fun.
Forum: you get 90% lies and BS fed to you. The better half saying they'll buy every next product is just that.
EarthDog
We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may still be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me; why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090, then?
Simple. A graphics card.

/thread ;)
#105
Assimilator
BluesFanUK
There should be a wall of shame for all the plebs who bought a 2080 Ti, and for the ones considering a 3090. Bonkers money, regardless of how flush you are.
Last time I checked, people are allowed to spend their hard-earned money the way they want. Jealousy makes you nasty.
EarthDog
We've gone over this (in multiple threads) and why some of us (me included) feel this way. There may still be a Titan, but NV literally talked about "Titan class" and in the same breath he took the 3090 out of his fancy-ass oven. :p

Also, just before that, they state verbally AND on the slide that the 3080 is the "FLAGSHIP". Either one will come out later (doubtful to me; why, considering what the 3090 is?) or they are ditching the Titan name. Why, if the 3090 isn't a Titan replacement (as he implied in the video), is the 3080 the flagship? What is the 3090, then?
Why does it matter what it's called, FFS?
#106
EarthDog
Assimilator
Why does it matter what it's called, FFS?
You've confused my simple response to a question with someone who GAF, methinks. :p

Ask the others. ;)
#107
HenrySomeone
BoboOOZ
I actually agree with most of what's been said in that video, and I remember well when many of those things happened, and I still don't like it one bit.

But in this case, I think there's no need to call out a conspiracy yet; wait for Gamers Nexus' review of how that design affects thermals in different case setups. I'm pretty sure for many cases the outcome will be positive (improving the airflow in the case, although outputting more heat).
If anyone should be displeased, it would have to be Intel owners, because let's face it: that's what 95%+ of buyers of this bad boy are going to pair them with. The same as with the 2080 Ti, but obviously even more so due to the much higher performance; you don't want a (notable in many, massive in some cases) CPU bottleneck in every other game...

#108
theoneandonlymrk
HenrySomeone
If anyone should be displeased, it would have to be Intel owners, because let's face it: that's what 95%+ of buyers of this bad boy are going to pair them with. The same as with the 2080 Ti, but obviously even more so due to the much higher performance; you don't want a (notable in many, massive in some cases) CPU bottleneck in every other game...


You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p to 4K is where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.
#109
HenrySomeone
Joke's on you, bud: the first of the above graphs is 1440p... with a 2080 Ti. And where there are already notable differences at that resolution with a top Turing card, there are now going to be some even at 4K with the Ampere champion. The only case where those wouldn't matter much is if you have a 4K 60 Hz display AND don't intend to upgrade to a higher refresh rate anytime soon, but I suspect most future 3090 owners will...
#110
theoneandonlymrk
HenrySomeone
Joke's on you, bud: the first of the above graphs is 1440p... with a 2080 Ti. And where there are already notable differences at that resolution with a top Turing card, there are now going to be some even at 4K with the Ampere champion. The only case where those wouldn't matter much is if you have a 4K 60 Hz display AND don't intend to upgrade to a higher refresh rate anytime soon, but I suspect most future 3090 owners will...
Yes, the niche will indeed have a niche. They are no more than a tiny slice of the 1% of enthusiasts.

4K120 is appealing, tbf, but expensive. Do you believe lots of people have a few thousand to spend on a GPU and monitor?

Hardware Unboxed tested the Radeon VII, 1080 Ti and 2080 Ti with current drivers recently; go check it out. Yes, the 2080 Ti wins overall, but there are instances where it gets beaten by the 1080 Ti.

It simply wasn't as good as some hoped, and certainly wasn't anything like the value some wanted.

We all want our games to run and look as good as possible, but that is measured against cost for nearly everyone; the next guy saying £1,500 is nothing should note that 98% of readers of that opinion disagree.
#111
Toxicscream
I will put it inside a Corsair 1000D and it will look normal size.
#112
phanbuey
If the fan is meant to pull air through that card, then they mounted the blades backwards. That is a fan oriented to push air through... if it spins the other way to pull air, then it's definitely mounted backwards.
#113
Caring1
HenrySomeone
If anyone should be displeased, it would have to be Intel owners, because let's face it: that's what 95%+ of buyers of this bad boy are going to pair them with...
I'm calling Troll post.
#114
HenrySomeone
To give you just one example (that you can quickly start checking out): out of the several dozen users on this forum who have a 2080 Ti listed as their GPU (or one of their GPUs), I think there is only one who has it paired with a Ryzen of any kind. Now, the 3090 will be at least around 50% faster, and trust me, people who dish out $1.5-1.8k for a top-of-the-line GPU don't want to see it being bottlenecked by an inferior CPU, especially not one that is just a couple bucks cheaper than the competing, faster option.
#115
John Naylor
theoneandonlymrk
You realise not everyone would buy a 3090 or 80 to 1080,p game on, 1440_4k is where these cards are aimed at making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me for example ,and at 4k most CPUs are equal, ATM.
Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in doing 4K until it can match 1440p in ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K, while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%
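As a quick sanity check on the survey figures quoted above (a minimal sketch; the percentages are copied straight from the list, and the dict labels are just for readability):

```python
# Steam survey shares as quoted above; keys are resolution labels.
shares = {
    "1024x768": 0.35, "1280x800": 0.55, "1280x1024": 1.12, "1280x720": 0.35,
    "1360x768": 1.51, "1366x768": 9.53, "1440x900": 3.05, "1600x900": 2.46,
    "1680x1050": 1.83, "1920x1080": 65.55, "1920x1200": 0.77, "2560x1440": 6.59,
    "2560x1080": 1.14, "3440x1440": 0.90, "3840x2160": 2.24, "Other": 2.07,
}
total = sum(shares.values())                      # ~100.01, i.e. survey rounding noise
share_1080p = shares["1920x1080"] / total * 100   # ~65.5%, roughly 2 in 3 users
```

The 65.55% share is indeed about two out of every three users, and the 6.59% at 2560x1440 matches the "only 6.6% at 1440p" claim.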

The thing to consider tho is the same that we have always had to consider .... the relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:

tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-1920-1080.png
tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-38410-2160.png

A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)
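To make the quoted gap explicit, here's the arithmetic behind the 12% and 19% figures (a quick sketch using only the fps numbers above):

```python
# MS Flight Simulator, 1440p Ultra, 2080 Ti numbers as quoted above.
avg_9900k, avg_3900x = 49.1, 43.9     # average fps
p99_9900k, p99_3900x = 41.0, 34.5     # 99th-percentile fps

gap_avg = (avg_9900k / avg_3900x - 1) * 100   # ~11.8%, the "12%" above
gap_p99 = (p99_9900k / p99_3900x - 1) * 100   # ~18.8%, the "19%" above
```

Note the gap is larger at the 99th percentile than on the average, which is the point being made about minimums below.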

The point here being that, while average fps deserves its status as the "go-to" yardstick for relative performance between cards, since the average-to-minimum ratio will usually scale with it, the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 GB / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in its reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.
#116
theoneandonlymrk
John Naylor
Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in doing 4K until it can match 1440p in ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K, while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%

The thing to consider tho is the same that we have always had to consider .... the relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:

tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-1920-1080.png
tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-38410-2160.png

A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)

The point here being that, while average fps deserves its status as the "go-to" yardstick for relative performance between cards, since the average-to-minimum ratio will usually scale with it, the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 GB / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in its reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.
Do you have templates?
I'm not overplaying 4K!? But you are underplaying it. I have enjoyed both it and 1080p 144 Hz, but plain 1080p less so.
#117
moproblems99
Nothing is big in a Tower 900. I fear not.
John Naylor
Other 2.07%
Wtf is other? 640x480?
#118
bug
moproblems99
Nothing is big in a Tower 900. I fear not.



Wtf is other? 640x480?
Probably custom resolutions.
But hey, I was playing Test Drive in CGA at 320x200 ;)
#119
Aqeel Shahzad
Feeling nostalgia.

Courtesy of NVIDIA when they released the 8800 Ultra.
#120
Vayra86
John Naylor
Until now... frankly, 4K did not provide the user experience that 1440p did, at least to my eyes. I didn't see the logic in doing 4K until it can match 1440p in ULMB @ 120 Hz or better. I think we will see a drop in the relative market share of the xx60 versus xx70 versus xx80 as more people drop a tier because they are not gaming at 4K... only 2.24% of Steam users are at 4K, while 2 out of every 3 are still at 1080p. Only 6.6% are at 1440p:

1024 x 768 0.35%
1280 x 800 0.55%
1280 x 1024 1.12%
1280 x 720 0.35%
1360 x 768 1.51%
1366 x 768 9.53%
1440 x 900 3.05%
1600 x 900 2.46%
1680 x 1050 1.83%
1920 x 1080 65.55%
1920 x 1200 0.77%
2560 x 1440 6.59%
2560 x 1080 1.14%
3440 x 1440 0.90%
3840 x 2160 2.24%
Other 2.07%

The thing to consider tho is the same that we have always had to consider .... the relative performance depends on what you are measuring. We've always had the argument whereby both sides proved they were right simply by choosing what to test and which games to test with.

RAM Speed Doesn't Matter:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, like STRIKER and F1, RAM speed did matter.
b) Go with SLI / CF and RAM Speed mattered
c) Look at Min. fps and RAM Speed mattered

This was because the GPU was the bottleneck at average fps ... in other situations it was not.

RAM Quantity Doesn't Matter as long as ya have xx GB:

Look at average fps and you could show that in most cases it didn't matter, and that was because the GPU was the bottleneck. But look at other things such as:

a) In certain games, RAM quantity did matter.
b) Go with SLI / CF and RAM Quantity mattered.
c) Look at Min. fps and RAM Quantity mattered

Again, this was because the GPU was the bottleneck at average fps ... in other situations it was not.


At 1440p / 4K, CPU Performance Doesn't Matter:

Yes, if you are a gamer, there's hardly a reason to buy anything more than the Intel 10400 at any resolution, and as you say... the differences lessen at higher resolutions. However, that is not universal:

tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-1920-1080.png
tpucdn.com/review/intel-core-i5-10400f/images/relative-performance-games-38410-2160.png

A 9900k paired with a 2080 Ti delivered 49.1 average fps (41.0 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings
A ($500 MSRP) 3900X paired with a 2080 Ti delivered 43.9 average fps (34.5 @ 99th percentile) at 1440p in MS Flight Simulator Ultra Settings

That's a performance difference of 12% (19% @ 99th percentile)

The point here being that, while average fps deserves its status as the "go-to" yardstick for relative performance between cards, since the average-to-minimum ratio will usually scale with it, the relative performance of CPUs and RAM can be significantly impacted by other factors. This is also true with VRAM... at 1080p, you will rarely see a difference between a 3 GB / 4 GB card or a 4 GB / 8 GB card... but if you're all big on Hitman and Tomb Raider, it will be something to consider. Witcher 3 and most other games showed less than a 5% performance difference, and that can be attributed solely to the 11% shader count difference.

Would love to see TPU include 99th percentile numbers in its reviews... would also like to see overclocked performance. But while most sites that do provide this info only test a handful of games, it's asking a bit much to do this with TPU's 18-23 game test suite.
I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, we still have 50% more people on 1366x768 than we have on 1440p.

The CPU matters. There is no question, and it's apparently still the last bastion for Intel if you want high refresh rates. I mean, we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next '1080p mainstream station'... You can safely ignore 4K for gaming even with a 3090 out, and the tiny subset that uses the resolution can still drop to something lower (monitor sales won't tell the whole story); but it never happens the other way around (1080p owners won't DSR 4K; what's the point?). As GPUs get faster, the importance of the fastest possible CPU increases again. AMD is going to have to keep pushing hard on that to keep pace, and there's still a meaningful difference in many places with Intel.

Will the consoles drive that much-desired push to 4K content? I strongly doubt it. It's a picture on a TV in the end, not any different from 720p, 1080p or 1440p. Sit back a bit and you won't even notice the difference.

1080p is as relevant as it's ever been and will remain so for the foreseeable future. Resolution upgrades past this point ALWAYS involve a monitor diagonal upgrade to go with them. A large number of gamers just don't have the desire, the space or the need for anything bigger. And the advantages of sticking to this 'low' res are clear: superb performance at the highest possible detail levels with relatively cheap GPUs, and an easy path to a fixed 120 fps, which is absolutely glorious. I'll take a fast monitor over a slow-as-molasses TV with shit color accuracy any day of the week, regardless of diagonals. Once OLED can be transplanted to a monitor with good endurance, I'll start thinking differently. Until then, we're mostly being sold ancient crap with lots of marketing sauce, and it's all a choice of evils.
theoneandonlymrk
You realise not everyone would buy a 3090 or 3080 to game at 1080p; 1440p to 4K is where these cards are aimed, making the CPU disparity negligible at best.
1080p wins mean absolutely nothing to me, for example, and at 4K most CPUs are equal, ATM.

And why would any of this anger Intel owners? Probably 99% couldn't give a rat's ass; in reality, the 1% of enthusiasts are not the norm.
theoneandonlymrk
Yes, the niche will indeed have a niche. They are no more than a tiny slice of the 1% of enthusiasts.

4K120 is appealing, tbf, but expensive. Do you believe lots of people have a few thousand to spend on a GPU and monitor?

Hardware Unboxed tested the Radeon VII, 1080 Ti and 2080 Ti with current drivers recently; go check it out. Yes, the 2080 Ti wins overall, but there are instances where it gets beaten by the 1080 Ti.

It simply wasn't as good as some hoped, and certainly wasn't anything like the value some wanted.

We all want our games to run and look as good as possible, but that is measured against cost for nearly everyone; the next guy saying £1,500 is nothing should note that 98% of readers of that opinion disagree.
Eh? I don't follow. First, 1080p wins mean nothing, and then 1440p wins on a 2080 Ti don't say anything because they only apply to a tiny niche, while 4K120 is too expensive? Those 1440p wins aren't any different at a lower res either when it comes to the limitations of the CPU. The gap will remain, even with much weaker GPUs. But even more importantly, that 2080 Ti performance will soon be available to midrange buyers at 500 bucks.

Are you being honest with yourself here?
#121
Jism
DuxCro
I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1,200 for it, Nvidia had the clever idea to call it a Titan replacement and charge $300 more, knowing that people who were willing to spend $1,200 on the previous top card will spend an extra $300 on this one.

So the RTX 70 segment costs the same $499 again. The RTX 80 segment costs $699 again. But the top segment is now +$300, with performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.
It's called buying the full/binned chip. They can ask a premium for it since there's no competition yet.
#122
theoneandonlymrk
Vayra86
I think what's most telling is that despite laptops dying left and right (3-5 years is being generous for midrangers and below) and 720p being very much yesterday's mainstream, we still have 50% more people on 1366x768 than we have on 1440p.

The CPU matters. There is no question, and it's apparently still the last bastion for Intel if you want high refresh rates. I mean, we can walk past those 1440p benches as if nothing happened, but realistically... 4K is nothing and 1440p is for gaming, the next '1080p mainstream station'... You can safely ignore 4K for gaming even with a 3090 out, and the tiny subset that uses the resolution can still drop to something lower (monitor sales won't tell the whole story); but it never happens the other way around (1080p owners won't DSR 4K; what's the point?). As GPUs get faster, the importance of the fastest possible CPU increases again. AMD is going to have to keep pushing hard on that to keep pace, and there's still a meaningful difference in many places with Intel.

Will the consoles drive that much-desired push to 4K content? I strongly doubt it. It's a picture on a TV in the end, not any different from 720p, 1080p or 1440p. Sit back a bit and you won't even notice the difference.

1080p is as relevant as it's ever been and will remain so for the foreseeable future. Resolution upgrades past this point ALWAYS involve a monitor diagonal upgrade to go with them. A large number of gamers just don't have the desire, the space or the need for anything bigger. And the advantages of sticking to this 'low' res are clear: superb performance at the highest possible detail levels with relatively cheap GPUs, and an easy path to a fixed 120 fps, which is absolutely glorious. I'll take a fast monitor over a slow-as-molasses TV with shit color accuracy any day of the week, regardless of diagonals. Once OLED can be transplanted to a monitor with good endurance, I'll start thinking differently. Until then, we're mostly being sold ancient crap with lots of marketing sauce, and it's all a choice of evils.




Eh? I don't follow. First, 1080p wins mean nothing, and then 1440p wins on a 2080 Ti don't say anything because they only apply to a tiny niche, while 4K120 is too expensive? Those 1440p wins aren't any different at a lower res either when it comes to the limitations of the CPU. The gap will remain, even with much weaker GPUs. But even more importantly, that 2080 Ti performance will soon be available to midrange buyers at 500 bucks.

Are you being honest with yourself here?
One point was that it's a stretch to say the 2080 Ti was a success. Another, made in reply to the only-Intel man, was that, in perspective, Intel CPUs are not that much better for gaming than AMD's, and that's the main point I was getting at. It's hard with a tangential argument passed back and then an unquoted, sneaky reply.