Friday, August 28th 2020

NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Just ahead of the September launch, specifications of NVIDIA's upcoming RTX Ampere lineup have been leaked by industry sources over at VideoCardz. According to the website, three alleged GeForce SKUs are launching in September: the RTX 3090, RTX 3080, and RTX 3070. The new lineup features major improvements, including 2nd-generation ray-tracing cores and 3rd-generation Tensor cores designed for AI and ML workloads. When it comes to connectivity and I/O, the new cards use the PCIe 4.0 interface and support the latest display outputs, including HDMI 2.1 and DisplayPort 1.4a.

The GeForce RTX 3090 comes with 24 GB of GDDR6X memory running on a 384-bit bus at 19.5 Gbps, for a memory bandwidth of 936 GB/s. The card features the GA102-300 GPU with 5,248 CUDA cores running at 1695 MHz, and is rated for 350 W TGP (board power). While the Founders Edition cards will use NVIDIA's new 12-pin power connector, non-Founders Edition cards from board partners like ASUS, MSI, and Gigabyte will be powered by two 8-pin connectors. Next up are the specs for the GeForce RTX 3080, a GA102-200-based card that has 4,352 CUDA cores running at 1710 MHz, paired with 10 GB of GDDR6X memory running at 19 Gbps. The memory is connected over a 320-bit bus that achieves 760 GB/s of bandwidth. The board is rated at 320 W, and the card is designed to be powered by dual 8-pin connectors. And finally, there is the GeForce RTX 3070, which is built around the GA104-300 GPU with an as-yet-unknown number of CUDA cores. We only know that it uses the older non-X GDDR6 memory, running at 16 Gbps on a 256-bit bus. The GPUs are supposedly manufactured on TSMC's 7 nm process, possibly the EUV variant.
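The bandwidth figures quoted above are easy to sanity-check: peak GDDR bandwidth is simply the bus width in bits, divided by 8 to get bytes, multiplied by the per-pin data rate. Below is a minimal Python sketch of that arithmetic, using only the leaked (and therefore unconfirmed) numbers from this article.

# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Leaked figures reported above (unconfirmed at the time of writing)
leaked = {
    "RTX 3090": (384, 19.5),  # 384-bit GDDR6X at 19.5 Gbps
    "RTX 3080": (320, 19.0),  # 320-bit GDDR6X at 19 Gbps
    "RTX 3070": (256, 16.0),  # 256-bit GDDR6 at 16 Gbps
}
for card, (bus, rate) in leaked.items():
    print(f"{card}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
# Prints 936 GB/s for the RTX 3090 and 760 GB/s for the RTX 3080, matching the leak;
# the 512 GB/s it prints for the RTX 3070 is our own derivation from the leaked numbers.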
Source: VideoCardz

216 Comments on NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

#176
John Naylor
ZoneDymo
No, but you do have to answer why that gap is so insanely huge: more than twice the RAM? Borderline 2.5x? That is just insane.
And again, look at the midrange of old: the RX 480 had 8 GB of RAM and the GTX 1060 had 6 GB ... to have an RTX 3080 now with 10 GB is just pathetic IMO, with an eye on progression and placement.
Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU w/ different RAM sizes, in almost every game there was no observable impact on performance.

alienbabeltech.com/main/gtx-770-4gb-vs-2gb-tested/3/
2 GB vs 4 GB GTX 770 ... when everyone was saying 2 GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.

Same here .... www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims w/ bad console ports.

And let's remember ... the 3 GB version of the 1060 did just fine. They were not the same GPU; the 3 GB version had 10% fewer shaders, which gave the 6 GB card a VRAM-independent speed advantage. The extra shaders gave the 6 GB version a 6% speed advantage over the 3 GB ... So going to 1440p, if there was even a hint of impact due to VRAM, that 6% gap should be much bigger ... it wasn't ... we only saw a difference at 4K.

Based upon actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12 GB, so I was surprised at the odd 11 number ... for the 3080, I thought they'd do 12 ... so 10 tells me that Nvidia must know more than we do. No sense putting it in if it's not used ... no different than putting an 8+6 power connector setup on a 225 watt card. Just because the connectors and cable can pull 225 watts (+75 from the slot) doesn't mean it will ever happen.
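For reference, the connector math here follows from the standard PCIe power ratings: 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin. A quick sketch of those budgets against the wattages discussed in this thread (the TGP values are just the leaked/rumored figures, not confirmed specs):

# Standard PCIe power ratings: slot 75 W, 6-pin 75 W, 8-pin 150 W
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def power_budget(*connectors: str) -> int:
    # Total rated power: the slot plus every listed auxiliary connector
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(power_budget("8-pin", "6-pin"))  # 300 W budget, vs. the 225 W card in the example above
print(power_budget("8-pin", "8-pin"))  # 375 W budget, vs. the leaked 320-350 W TGPs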

Nvidia’s Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.” The card manufacturers gave us more RAM because customers would buy it. But for a 1060 ... the test results showed we don't need more than 3 GB at 1080p; the 6 GB version didn't add anything to the mix other than more shaders.
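Bell's "requested" vs. "used" distinction is also what the common monitoring tools actually expose. A minimal sketch, assuming the pynvml Python bindings and an NVIDIA driver are installed: the counter it reads is the driver's allocation figure (the same number nvidia-smi reports), not a measurement of what a game is actively touching.

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total / free / used, in bytes
# "used" here is VRAM the driver has allocated, not the working set of any one application
print(f"allocated: {mem.used / 1024**2:.0f} MiB of {mem.total / 1024**2:.0f} MiB")
pynvml.nvmlShutdown()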

So now for the "why 10?" question.

When they did the 1060 3 GB, why did they disable 10% of the shaders? It didn't save any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise ... the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. Same with the 970's 3.5 GB ... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there was a 3080 and a 3090 coming, I expected 12 and 16 GB. Now I can't help but wonder ... is 12 GB the sweet spot for 4K, and is the 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?
Posted on Reply
#177
RandallFlagg
I like what you're saying on the VRAM thing; based on my experience it's irrelevant in *most* games.

One exception though: MMOs. Specifically, if you've ever played an MMO and been in an area where there are 200+ other players, each with many different textures for their gear, VRAM definitely comes into play. Smaller-VRAM cards simply can't hold all those textures, at least not at any quality. What you wind up with is half the people looking normal and the other half looking like they're in their underwear.

Anyway, that is kind of an edge case. Even in those settings, if it's truly bothersome, one can typically reduce texture quality and the problem is solved.
Posted on Reply
#178
Vayra86
John Naylor
Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. ... In other words, they needed to kill 10% of the shaders because otherwise ... the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. ... Is 12 GB the sweet spot for 4K, and is the 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?
You might be on the money there, but even so, the shader deficit is a LOT more than 10% on the 3080. It's about 1,000 shaders fewer?

And to be fair, I'm not silly enough to think that this move will somehow push people who are doubting a 3080 toward a 3090 that is a whole lot more expensive. They will probably look at their 2080 Ti and think, meh... I'm gonna sit on this 11 GB for a while - heck, it even does RT already. Remember that the 2080 Ti was already a stretch in terms of price. Will they nail that again? Doubtful, especially when the gimping of the lower tier is so obvious. It's not something one might be keen to reward. More likely, I think, is that the 3080 is the step up aimed at 2080/2080S and 2070S buyers. And they gain 2 GB compared to what they had.

I think a bigger part of the 10 GB truth is that they had some sort of issue getting more on it: either a cost issue, or something yield/die related. That is why I'm so interested in how they cut this one up and how the VRAM is wired.
Posted on Reply
#179
reflex75
It's a shame to see only 10 GB on a new high-end GPU, while new consoles will have more (16 GB) and be cheaper!
Posted on Reply
#180
xman2007
RoutedScripter
I come here as someone who does not like spoilers ... just days away from the reveal, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks ... at the very least the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and adopt some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments; I did not read any posts in this thread either.
Just read the title and avoid the thread. What's with the drama, Britney?
Posted on Reply
#181
lexluthermiester
rtwjunkie
From what I've seen, adapters will be included.
If adapters are included, why make the new connector in the first place? It makes no sense at all.
reflex75
It's a shame to see only 10 GB on a new high-end GPU, while new consoles will have more (16 GB) and be cheaper!
Consoles with 16 GB (none of which are out yet, BTW) have to share that 16 GB system-wide, whereas 10 GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
Posted on Reply
#182
rtwjunkie
PC Gaming Enthusiast
lexluthermiester
If adapters are included, why make the new connector in the first place? It makes no sense at all.


Consoles with 16 GB (none of which are out yet, BTW) have to share that 16 GB system-wide, whereas 10 GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It’ll be the same until it can be presumed everyone has new PSUs, in about 5 years.
Posted on Reply
#183
lexluthermiester
rtwjunkie
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It’ll be the same until it can be presumed everyone has new PSUs, in about 5 years.
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
Posted on Reply
#184
rtwjunkie
PC Gaming Enthusiast
lexluthermiester
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
:toast:
Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.
Posted on Reply
#185
lexluthermiester
rtwjunkie
:toast:
Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.
You didn't?!? Well hot damn... LOL! No worries mate. ;)
Posted on Reply
#186
TranceHead
lexluthermiester
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
What's wrong with replacing multiple connectors with a single connector?
I welcome it.
Posted on Reply
#187
yotano211
I would hate to see the 3000 cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175 W card but in laptops it's 115 W. This 3070 is rumored to be 320 W, but will Nvidia stay with the same 115 W, or maybe increase it to 150 W, which is 2080 mobile territory?
I guess all laptops will be Max-Q designs with the 3000 cards.
Posted on Reply
#188
DuxCro
RoutedScripter
I come here as someone who does not like spoilers ... I'm so sick of this ... I will be going on a tech-site blackout until I watch the proper reveal.
Spoilers? What is this, a game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.
Posted on Reply
#189
kanecvr
JAB Creations
Who the hell is team red? Okay, gotta figure this one out...
  • Team Blue: Intel.
  • Team Green: AMD.
  • Team Lime: Nvidia.
Yeah, there used to be ATI. "But! But! Nvidia already has green!" No, they're lime and AMD has been around way the hell longer than Nvidia.

I can't wait for the 6950XTX.
The modern AMD logo is orange. The Radeon Technologies Group logo is red.

As for the poll, I'm not spending $1,000 on a GPU. I'm only upgrading when I can get a 4K-capable (60 FPS) GPU for $400-500.
Posted on Reply
#190
AsRock
TPU addict
JAB Creations
They're not being weird; they're trying to distract people from the fact that they feel threatened by AMD, which no longer has the sub-$2 stock value it was driven to by the crony tactics of both Intel and Nvidia, so naturally they're going to do their absolute best. Why?

"Nvidia has the best card at $36,700! So when I spend $170 I'll somehow magically get the best card I can!" - Fanboyism

Now you take that with "Oh, but it's actually smaller than two eight-pins!", which is intended for Nvidia fanboys to basically say, "I don't care that my room is 20 degrees warmer and my electricity bill doubled! I need those 14 FPS, because twice the FPS of my 144 Hz monitor isn't enough for some very objective, unquestionable reason!"

That is the problem with cronies: they know how weaker psychological mindsets work, and they have no qualms about taking advantage of people.
Wouldn't the smaller connector help with airflow? Even more so with how this card is designed.
Posted on Reply
#191
Dwarden
DisplayPort 1.4a? 2.0 was finalized in 2016 ...
WTH is with PC companies failing to use the latest standards for PC components?
Posted on Reply
#192
Vayra86
yotano211
I would hate to see the 3000 cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175 W card but in laptops it's 115 W. This 3070 is rumored to be 320 W, but will Nvidia stay with the same 115 W, or maybe increase it to 150 W, which is 2080 mobile territory?
I guess all laptops will be Max-Q designs grossly underpowered and overpriced with the 3000 cards.
FTFY

It's another reason I really don't get this set of products. Even for a Max-Q, that would require some pretty creative shuffling of tiers and performance. Effectively your mobile x80 is slower than a desktop x60 or something. I mean, how much binning can you do...
Dwarden
DisplayPort 1.4a? 2.0 was finalized in 2016 ...
WTH is with PC companies failing to use the latest standards for PC components?
Max out that profit, that's what.
Posted on Reply
#193
medi01
RandallFlagg
PhysX did not evaporate. It became ubiquitous to the point...
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
Posted on Reply
#194
semantics
medi01
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
Parts of PhysX are part of UE4 and Unity; granted, the ol' CUDA acceleration is pretty dead, but Nvidia's physics package is still pretty prevalent.
Posted on Reply
#195
RoutedScripter
DuxCro
Spoilers? What is this, a game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.
That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 I sure liked all the leaks and stuff and had a good time dramatizing ... but those were the times when I had no responsibilities; I had all the time in the world and could sit for hours and dribble on about what new tech would be in the next console ... been there, done that, but I got fed up with the spoilers. It feels so much better to see E3 or some game show event the way it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization leaking it in an almost criminally disrespectful way is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and to the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure, it's free speech, go ahead. We can't force a private forum or site to censor, of course not, but it would be a good idea to try to find a compromise and to overhaul the website news sections so the news writers and staff could post spoiler warnings for articles, and to add forum tools that put up warnings when navigating the forum.

Again, I'm not sure how big this leak is in detail, but from the little I've seen it seems big ... I did not see it though; I successfully hid my eyes :p

If I were Nvidia right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before ... it was building up so nicely, and now some idiot messes it all up just 3 days before the reveal ... if I were Nvidia, I would go nuclear. I'd cancel and delay everything for 30 days, delay the review GPUs, and even recall them, yes, forcefully canceling the shipments and rerouting them while in transit so the reviewers wouldn't get them either; it's the reviewers who are mostly responsible for spreading the leaks, and this is how most people get spoiled. That would be the most epic reply, so epic that all the Leak-Obsession-Disorder losers who would cry about it would be doing so for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone including the investors would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!
Posted on Reply
#196
lexluthermiester
TranceHead
What's wrong with replacing multiple connectors with a single connector?
I welcome it.
Simple: not all cards need that many power lines. For many cards, one 6-pin/8-pin is enough. So why build a connector that has a bunch of power lines when only a few are needed? It's needless and wasteful.
Dwarden
DisplayPort 1.4a? 2.0 was finalized in 2016 ...
WTH is with PC companies failing to use the latest standards for PC components?
You need to read that Wikipedia article you're quoting a little more closely. 2.0 was on the roadmap in 2016, but it wasn't finalized until June 26th, 2019. Additionally, DP 2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a-based ICs. DP 2.0 is best suited for commercial and industrial applications ATM.
Posted on Reply
#197
medi01
semantics
Parts of PhysX are part of UE4 and Unity; granted, the ol' CUDA acceleration is pretty dead, but Nvidia's physics package is still pretty prevalent.
In other words, CPU PhysX, which is not very relevant to this thread, don't you think?
Posted on Reply
#198
JAB Creations
AsRock
Wouldn't the smaller connector help with airflow? Even more so with how this card is designed.
Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that Nvidia (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because Nvidia is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply because it has the best card at a million dollars when they can only afford $200.

Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target, like a slider for the card power, even if it has steps of 25 watts or something. I know AMD will still crank the power up. I don't need 300 FPS; I am still using a 60 Hz screen for now and am interested in 120/144 Hz later this year, but options are limited in the area of the market I'm looking at, as I game maybe 2% of the time I'm at my rig.
Posted on Reply
#199
theoneandonlymrk
RoutedScripter
That's your opinion. Yes, it is a spoiler to me ... if I were Nvidia, I would go nuclear. I'd cancel and delay everything for 30 days, delay the review GPUs, and even recall them ...
Were you not around for the last ten years? Every GPU release had a hype train.
And if not new, unreleased tech, what then do we discuss? Do we just compare benchmarks and fix issues? Some like a less dry discussion.
You can simply not read them.
Posted on Reply
#200
AsRock
TPU addict
JAB Creations
Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? ... Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target ...
From what I keep seeing, the 3090 will be around 320 W and the 3080 around 220 W, but I guess we will see soon enough. I get the feeling that the 3090 was actually made to be their next Titan, given the amount of memory it has.
Posted on Reply