
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Joined
Feb 18, 2009
Messages
1,825 (0.33/day)
Location
Slovenia
System Name Multiple - Win7, Win10, Kubuntu
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD: 970 EVO 1TB, 2x870 EVO 250GB,860 Evo 250GB,850 Evo 250GB, WD 4x1TB, 2x2TB, 4x4TB
Display(s) Asus PB328Q 32' 1440p@75hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750, HX550, Galaxy 520W
Mouse Multiple, Razer Mamba Elite, Logitech M500
Keyboard Multiple - Lenovo, HP, Dell, Logitech
I come here as someone who does not like spoilers. With the reveal just days away, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the very least the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and set some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments, and I did not read any posts in this thread either.
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
No, but you do have to answer why that gap is so insanely huge. More than twice the RAM? Borderline 2.5x? That is just insane.
And again, in the mid-range of old, the RX 480 had 8GB of RAM and the GTX 1060 had 6GB... to have an RTX 3080 now with 10GB is just pathetic IMO, with an eye on progression and placement.

Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.

2GB vs 4GB GTX 770... when everyone was saying 2GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.

Same here .... https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims with bad console ports.

And let's remember... the 3GB version of the 1060 did just fine. They were not the same GPU; the 3GB version had 10% fewer shaders, which gave the 6GB a VRAM-independent speed advantage. The extra shaders gave the 6GB version a 6% speed advantage over the 3GB... So when going to 1440p, if there were even a hint of impact due to VRAM, that 6% should be much bigger... it wasn't. A difference only showed up at 4K.

Based upon actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12 GB, so I was surprised at the odd 11 number... for the 3080 I thought they'd do 12... so 10 tells me that Nvidia must know more than we know. No sense putting it in if it's not used... no different than having an 8+6 power connector on a 225-watt card. Just because the connectors and cables can deliver 225 watts (+75 from the slot) doesn't mean that will ever happen.
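
For what it's worth, the connector math in that analogy works out as below; a minimal Python sketch using only the published PCIe limits (roughly 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# Back-of-the-envelope PCI Express power budget behind the "8+6 pin on a 225 W card" analogy.
# Spec limits: the x16 slot supplies up to ~75 W, a 6-pin plug ~75 W, an 8-pin plug ~150 W.

CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(aux_connectors):
    """Spec ceiling for a card: slot power plus every auxiliary connector."""
    return CONNECTOR_LIMITS_W["slot"] + sum(CONNECTOR_LIMITS_W[c] for c in aux_connectors)

print(board_power_budget(["8-pin", "6-pin"]))   # 300 W ceiling for an 8+6 layout
print(board_power_budget(["8-pin", "8-pin"]))   # 375 W ceiling for an 8+8 layout
# A 225 W card wired 8+6 never approaches its 300 W ceiling, which is the point:
# available headroom (in connectors or in VRAM) is not the same as used headroom.
```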

Nvidia’s Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available." The card manufacturers gave us more RAM because customers would buy it. But for a 1060... the test results proved we don't need more than 3 GB at 1080p; the 6 GB version didn't add anything to the mix other than more shaders.
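
The counter those utilities read can be pulled straight from the driver; here is a minimal sketch with the pynvml bindings (my assumption that this is a fair stand-in for the tools Bell was describing; it needs an NVIDIA GPU and the nvidia-ml-py package), and what it returns is an allocation figure, not bytes actually touched each frame:

```python
# Minimal sketch: read the per-GPU memory counter that most overlays report.
# Note: nvmlDeviceGetMemoryInfo() returns how much VRAM is *allocated/reserved*,
# not how much the running game actually touches while rendering.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo, nvmlDeviceGetName)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)          # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    print(nvmlDeviceGetName(handle))
    print(f"total: {mem.total / 2**30:.1f} GiB")
    print(f"used (allocated): {mem.used / 2**30:.1f} GiB")
    print(f"free: {mem.free / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```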

So now for the "why 10 GB?" question.

When they did the 3 GB 1060, why did they disable 10% of the shaders? It didn't save them any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise... the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. Same with the 970's 3.5 GB... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there were a 3080 and a 3090 coming, I expected 12 and 16 GB. Now I can't help but wonder... is 12 GB the sweet spot for 4k, and is the use of 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?
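
For reference, the shader arithmetic in that W1zzard quote works out as below; a quick sanity-check sketch using only the counts and the ~6% gap already cited above:

```python
# Sanity-checking the GTX 1060 shader cut quoted above (1280 -> 1152 CUDA cores).
shaders_6gb = 1280
shaders_3gb = 1152

shader_cut = 1 - shaders_3gb / shaders_6gb
print(f"shader reduction: {shader_cut:.1%}")          # 10.0%

# The reviews cited earlier put the real-world gap at roughly 6% at 1080p.
# If VRAM (3 GB vs 6 GB) were the limiter, the gap should widen well past that
# shader-driven difference as resolution climbs - which testing didn't show until 4K.
measured_gap_1080p = 0.06
print(f"measured gap at 1080p: ~{measured_gap_1080p:.0%} vs shader cut {shader_cut:.0%}")
```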
 
Joined
Jan 27, 2015
Messages
1,642 (0.49/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I like what you're saying on the VRAM thing; based on my experience it's irrelevant in *most* games.

One exception though: MMOs. Specifically, if you've ever played an MMO and been in an area with 200+ other players, each with many different textures for their gear, VRAM definitely comes into play. Smaller VRAM cards simply can't hold all those textures, at least not at any quality. What you wind up with is half the people looking normal and the other half looking like they're in their underwear.
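
Some back-of-the-envelope numbers make that plausible; a rough sketch where the player count, texture count, and texture size are made-up assumptions rather than measurements from any particular MMO:

```python
# Rough estimate of the VRAM needed to keep unique character textures resident
# in a crowded MMO hub. Every figure here is an illustrative assumption.

def texture_bytes(width, height, bytes_per_pixel=4, mip_overhead=1.33):
    """Uncompressed RGBA texture size, plus ~1/3 extra for the mip chain."""
    return width * height * bytes_per_pixel * mip_overhead

players = 200                 # crowd size from the example above
maps_per_player = 4           # e.g. body, armor, weapon, accessory textures
resolution = 2048             # "full quality" 2K maps

total = players * maps_per_player * texture_bytes(resolution, resolution)
print(f"~{total / 2**30:.0f} GiB uncompressed")   # roughly 17 GiB

# Block compression (BCn, ~4:1 or better) and streaming pull that way down, which is
# why smaller cards cope by swapping in low-res stand-ins (the "underwear" look)
# and why manually dropping texture quality makes the problem go away.
```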

Anyway, that is kind of an edge case. Even in those settings, if it's truly bothersome, you can typically reduce texture quality and the problem is solved.
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.

2GB vs 4GB GTX 770... when everyone was saying 2GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s.

Same here .... https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims with bad console ports.

And let's remember... the 3GB version of the 1060 did just fine. They were not the same GPU; the 3GB version had 10% fewer shaders, which gave the 6GB a VRAM-independent speed advantage. The extra shaders gave the 6GB version a 6% speed advantage over the 3GB... So when going to 1440p, if there were even a hint of impact due to VRAM, that 6% should be much bigger... it wasn't. A difference only showed up at 4K.

Based upon actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12 GB, so I was surprised at the odd 11 number... for the 3080 I thought they'd do 12... so 10 tells me that Nvidia must know more than we know. No sense putting it in if it's not used... no different than having an 8+6 power connector on a 225-watt card. Just because the connectors and cables can deliver 225 watts (+75 from the slot) doesn't mean that will ever happen.

Nvidia’s Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available." The card manufacturers gave us more RAM because customers would buy it. But for a 1060... the test results proved we don't need more than 3 GB at 1080p; the 6 GB version didn't add anything to the mix other than more shaders.

So now for the "why 10 GB?" question.

When they did the 3 GB 1060, why did they disable 10% of the shaders? It didn't save them any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise... the performance would be the same and folks would have no reason to spring for the extra $$ for the 6 GB card. Same with the 970's 3.5 GB... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there were a 3080 and a 3090 coming, I expected 12 and 16 GB. Now I can't help but wonder... is 12 GB the sweet spot for 4k, and is the use of 10 GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?

You might be on the money there, but even so, the shader deficit is a LOT more than 10% on the 3080. It's about a thousand shaders fewer, isn't it?

And to be fair, I'm not so silly as to think this move will somehow push people who are doubting a 3080 toward a 3090 that is a whole lot more expensive. They will probably look at their 2080 Ti and think, meh... I'm gonna sit on this 11GB for a while; heck, it even does RT already. Remember that the 2080 Ti was already a stretch in terms of price. Will they nail that again? Doubtful, especially when the gimping of a lower tier is so obvious. It's not something one might be keen to reward. More likely, I think, the 3080 is the step up aimed at 2080/2080S and 2070S buyers. And they gain 2GB compared to what they had.

I think a bigger part of the 10GB truth is really that they had some sort of issue getting more on it, either a cost issue or something yield/die related. That is why I'm so, so interested in how they cut this one up and how the VRAM is wired.
 

reflex75

New Member
Joined
May 15, 2020
Messages
4 (0.00/day)
That's a shame: only 10GB for a new high-end GPU, while new consoles will have more (16GB) and be cheaper!
 
Joined
Jan 25, 2006
Messages
1,470 (0.22/day)
Processor Ryzen 1600AF @4.2Ghz 1.35v
Motherboard MSI B450M PRO-A-MAX
Cooling Deepcool Gammaxx L120t
Memory 16GB Team Group Dark Pro Sammy-B-die 3400mhz 14.15.14.30-1.4v
Video Card(s) XFX RX 5600 XT THICC II PRO
Storage 240GB Brave eagle SSD/ 2TB Seagate Barracuda
Display(s) Dell SE2719HR
Case MSI Mag Vampiric 011C AMD Ryzen Edition
Power Supply EVGA 600W 80+
Software Windows 10 Pro
I come here as someone who does not like spoilers. With the reveal just days away, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the very least the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and set some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments, and I did not read any posts in this thread either.
Just read the title and avoid the thread. What's with the drama, Britney?
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
From what I've seen, adapters will be included.
If adapters are included, why make the new connector in the first place? It makes no sense at all.

That's a shame: only 10GB for a new high-end GPU, while new consoles will have more (16GB) and be cheaper!
Consoles with 16GB (and NONE are out yet, BTW) have to share that 16GB system-wide, whereas 10GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.43/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
If adapters are included, why make the new connector in the first place? It makes no sense at all.


Consoles with 16GB (and NONE are out yet, BTW) have to share that 16GB system-wide, whereas 10GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It'll be the same until it is presumed everyone has new PSUs, in about 5 years.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It'll be the same until it is presumed everyone has new PSUs, in about 5 years.
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.43/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
:toast:
Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.
 
Joined
Feb 22, 2019
Messages
70 (0.04/day)
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
What's wrong with replacing multiple connectors with one single connector?
I welcome it.
 
Joined
Feb 18, 2012
Messages
2,715 (0.61/day)
System Name MSI GP76
Processor intel i7 11800h
Cooling 2 laptop fans
Memory 32gb of 3000mhz DDR4
Video Card(s) Nvidia 3070
Storage x2 PNY 8tb cs2130 m.2 SSD--16tb of space
Display(s) 17.3" IPS 1920x1080 240Hz
Power Supply 280w laptop power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores Good enough for me
I would hate to see the 3000-series cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175 W card, but in laptops it's 115 W. This 3070 is rumored to be 320 W, but will Nvidia stay with the same wattage of 115 W, or maybe increase it to 150 W, which is 2080 Mobile territory?
I guess all laptops will be Max-Q designs with the 3000 cards.
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
I come here as someone who does not like spoilers. With the reveal just days away, this seems like a total psycho obsession from some of the people who think they're doing something noble with leaks. At the very least the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and set some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments, and I did not read any posts in this thread either.
Spoilers? What is this, a game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.
 
Joined
Dec 6, 2016
Messages
152 (0.06/day)
System Name The cube
Processor AMD Ryzen 5700g
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800mhz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
Who the hell is team red? Okay, gotta figure this one out...

  • Team Blue: Intel.
  • Team Green: AMD.
  • Team Lime: Nvidia.
Yeah, there used to be ATI. "But! But! Nvidia already has green!" No, they're lime and AMD has been around way the hell longer than Nvidia.

I can't wait for the 6950XTX.

The modern AMD logo is orange. Radeon Technologies Group is red.

As for the poll, I'm not spending $1,000 on a GPU. I'm only upgrading when I can get a 4K-capable (60 FPS) GPU for $400-500.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,851 (3.08/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
They're not being weird, they're trying to distract people because they feel threatened by AMD, which no longer has a stock value of under $2 from the crony tactics of both Intel and Nvidia, so naturally they're going to do their absolute best. Why?

"Nvidia has the best card at $36,700! So when I spend $170 I'll somehow magically get the best card I can!" - Fanboyism

Now you take that with "Oh, but it's actually smaller than two eight-pins!", which is intended for Nvidia fanboys to basically say, "I don't care that my room is 20 degrees warmer and my electricity bill doubled! I need those 14 FPS, because twice the FPS of my 144Hz monitor isn't enough for some very objective, unquestionable reason!"

That is the problem with cronies: they know how weaker psychological mindsets work, and they have no qualms about taking advantage of people.

Wouldn't the smaller connector help with airflow, even more so with how this card is designed?
 
Joined
Nov 11, 2005
Messages
35 (0.01/day)
DisplayPort 1.4a? 2.0 was finalized in 2016...
wth is with PC companies failing to use the latest PC standards for PC components?
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I would hate to see the 3000-series cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175 W card, but in laptops it's 115 W. This 3070 is rumored to be 320 W, but will Nvidia stay with the same wattage of 115 W, or maybe increase it to 150 W, which is 2080 Mobile territory?
I guess all laptops will be Max-Q designs grossly underpowered and overpriced with the 3000 cards.


FTFY

It's another reason I really don't get this set of products. Even for a Max-Q part, that would require some pretty creative shifting of tiers and performance. Effectively your mobile x80 would be slower than a desktop x60 or something. I mean, how much binning can you do...

DisplayPort 1.4a? 2.0 was finalized in 2016...
wth is with PC companies failing to use the latest PC standards for PC components?

Max out that profit, that's what.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
PhysX did not evaporate. It became ubiquitous to the point...
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
 
Joined
Jan 13, 2011
Messages
219 (0.05/day)
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
Parts of PhysX are part of UE4 and Unity; granted, the old CUDA acceleration is pretty much dead, but Nvidia's physics package is still pretty prevalent.
 
Joined
Feb 18, 2009
Messages
1,825 (0.33/day)
Location
Slovenia
System Name Multiple - Win7, Win10, Kubuntu
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD: 970 EVO 1TB, 2x870 EVO 250GB,860 Evo 250GB,850 Evo 250GB, WD 4x1TB, 2x2TB, 4x4TB
Display(s) Asus PB328Q 32' 1440p@75hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750, HX550, Galaxy 520W
Mouse Multiple, Razer Mamba Elite, Logitech M500
Keyboard Multiple - Lenovo, HP, Dell, Logitech
Spoilers? What is this, a game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.

That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 years old I sure liked all the leaks and stuff and had a good time dramatizing... but those were the times when I had no responsibilities; I had all the time in the world and could sit for hours dribbling on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see an E3 or some game-show event the way it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization, leaking it in an almost criminally disrespectful way, is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure, it's free speech, go ahead; we can't force a private forum or site to censor, of course not, but it would be a good idea to try to find a compromise and to overhaul the website news sections so the news writers and staff could post spoiler warnings for articles, and to add forum tools to put up similar warnings when navigating the forum.

Again, I'm not sure how big this leak is in detail, but from the glimpse I caught it seems big... I did not really see it though, I successfully hid my eyes from it :p

If I were Nvidia right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before... it was building up so nicely, and now some idiot messes it all up just 3 days prior to the reveal... if I were Nvidia, I would go nuclear. I'd practically cancel and delay everything for 30 days, delay the review GPUs, and even recall them, yes, forcefully canceling the shipments and rerouting them while in transit so the reviewers wouldn't get them either; it's the reviewers that are mostly responsible for spreading the leaks, and this is how most people get spoiled. That would be the most epic reply, so epic that all the Leak-Obsession-Disorder losers who would cry about it would be crying for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think spiritually this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone including the investors would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
What's wrong with replacing multiple connectors with one single connector?
I welcome it.
Simple: not all cards need that many power lines. For many cards one 6-pin/8-pin is enough. So why build a connector that has a bunch of power lines when only a few are needed? It's needless and wasteful.

DisplayPort 1.4a? 2.0 was finalized in 2016...
wth is with PC companies failing to use the latest PC standards for PC components?
You need to read that Wikipedia article you're quoting a little more closely. 2.0 was on the roadmap in 2016, but it wasn't finalized until June 26th of 2019. Additionally, DP 2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a-based ICs. DP 2.0 is best suited for commercial and industrial applications ATM.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Parts of PhysX are part of UE4 and Unity; granted, the old CUDA acceleration is pretty much dead, but Nvidia's physics package is still pretty prevalent.
In other words, CPU PhysX; not very relevant to this thread, don't you think?
 
Joined
May 13, 2015
Messages
631 (0.19/day)
Processor AMD Ryzen 3800X / AMD 8350
Motherboard ASRock X570 Phantom Gaming X / Gigabyte 990FXA-UD5 Revision 3.0
Cooling Stock / Corsair H100
Memory 32GB / 24GB
Video Card(s) Sapphire RX 6800 / AMD Radeon 290X (Toggling until 6950XT)
Storage C:\ 1TB SSD, D:\ RAID-1 1TB SSD, 2x4TB-RAID-1
Display(s) Samsung U32E850R
Case be quiet! Dark Base Pro 900 Black rev. 2 / Fractal Design
Audio Device(s) Creative Sound Blaster X-Fi
Power Supply EVGA Supernova 1300G2 / EVGA Supernova 850G+
Mouse Logitech M-U0007
Keyboard Logitech G110 / Logitech G110
Wouldn't the smaller connector help with airflow, even more so with how this card is designed?

Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that Nvidia (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because Nvidia is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply because it has the best card at a million dollars per card, when what they can actually afford is $200.

Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target: a slider for the card's power, even if it has steps of 25 watts or something. I know AMD will still crank the power up, but I don't need 300 FPS; I am still using a 60Hz screen for now and am interested in 120/144Hz later this year, but options are limited in the part of the market I'm looking at, as I game maybe 2% of the time I'm at my rig.
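
Something close to that slider already exists at the driver level on Linux; a minimal sketch of the idea, assuming the amdgpu driver's standard hwmon power1_cap file is present (just an illustration, not an official AMD feature, and the 225 W figure is arbitrary):

```python
# Minimal sketch of a user-facing "TDP target" on Linux via the amdgpu hwmon interface.
# Assumptions: an AMD card whose driver exposes the standard power1_cap file
# (value in microwatts), root privileges, and 225 W chosen purely as an example.
import glob

def _cap_path(card="card0"):
    paths = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_cap")
    if not paths:
        raise FileNotFoundError("this GPU/driver does not expose a power cap")
    return paths[0]

def set_power_cap(watts, card="card0"):
    """Lower (or raise) the board power limit, within the driver's allowed window."""
    with open(_cap_path(card), "w") as f:
        f.write(str(int(watts * 1_000_000)))   # the file is written in microwatts

def get_power_cap(card="card0"):
    with open(_cap_path(card)) as f:
        return int(f.read()) / 1_000_000

if __name__ == "__main__":
    set_power_cap(225)                          # dial the card down to a 225 W board limit
    print(f"current cap: {get_power_cap():.0f} W")
```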
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 years old I sure liked all the leaks and stuff and had a good time dramatizing... but those were the times when I had no responsibilities; I had all the time in the world and could sit for hours dribbling on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see an E3 or some game-show event the way it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization, leaking it in an almost criminally disrespectful way, is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure, it's free speech, go ahead; we can't force a private forum or site to censor, of course not, but it would be a good idea to try to find a compromise and to overhaul the website news sections so the news writers and staff could post spoiler warnings for articles, and to add forum tools to put up similar warnings when navigating the forum.

Again, I'm not sure how big this leak is in detail, but from the glimpse I caught it seems big... I did not really see it though, I successfully hid my eyes from it :p

If I were Nvidia right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before... it was building up so nicely, and now some idiot messes it all up just 3 days prior to the reveal... if I were Nvidia, I would go nuclear. I'd practically cancel and delay everything for 30 days, delay the review GPUs, and even recall them, yes, forcefully canceling the shipments and rerouting them while in transit so the reviewers wouldn't get them either; it's the reviewers that are mostly responsible for spreading the leaks, and this is how most people get spoiled. That would be the most epic reply, so epic that all the Leak-Obsession-Disorder losers who would cry about it would be crying for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think spiritually this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone including the investors would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!
Were you not around for the last ten years? Every GPU release had a hype train.
And if not new, unreleased tech, what then do we discuss? Do we just compare benchmarks and fix issues? Some like a less dry discussion.
You can simply not read them.
 