
NVIDIA GeForce RTX 3090 and 3080 Specifications Leaked

Joined
Feb 18, 2009
Messages
1,806 (0.41/day)
Location
Slovenia
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD 860 Evo, 850 Evo, WD1002FAEX,4xWD10EZEX,2xWDGreen2TB,WD30EFRX,WD40EFRX
Display(s) Asus PB328Q 32' 1440p@75hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750
Mouse Razer Mamba Elite
Software Win7 - Win10 - Kubuntu
I come here as someone who does not like spoilers. With the reveal just days away, this seems like a total psycho obsession on the part of people who think they're doing something noble with leaks. At the very least, the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and set some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments, and I did not read any posts in this thread either.
 
Joined
Mar 18, 2015
Messages
2,929 (1.32/day)
Location
Long Island
No, but you do have to answer why that gap is so insanely huge: more than twice the RAM, borderline 2.5x? That is just insane.
And again, compare the midrange of old: the RX 480 had 8GB of RAM and the GTX 1060 had 6GB... for an RTX 3080 to ship with 10GB now is just pathetic, IMO, with an eye on progression and placement.

Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.

2GB vs. 4GB GTX 770... when everyone was saying 2GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s."

Same here .... https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims with bad console ports.

And let's remember... the 3GB version of the 1060 did just fine. They were not the same GPU; the 3GB version had 10% fewer shaders, which gave the 6GB a VRAM-independent speed advantage. The extra shaders gave the 6GB version a 6% speed advantage over the 3GB. So when going to 1440p, if there was even a hint of an impact due to VRAM, that 6% gap should have been much bigger... it wasn't. A difference only showed up at 4K.

Based on actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12GB, so I was surprised at the odd number of 11... for the 3080, I thought they'd do 12... so 10 tells me that Nvidia must know more than we know. No sense putting it in if it's not used... no different than having an 8+6 power connector on a 225-watt card. Just because the connectors and cable can deliver 225 watts (+75 from the slot) doesn't mean it will ever happen.

Nvidia's Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available." The card manufacturers gave us more RAM because customers would buy it. But for a 1060... the test results proved we don't need more than 3GB at 1080p; the 6GB version didn't add anything to the mix other than more shaders.

So now for the "why 10GB?" question.

When they did the 3GB 1060, why did they disable 10% of the shaders? It didn't save them any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise the performance would be the same and folks would have no reason to spring for the extra $$ for the 6GB card. Same with the 970's 3.5GB... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there was a 3080 and a 3090 coming, I expected 12 and 16GB. Now I can't help but wonder... is 12GB the sweet spot for 4K, and is the use of 10GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?
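To put numbers on the 1060 example, here's a quick arithmetic check (a minimal sketch; the shader counts are the ones from W1zzard's conclusion quoted above):

```python
# Sanity-check the "rough 10%" shader reduction on the GTX 1060 3 GB.
shaders_6gb = 1280  # GTX 1060 6 GB shader count
shaders_3gb = 1152  # GTX 1060 3 GB shader count

disabled = shaders_6gb - shaders_3gb
reduction = disabled / shaders_6gb

print(f"{disabled} shaders disabled -> {reduction:.0%} reduction")
```

which prints `128 shaders disabled -> 10% reduction`, matching the "rough 10%" in the quote.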
 
Joined
Jan 27, 2015
Messages
675 (0.30/day)
System Name Legion
Processor i9-10850K
Motherboard Asus Prime Z490M Plus
Cooling Air
Memory G.Skill Ripjaws V 32GB (2 x 16GB) DDR4-3200 F4-3200C16D-32GVK
Video Card(s) EVGA GeForce RTX 2060 KO Ultra
Storage Inland Premium 256GB SSD 3D NAND M.2 2280 PCIe NVMe 3.0 x4 + WD Blue 1TB SATA SSD
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Lian Li 205M
Power Supply PowerSpec 650W 80+ Bronze Semi-Modular PS 650BSM
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I like what you're saying on the VRAM thing; based on my experience, it's irrelevant in *most* games.

One exception, though: MMOs. Specifically, if you've ever played an MMO and been in an area with 200+ other players wearing many different gear textures, VRAM definitely comes into play. Smaller-VRAM cards simply can't hold all those textures, at least not at any quality. What you wind up with is half the people looking normal and the other half looking like they're in their underwear.

Anyway, that is kind of an edge case. Even in those settings, if it's truly bothersome, one can typically reduce texture quality and the problem is solved.
 
Joined
Sep 17, 2014
Messages
14,700 (6.13/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define C TG
Audio Device(s) Situational :)
Power Supply EVGA G2 750W
Mouse Logitech G502 Protheus Spectrum
Keyboard Lenovo Thinkpad Trackpoint II (Best K/B ever... <3)
Software W10 x64
Peeps have been complaining about VRAM for generations of cards, and real-world testing has not borne it out. Every time test sites have compared the same GPU with different RAM sizes, in almost every game there was no observable impact on performance.

2GB vs. 4GB GTX 770... when everyone was saying 2GB was not enough, this test showed otherwise.

"There isn’t a lot of difference between the cards at 1920×1080 or at 2560×1600. We only start to see minimal differences at 5760×1080, and even so, there is rarely a frame or two difference. ... There is one last thing to note with Max Payne 3: It would not normally allow one to set 4xAA at 5760×1080 with any 2GB card as it ***claims*** to require 2750MB. However, when we replaced the 4GB GTX 770 with the 2GB version, the game allowed the setting. And there were no slowdowns, stuttering, nor any performance differences that we could find between the two GTX 770s."

Same here .... https://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html
Same here .... http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x
Same here .... https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Yes, you can find some games that will show a difference, mostly sims with bad console ports.

And let's remember... the 3GB version of the 1060 did just fine. They were not the same GPU; the 3GB version had 10% fewer shaders, which gave the 6GB a VRAM-independent speed advantage. The extra shaders gave the 6GB version a 6% speed advantage over the 3GB. So when going to 1440p, if there was even a hint of an impact due to VRAM, that 6% gap should have been much bigger... it wasn't. A difference only showed up at 4K.

Based on actual testing at lower resolutions and scaling up accordingly, my expectation for the 2080 was 12GB, so I was surprised at the odd number of 11... for the 3080, I thought they'd do 12... so 10 tells me that Nvidia must know more than we know. No sense putting it in if it's not used... no different than having an 8+6 power connector on a 225-watt card. Just because the connectors and cable can deliver 225 watts (+75 from the slot) doesn't mean it will ever happen.

Nvidia's Brandon Bell has addressed this topic more than once, saying that the utilities that are available "all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available." The card manufacturers gave us more RAM because customers would buy it. But for a 1060... the test results proved we don't need more than 3GB at 1080p; the 6GB version didn't add anything to the mix other than more shaders.

So now for the "why 10GB?" question.

When they did the 3GB 1060, why did they disable 10% of the shaders? It didn't save them any money. Let's look at W1zzard's conclusion:

"Typically, GPU vendors use the exact same GPU for SKUs of different memory capacity, just not in this case. NVIDIA decided to reduce the shader count of the GTX 1060 3 GB to 1152 from the 1280 on the 6 GB version. This rough 10% reduction in shaders lets the company increase the performance difference between the 3 GB and 6 GB version, which will probably lure potential customers closer toward the 6 GB version. "

In other words, they needed to kill 10% of the shaders because otherwise the performance would be the same and folks would have no reason to spring for the extra $$ for the 6GB card. Same with the 970's 3.5GB... it was clearly done to gimp the 970 and provide a performance gap between it and the 980. When I heard there was a 3080 and a 3090 coming, I expected 12 and 16GB. Now I can't help but wonder... is 12GB the sweet spot for 4K, and is the use of 10GB this generation's little "gimp" needed to make the cost increase to the 3090 attractive?

You might be on the money there, but even so, the shader deficit is a LOT more than 10% on the 3080. It's about 1,000 shaders fewer, isn't it?

And to be fair, I'm not so silly as to think that this move will somehow push people who doubt a 3080 toward a 3090 that is a whole lot more expensive. They will probably look at their 2080 Ti and think, meh... I'm going to sit on this 11GB for a while; heck, it even does RT already. Remember that the 2080 Ti was already a stretch in terms of price. Will they nail that again? Doubtful, especially when the gimping of the lower tier is so obvious. It's not something one might be keen to reward. More likely, I think, the 3080 is the step up aimed at 2080/2080S and 2070S buyers. And they gain 2GB compared to what they had.

I think a bigger part of the 10GB truth is really that they had some sort of issue getting more onto it: either a cost issue, or something yield/die related. That is why I'm so interested in how they cut this one up and how the VRAM is wired.
 

reflex75

New Member
Joined
May 15, 2020
Messages
4 (0.01/day)
That's a shame: only 10GB for a new high-end GPU, while the new consoles will have more (16GB) and be cheaper!
 
Joined
Jan 25, 2006
Messages
1,471 (0.26/day)
Processor Ryzen 1600AF @4.2Ghz 1.35v
Motherboard MSI B450M PRO-A-MAX
Cooling Deepcool Gammaxx L120t
Memory 16GB Team Group Dark Pro Sammy-B-die 3400mhz 14.15.14.30-1.4v
Video Card(s) XFX RX 5600 XT THICC II PRO
Storage 240GB Brave eagle SSD/ 2TB Seagate Barracuda
Display(s) Dell SE2719HR
Case MSI Mag Vampiric 011C AMD Ryzen Edition
Power Supply EVGA 600W 80+
Software Windows 10 Pro
I come here as someone who does not like spoilers. With the reveal just days away, this seems like a total psycho obsession on the part of people who think they're doing something noble with leaks. At the very least, the news media, if they are eager to profit off the leaks for drama and traffic, should put up big spoiler warnings and set some standards in this regard. I'm so sick of this. No, I do not know what the leak is; I only came here to say this, and I will be going on a tech-site blackout until I watch the proper reveal. Yes, I was hiding my eyes so as not to take a single peek at the content or any comments, and I did not read any posts in this thread either.
Just read the title and avoid the thread. What's with the drama, Britney?
 
Joined
Jul 5, 2013
Messages
12,547 (4.42/day)
Location
USA
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
From what I've seen, adapters will be included.
If adapters are included, why make the new connector in the first place? Makes no sense at all.

That's a shame: only 10GB for a new high-end GPU, while the new consoles will have more (16GB) and be cheaper!
Consoles with 16GB (and NONE are out yet, BTW) have to share that 16GB system-wide, whereas 10GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
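To illustrate the shared-pool point with rough numbers (the OS reservation and the PC's system RAM below are illustrative assumptions, not official figures):

```python
# Shared console memory pool vs. dedicated PC VRAM, in GB.
console_total = 16.0        # unified pool shared by CPU and GPU
console_os_reserved = 2.5   # assumed OS/background reservation
console_game_pool = console_total - console_os_reserved  # CPU + GPU combined

pc_vram = 10.0              # dedicated to the GPU alone
pc_system_ram = 16.0        # assumed typical gaming PC, CPU side
pc_total = pc_vram + pc_system_ram

print(f"Console pool left for the game (CPU+GPU): {console_game_pool} GB")
print(f"PC total (VRAM + system RAM): {pc_total} GB")
```

Even with the assumed reservation, the game on the console has to fit code, CPU data, and GPU assets into one pool, while the PC's 10GB is GPU-only.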
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,747 (2.96/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
If adapters are included, why make the new connector in the first place? Makes no sense at all.


Consoles with 16GB (and NONE are out yet, BTW) have to share that 16GB system-wide, whereas 10GB dedicated to the GPU is in ADDITION to the RAM the system already has. You need to keep that technological dynamic in mind.
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It'll be the same until it's presumed everyone has a new PSU, in about five years.
 
Joined
Jul 5, 2013
Messages
12,547 (4.42/day)
Location
USA
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
Remember years ago when power requirements were changing quickly? We got adapters in nearly every video card box. It'll be the same until it's presumed everyone has a new PSU, in about five years.
While I see your point, the change is still unneeded. The current 8+6-pin or 8+8-pin arrangement works perfectly well. Other than physical size, there is no benefit to this new connector.
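The nominal limits in the PCIe spec back that up; a quick tally (the combinations are the common high-end ones, and real cards draw according to their own power limits, not these maxima):

```python
# Nominal PCIe power-delivery limits, in watts.
SLOT = 75    # PCIe x16 slot
PIN6 = 75    # 6-pin auxiliary connector
PIN8 = 150   # 8-pin auxiliary connector

budget_8_6 = SLOT + PIN8 + PIN6  # 8+6-pin card
budget_8_8 = SLOT + PIN8 + PIN8  # 8+8-pin card

print(f"8+6-pin budget: {budget_8_6} W")
print(f"8+8-pin budget: {budget_8_8} W")
```

So on paper an 8+8-pin card already has 375 W of nominal headroom with existing connectors.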
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,747 (2.96/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
While I see your point, the change is still unneeded. The current 8+6-pin or 8+8-pin arrangement works perfectly well. Other than physical size, there is no benefit to this new connector.
:toast:
Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.
 
Joined
Jul 5, 2013
Messages
12,547 (4.42/day)
Location
USA
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
:toast:
Just so you know, I didn’t make this decision and I am not for it. I’m just relaying information.
You didn't?!? Well hot damn... LOL! No worries mate. ;)
 

TranceHead

New Member
Joined
Feb 22, 2019
Messages
27 (0.03/day)
While I see your point, the change is still unneeded. Current 8+6pin or 8+8pin power works perfectly well. Other than physical size, there is no benefit from this new connector.
What's wrong with replacing multiple connectors with one single connector?
I welcome it.
 
Joined
Feb 18, 2012
Messages
2,258 (0.68/day)
System Name MSI GE75 Raider
Processor i9 9880h
Cooling 2 laptop fans
Memory 32gb of 3000mhz DDR4
Video Card(s) Nvidia 2080
Storage x2 2tb Intel 660p SSD nvme, WD Red SA500 2.5in 4TB
Display(s) 17.3" IPS 1920x1080 144Hz
Power Supply 280w laptop power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores Good enough for me
I would hate to see the 3000 cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175W card, but in laptops it's 115W. The 3070 is rumored to be 320W; will Nvidia stay with the same 115W, or maybe increase it to 150W, which is 2080 Mobile territory?
I guess all laptops will be Max-Q designs with the 3000 cards.
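For a rough sense of the throttling involved, one can naively apply the 2070's desktop-to-mobile power cut to the rumored 320 W figure (a simplistic scaling assumption for illustration, not how Nvidia actually bins mobile parts):

```python
# Naive desktop-to-mobile TDP scaling, in watts.
desktop_2070 = 175
mobile_2070 = 115
rumored_ampere = 320

ratio = mobile_2070 / desktop_2070     # fraction of desktop power that survives
scaled_mobile = rumored_ampere * ratio

print(f"Mobile/desktop ratio (2070): {ratio:.2f}")
print(f"Naively scaled mobile TDP: {scaled_mobile:.0f} W")
```

Even that naive cut leaves a mobile part around 210 W, far beyond what a laptop chassis can cool, which is why a much deeper cut (and the performance loss that goes with it) would be needed.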
 
Joined
May 17, 2016
Messages
315 (0.18/day)
System Name DUX
Processor Ryzen 9 5900X
Motherboard MSI X570 Gaming Plus
Cooling Fortron Windale 6 blue LED
Memory Crucial Balistix Sport 3800MHz CL16,19,19,39
Video Card(s) RTX 2080 Super
Storage ADATA 512GB M.2
Case Zalman Z1 NEO
Audio Device(s) Kingston HyperX Cloud II
Power Supply Corsair TX850 Gold
I come here as someone who does not like spoilers ... just days away this seems like a total psycho obsession with some of the people who think they're doing something noble with leaks ... at least the news media, if they are eager to profit off the leaks because of drama and traffic, should put up big spoiler warnings and some standards in this regard, I'm so sick of this, no, I do not know what the leak is, I only came here to say this, I will be going on tech-site blackout until I watch the proper reveal. Yes I was hiding my eyes not to do a single peek of the content or any comments, I did not read any posts here in this thread either.
Spoilers? What is this? A game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.
 
Joined
Dec 6, 2016
Messages
90 (0.06/day)
System Name The old timer
Processor Intel Core i7 3930k @ 4.2GHz
Motherboard Asus P9X79 Deluxe
Cooling Noctua NH15
Memory 16GB Corsair Vengeance PRO 2400mhz CL10 (4x4gb)
Video Card(s) Asus VEGA 64 Gamer Strix OC
Storage Samsung 512GB SA | 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Twin 27" Dell u2713h 2k monitors
Case Cooler Master HAF XB Evo
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair RM1000X
Mouse Logitech G600
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Who the hell is team red? Okay, gotta figure this one out...

  • Team Blue: Intel.
  • Team Green: AMD.
  • Team Lime: Nvidia.
Yeah, there used to be ATI. "But! But! Nvidia already has green!" No, they're lime and AMD has been around way the hell longer than Nvidia.

I can't wait for the 6950XTX.

The modern AMD logo is orange. Radeon Technologies Group is red.

As for the poll, I'm not spending $1,000 on a GPU. I'm only upgrading when I can get a 4K-capable (60 fps) GPU for $400-500.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
16,991 (3.37/day)
Location
UK\US
Processor 2500k \ AMD 3900X+NH-D15
Motherboard ASRock Z68 \ ASRock AM4 X570 Pro 4
Memory Samsung low profile 2x8GB \ Patriot 2x16GB PVS432G320C6K
Video Card(s) eVga GTX1060 SSC \ XFX R9 390X
Storage 2xIntel 80Gb (SATA2) Crucial MX500 \ Samsung 860 1TB +Samsung Evo 250GB+500GB Sabrent 1TB Rocket
Display(s) Samsung 1080P \ Toshiba HDTV 1080P
Case HTPC400 \ Thermaltake Armor case ( VE2000BWS ), With Zalman fan controller ( wattage usage ).
Audio Device(s) Yamaha RX-A820 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000, Blue Yeti
Power Supply PC&Power 750w \ Seasonic 750w MKII
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ ROCCAT MK Pro ( modded amber leds )
Benchmark Scores Meh benchmarks.
They're not being weird; they're trying to distract people because they feel threatened by AMD, which no longer has a stock value under $2 thanks to the crony tactics of both Intel and Nvidia, so naturally they're going to do their absolute best. Why?

"Nvidia has the best card at $36,700! So when I spend $170, I'll somehow magically get the best card I can!" - Fanboyism

Now you take that together with "Oh, but it's actually smaller than two eight-pins!", which is intended for Nvidia fanboys to basically say, "I don't care that my room is 20 degrees warmer and my electricity bill doubled! I need those 14 FPS, because twice the FPS of my 144Hz monitor isn't enough for some very objective, unquestionable reason!"

That is the problem with cronies: they know how weaker psychological mindsets work, and they have no qualms about taking advantage of people.

Wouldn't the smaller connector help with airflow? Even more so with how this card is designed.
 
Joined
Nov 11, 2005
Messages
11 (0.00/day)
DisplayPort 1.4a? 2.0 was finalized in 2016...
WTH is with PC companies failing to use the latest standards for PC components?
 
Joined
Sep 17, 2014
Messages
14,700 (6.13/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define C TG
Audio Device(s) Situational :)
Power Supply EVGA G2 750W
Mouse Logitech G502 Protheus Spectrum
Keyboard Lenovo Thinkpad Trackpoint II (Best K/B ever... <3)
Software W10 x64
I would hate to see the 3000 cards in laptops; they would have to be heavily throttled to keep them from blowing up the laptop.
The 2070 is a 175W card, but in laptops it's 115W. The 3070 is rumored to be 320W; will Nvidia stay with the same 115W, or maybe increase it to 150W, which is 2080 Mobile territory?
I guess all laptops will be Max-Q designs grossly underpowered and overpriced with the 3000 cards.


FTFY

It's another reason I really don't get this set of products. Even for a Max-Q, that would require some pretty creative shuffling of tiers and performance. Effectively your mobile x80 is slower than a desktop x60 or something. I mean, how much binning can you do...

DisplayPort 1.4a? 2.0 was finalized in 2016...
WTH is with PC companies failing to use the latest standards for PC components?

Max out that profit, that's what.
 
Joined
Jul 9, 2015
Messages
2,767 (1.31/day)
System Name My all round PC
Processor i5 750
Motherboard ASUS P7P55D-E
Memory 8GB
Video Card(s) Sapphire 380 OC... sold, waiting for Navi
Storage 256GB Samsung SSD + 2Tb + 1.5Tb
Display(s) Samsung 40" A650 TV
Case Thermaltake Chaser mk-I Tower
Power Supply 425w Enermax MODU 82+
Software Windows 10
PhysX did not evaporate. It became ubiquitous to the point...
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
 
Joined
Jan 13, 2011
Messages
210 (0.06/day)
Of not being used in games.
But let's pretend we are talking about CPU PhysX here, to make a lame point, shall we...
Parts of PhysX are in UE4 and Unity; granted, the old CUDA acceleration is pretty much dead, but Nvidia's physics package is still pretty prevalent.
 
Joined
Feb 18, 2009
Messages
1,806 (0.41/day)
Location
Slovenia
Processor Intel Core i7 3820 OC@ 4.0 GHz
Motherboard Asus P9X79
Cooling Noctua NH-L12
Memory Corsair Vengeance 32GB 1333MHz
Video Card(s) Sapphire ATI Radeon RX 480 8GB
Storage Samsung SSD 860 Evo, 850 Evo, WD1002FAEX,4xWD10EZEX,2xWDGreen2TB,WD30EFRX,WD40EFRX
Display(s) Asus PB328Q 32' 1440p@75hz
Case Cooler Master CM Storm Trooper
Power Supply Corsair HX750
Mouse Razer Mamba Elite
Software Win7 - Win10 - Kubuntu
Spoilers? What is this? A game? A movie? No! It's a computer component. You want me to spoil the entire "plot" for you? OK, here it goes: Nvidia releases another ultra-overpriced series of RTX cards, people still buy them, Nvidia laughs as it gets showered in money, end credits.

That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? So I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 years old, I sure liked all the leaks and stuff and had a good time dramatizing... but those were the times when I had no responsibilities; I had all the time in the world, and I could sit for hours and dribble on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see E3 or some game-show event the way it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization, leaking it in an almost criminally disrespectful way, is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and to the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure; it's free speech, go ahead. We can't force a private forum or site to censor, of course not, but it would be a good idea to try to find a compromise and to overhaul the website news sections so the news writers and staff could post spoiler warnings for articles, along with other forum tools to put up warnings when navigating the forum.

Again, I'm not sure how big this leak is in detail, but from the slight glimpse I've had it seems big... I did not actually see it, though; I successfully hid my eyes. :p

If I were Nvidia right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before... it was building up so nicely, and now some idiot messes it all up just three days prior to the reveal. If I were Nvidia, I would go nuclear: I'd practically cancel and delay everything for 30 days, delay the review GPUs, and even recall them, forcefully canceling the shipments and rerouting them while in transit so the reviewers wouldn't get them either. It's the reviewers who are mostly responsible for spreading the leaks; this is how most people get spoiled. That would be the most epic reply, so epic that all the leak-obsessed losers crying about it would be doing so for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think, spiritually, this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone including the investors would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!
 
Joined
Jul 5, 2013
Messages
12,547 (4.42/day)
Location
USA
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
What's wrong with replacing multiple connectors with one single connector?
I welcome it.
Simple: not all cards need that many power lines. For many cards, one 6-pin/8-pin is enough. So why build a connector that has a bunch of power lines when only a few are needed? It's needless and wasteful.

DisplayPort 1.4a? 2.0 was finalized in 2016...
WTH is with PC companies failing to use the latest standards for PC components?
You need to read that Wikipedia article you're quoting a little closer. 2.0 was on the roadmap in 2016, but it wasn't finalized until June 26th, 2019. Additionally, DP 2.0 modulation ICs are expensive and offer marginal benefit to the consumer over 1.4a-based ICs. DP 2.0 is best suited for commercial and industrial applications ATM.
 
Joined
Jul 9, 2015
Messages
2,767 (1.31/day)
System Name My all round PC
Processor i5 750
Motherboard ASUS P7P55D-E
Memory 8GB
Video Card(s) Sapphire 380 OC... sold, waiting for Navi
Storage 256GB Samsung SSD + 2Tb + 1.5Tb
Display(s) Samsung 40" A650 TV
Case Thermaltake Chaser mk-I Tower
Power Supply 425w Enermax MODU 82+
Software Windows 10
Parts of PhysX are in UE4 and Unity; granted, the old CUDA acceleration is pretty much dead, but Nvidia's physics package is still pretty prevalent.
In other words, CPU PhysX; not very relevant to this thread, don't you think?
 
Joined
May 13, 2015
Messages
242 (0.11/day)
Processor AMD Ryzen 3800X / AMD 8350
Motherboard ASRock X570 Phantom Gaming X / Gigabyte 990FXA-UD5 Revision 3.0
Cooling Stock / Corsair H100
Memory 32GB / 24GB
Video Card(s) AMD Radeon 290X (Toggling until 6950XT)
Storage C:\ 1TB SSD, D:\ RAID-1 1TB SSD, 2x4TB-RAID-1
Display(s) Samsung U32E850R
Case be quiet! Dark Base Pro 900 Black rev. 2 / Fractal Design
Power Supply EVGA Supernova 1300G2 / EVGA Supernova 850G+
Mouse Logitech M-U0007
Keyboard Logitech G110 / Logitech G110
Wouldn't the smaller connector help with airflow? Even more so with how this card is designed.

Generally yes, but what will the actual TDP of the 3090 be compared to the 2080 Ti? Keep in mind that Nvidia (according to Tom) might be trying to make their lineup less confusing to consumers (I won't hold my breath). The heat output is going to be enormous, because Nvidia is obsessed with mindshare: there are so many people who will pointlessly throw themselves at a corporate identity simply because it has the best card at a million dollars per card, when all they can afford is $200.

Frankly, I would really like to see AMD release the 6000 series with a setting that lets you choose a TDP target, like a slider for card power, even if it moves in steps of 25 watts or so. I know AMD will still crank the power up. I don't need 300 FPS; I'm still on a 60 Hz screen for now and am interested in 120/144 Hz later this year, but options are limited in the part of the market I'm looking at, since I game maybe 2% of the time I'm at my rig.
 
Joined
Mar 10, 2010
Messages
8,767 (2.16/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R7 3800X@4.350/525/ Intel 8750H
Motherboard Crosshair hero7 @bios 2703/?
Cooling 360EK extreme rad+ 360$EK slim all push, cpu Monoblock Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in two sticks./16Gb
Video Card(s) Sapphire refference Rx vega 64 EK waterblocked/Rtx 2060
Storage Silicon power qlc nvmex3 in raid 0/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd
Display(s) Samsung UAE28"850R 4k freesync, LG 49" 4K 60hz ,Oculus
Case Lianli p0-11 dynamic
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
That's your opinion. Yes, it is a spoiler to me, and probably to other people who have been tolerating this for so long. We can't browse any of the tech sites normally because of this drama and the stupid impatience fetish you people are into. The media will report for those who are interested, sure, but what about those who do not wish to be spoiled? I have to avoid tech sites for weeks every time such reveals are heating up.

When I was 17 years old I sure liked all the leaks and had a good time dramatizing, but those were the times when I had no responsibilities. I had all the time in the world; I could sit for hours and dribble on about what new tech would be in the next console. Been there, done that, but I got fed up with the spoilers. It feels so much better to see E3 or some game-show event how it's meant to be seen. Some cheap-ass website that is all about pre-launch rumor monetization leaking it in an almost criminally disrespectful way is the biggest slap in the face ever, not only to the fans but also to the very hard-working employees who prepared those events and the speakers who prepared those speeches. You are ruining so many experiences with this stupid rumor-leak obsession.

But that's my opinion. Calling it stupid, sure, it's free speech, go ahead. We can't force a private forum or site to censor, of course not, but it would be a good idea to try to find a compromise and overhaul the website news sections so the news writers and staff could post spoiler warnings on articles, with forum tools to put up similar warnings when navigating the forum.

Again, I'm not sure how big this leak is in detail, but from the little I've glimpsed it seems big. I did not actually see it though; I successfully hid my eyes from it :p

If I were Nvidia right now, after SO MUCH HYPE for MONTHS, where sites wrote they had never seen such hype for a GPU before, and it was building up so nicely, and now some idiot messes it all up just three days before the reveal, I would go nuclear. I'd practically cancel and delay everything for 30 days, delay the review GPUs, and even recall them, forcefully canceling the shipments and rerouting them in transit so the reviewers wouldn't get them either; it's the reviewers who are mostly responsible for spreading the leaks, and this is how most people get spoiled. That would be the most epic reply, so epic that all the Leak-Obsession-Disorder losers crying about it would be doing it for nothing. Industry analysts and pundits would be impressed by the boldness of the move, and I think spiritually this would in fact have helped the company's image in the long run and with the key people. Yes, the 15-year-old gamer-head losers would hate it, but it wouldn't matter, because everyone including the investors would know everyone would still buy it after the delay. No sales for 30 days might hurt them a bit, sure, but if I were the CEO I'd take it; my pride stands first!
Were you not around for the last ten years? Every GPU release had a hype train.
And if not new, unreleased tech, what then do we discuss? Do we just compare benchmarks and fix issues? Some like a less dry discussion.
You can simply not read them.
 