
NVIDIA GeForce RTX 3080 Ti Founders Edition

Joined
Apr 10, 2020
Messages
480 (0.33/day)
Pascal was the last great generation from NVIDIA: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 50% less power, the 1080 was great for 1440p, and lastly, how can you forget the 1080 Ti, a card so good it blew the industry away with its capabilities, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, NVIDIA has made no real improvements in performance and power efficiency, just focusing on ray tracing and DLSS. RTX 3000 is even worse: abysmal power efficiency (up to 500 W on the RTX 3090!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... what is going on at NVIDIA?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, NVIDIA went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 FPS gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 watts, while the 2060 Super gave you GTX 1080 performance at 190 watts, and the GTX 1080 was a 180-watt GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE!
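
Taking those numbers at face value, here's a quick back-of-the-envelope check (a Python sketch using only the wattages quoted above, and treating the three cards as performance-equal, as claimed):

```python
# Perf/W sanity check for the chain quoted above. Assumption: the three
# cards deliver roughly equal performance (normalized to 1.0), as claimed.
cards = {
    "GTX 1080 (2016)":       (1.0, 180),  # (relative perf, board power in W)
    "RTX 2060 Super (2019)": (1.0, 190),
    "RTX 3060 (2021)":       (1.0, 170),
}

base_perf, base_watts = cards["GTX 1080 (2016)"]
base_eff = base_perf / base_watts

for name, (perf, watts) in cards.items():
    eff = perf / watts
    print(f"{name}: {eff / base_eff:.2f}x the perf/W of the GTX 1080")
# -> 1.00x, 0.95x, 1.06x: roughly flat across five years
```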

It's maybe not all bad... According to RedGamingTech's leaks, Intel's Xe DG2 dGPU program is shaping up quite well (3070-3080 level of performance) and, more importantly, Raja has allegedly got Intel's support to primarily target the $200-300 mainstream market. Plus, with no AIB partners and strict distributor and retailer pricing policies in place, just like with its CPUs, the 2022 dGPU market might look much, much better if the mining craze ends. Granted, RDNA3 and next-gen Ampere will be a tier above Intel's offerings performance-wise, but hey, I'll gladly buy a GPU with 3070-3080 level performance for 300 bucks if it has half-decent drivers, instead of what Ngreedia & AMD will try to charge for their new dGPU lineups.
 
Joined
Jan 20, 2019
Messages
1,272 (0.66/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
The only way we can solve the problem is going back to 360p gaming and sticking with integrated graphics. Just sit closer to the screen so it feels like a 27"/34" panel... for the wide-screeners, just squint your eyes. As long as we have "huge" graphical-fidelity ambitions, we will always be robbed by profiteering Ngreed'ism/co. I even regretted purchasing a 1080 Ti initially for £600/£700; once up and running, it didn't feel like my money's worth. Although it did turn out to be a nice investment eventually.
 
Joined
Dec 26, 2006
Messages
3,529 (0.56/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
I wonder if W1zzard has to get additional insurance with all these cards in his studio????
 
Joined
Jul 19, 2016
Messages
476 (0.17/day)
What's the point? Nvidia could package up toenail shavings for a couple thousand and countless idiots would buy it, jacking up the prices for everyone else (for graphics cards, not toenails, to be clear).

I liked PC gaming before the middle-class kids and casuals got interested in it about 5 or 6 years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
 
Joined
Oct 22, 2014
Messages
13,210 (3.80/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
What's the point? Nvidia could package up toenail shavings for a couple thousand and countless idiots would buy it, jacking up the prices for everyone else (for graphics cards, not toenails, to be clear).

I liked PC gaming before the middle-class kids and casuals got interested in it about 5 or 6 years ago. Now they all want the best graphics cards, so they save up a whole month's worth of their McDonald's counter salary to buy one. These people don't have bills or kids.
Wow, so many stupid assumptions.
 
Joined
Apr 23, 2017
Messages
21 (0.01/day)
Processor i7-13900K
Motherboard ROG Maximus Z690
Video Card(s) RTX 3090
@W1zzard

Thanks for the review!

Is there something wrong with the Cyberpunk 2077 ray-tracing results? The RX 6000 series non-RT vs. RT results are the same...

[Attachment: Cyberpunk 2077 results chart from the review]
 

Deleted member 177333

Guest
Why is everybody so concerned with NVIDIA's MSRP? It's just a meaningless number. They probably didn't want to lower the x80 Ti MSRP compared to the 2080 Ti, so they picked $1,200 to not look bad when they announce the 4080 Ti.

You will not be able to buy the 3080 Ti at that price, probably ever. As much as that sucks for all of us, that's what will happen. Look at what the card offers compared to what's available and at what price, and make a decision based on that. I tried to go through some options and examples in my conclusion.

Eh, the 2080 Ti was significantly overpriced at $1,200; its performance wasn't all that impressive. That's one of the key reasons I waited and picked one up used on eBay for just a bit above half its MSRP (with a pre-installed waterblock to boot).

As you mentioned in a later post, though, people do need to stop buying this stuff at these price points or it won't ever go down.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
PC gaming isn't going away. It just needs a temporary adjustment in our thinking. People want to downplay the conditions wrought by the pandemic, and we have a hell of a mess to clean up going forward, but it will happen.
 
Joined
Dec 25, 2020
Messages
4,620 (3.80/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
You're giving it a Thumbs up/Pro for being 8nm? As opposed to what?

Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that, for all we know from the rumor mills, they are coming with power limits similar to the vanilla 3070 variety. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more out of it.

[Screenshot: GPU-Z sensors showing the MVDDC rail at 115 W during 3DMark Time Spy Extreme]

I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, downright elegant compared to using hot and hungry PAM4 memory.
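
To put that screenshot into numbers, here's a quick sketch (figures as quoted above; the full-load figure assumes draw scales linearly with memory controller load, which is an assumption, not a measurement):

```python
# Share of the board power budget eaten by the memory subsystem (MVDDC),
# using the figures quoted above from the 3DMark Time Spy Extreme run.
mvddc_w = 115         # measured MVDDC draw at 61% memory controller load
board_budget_w = 375  # the card's total board power limit
mc_load = 0.61

print(f"MVDDC share of board budget: {mvddc_w / board_budget_w:.1%}")  # ~30.7%

# Crude extrapolation, assuming linear scaling with controller load:
print(f"Naive estimate at 100% load: {mvddc_w / mc_load:.0f} W")       # ~189 W
```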
 
Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4k60/1440p144 at ALL and can barely do 4k30/1440p60 (1080p60 for the PS5, since it can't even do 1440p LMFAO), in a totally closed environment with no competition, stuck with joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, cause it's garbage.

Also, daily reminder that the 8800 Ultra launched for the equivalent of $1,100 in 2007. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32" 32M1N5800A (4K144), LG 32" (4K60) | Gigabyte G32QC (2K165) | Philips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
On the circuit board analysis page it would be nice to have some notes explaining the differences between the 3080 FE, 3080 Ti FE, and 3090 FE - the chokes, VRMs, memory modules.
Since the cards are so similar, something as simple as "the 3080 has X memory modules, the Ti has two more, and the 90 has them doubled onto the back of the PCB" would be really informative to those reading this first, without the background knowledge.

According to the HWiNFO developer, GDDR6X modules are rated to throttle at around 110°C. They're toasty and consume a lot of power; any 3090 owner will attest to that :D
*Begins crying*
*Uses tears to fill my EK block and watercool the VRAM with an active backplate*


Yeah, it's a problem, and something the Ti should have resolved. They clearly just reused the existing cooling setup with zero changes.
 
Joined
Sep 25, 2007
Messages
5,965 (0.98/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that, for all we know from the rumor mills, they are coming with power limits similar to the vanilla 3070 variety. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more out of it.

[Screenshot: GPU-Z sensors showing the MVDDC rail at 115 W during 3DMark Time Spy Extreme]

I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, downright elegant compared to using hot and hungry PAM4 memory.

Agree 100%. Even on my 3080 I sometimes see the memory hogging power. You can undervolt the core by a good amount on Ampere and get power consumption down, but the memory will still draw so much that you can sometimes see it using more power than the core in games that aren't core-heavy. AMD's solution was, as you said, pretty great in that regard.
 
Joined
Jul 13, 2016
Messages
2,839 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I love the peanut gallery throwing out anecdotes about how PC gaming is totally dying and everyone is going to go buy consoles. Yeah, I can't get a GPU right now, so I'm gonna drop high-end GPU money on a console that can't do 4k60/1440p144 at ALL and can barely do 4k30/1440p60 (1080p60 for the PS5, since it can't even do 1440p LMFAO), in a totally closed environment with no competition, stuck with joystick controls. :rolleyes:

Whatever you're smoking to come up with that argument, you can keep it, cause it's garbage.

Also, daily reminder that the 8800 Ultra launched for the equivalent of $1,100 in 2007. Prices go up, prices go down. :roll: LOLCALMDOWN :roll:

You misunderstood the argument prior commenters were making.

The problem is the pricing increases of GPUs in general and the complete lack of any improvements in the budget market, not temporary pricing during the pandemic. The pandemic is a separate problem that inflates prices across the board.

The pandemic is not forever; what people are worried about is that even when it does pass, that still leaves little room for budget options, and it won't change the fact that Nvidia is still charging $1,200 for this card. Consoles, on the other hand, will return to their MSRP of $500.

Most people are aware of the drawbacks of consoles; you don't have to point them out. That said, at $500, if Nvidia/AMD completely fail to address the budget market, you can't really blame people for considering a console when Nvidia/AMD aren't even providing products most people can afford. PC elitists seem to forget that the PC market is held up mostly by the budget and midrange segments, where the vast majority of gamers reside. No amount of "well, PC can do this..." will change the price. If a person can't afford it, they can't buy it; if a person thinks it isn't worth it, they will spend their money elsewhere.

Speaking of the 8800 Ultra:

"The 8800 Ultra, retailing at a higher price,[clarification needed] is identical to the GTX architecturally, but features higher clocked shaders, core and memory. Nvidia later[when?] told the media the 8800 Ultra was a new stepping,[clarification needed] creating less heat[clarification needed] therefore clocking higher. Originally retailing from $800 to $1000, most users thought the card to be a poor value, offering only 10% more performance than the GTX but costing hundreds of dollars more. Prices dropped to as low as $200 before being discontinued on January 23, 2008."


At the time that card released, it was roundly criticized by the press for being extremely poor value, and that was for a 10% gain at a 30% price increase. The 3080 Ti is a 7% performance increase for 70% more money. I'm glad you brought it up, because it objectively shows what piss-poor value the 3080 Ti is, even compared to more extreme examples. Mind you, that was still a single overpriced card; Nvidia has been increasing the ASP across its entire GPU stack, not just one model.
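
To make the comparison concrete, here's the perf-per-dollar math as a quick sketch, using the percentages above:

```python
# Perf-per-dollar of each halo card relative to the card below it,
# using the figures cited above.
def relative_value(perf_gain: float, price_gain: float) -> float:
    """New card's perf/$ as a fraction of the cheaper card's perf/$."""
    return (1 + perf_gain) / (1 + price_gain)

for name, perf, price in [("8800 Ultra vs 8800 GTX", 0.10, 0.30),
                          ("3080 Ti vs 3080", 0.07, 0.70)]:
    v = relative_value(perf, price)
    print(f"{name}: {v:.2f}x the perf/$ ({1 - v:.0%} worse value)")
# -> 8800 Ultra: 0.85x (15% worse); 3080 Ti: 0.63x (37% worse)
```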
 
Joined
Aug 11, 2020
Messages
245 (0.18/day)
Location
2nd Earth
Processor Ryzen 5700X
Motherboard Gigabyte AX-370 Gaming 5, BIOS F51h
Cooling MSI Core Frozr L
Memory 32GB 3200MHz CL16
Video Card(s) MSI GTX 1080 Ti Trio
Storage Crucial MX300 525GB + Samsung 970 Evo 1TB + 3TB 7.2k + 4TB 5.4k
Display(s) LG 34UC99 3440x1440 75Hz + LG 24MP88HM
Case Phanteks Enthoo Evolv ATX TG Galaxy Silver
Audio Device(s) Edifier XM6PF 2.1
Power Supply EVGA Supernova 750 G3
Mouse Steelseries Rival 3
Keyboard Razer Blackwidow Lite Stormtrooper Edition
Pascal was the last great generation from NVIDIA: the 1060 offered unbelievable value for 1080p gaming, the 1070 gave you 980 Ti performance at 50% less power, the 1080 was great for 1440p, and lastly, how can you forget the 1080 Ti, a card so good it blew the industry away with its capabilities, armed with 11 GB of GDDR5X.

Ever since the RTX 2000 series, NVIDIA has made no real improvements in performance and power efficiency, just focusing on ray tracing and DLSS. RTX 3000 is even worse: abysmal power efficiency (up to 500 W on the RTX 3090!), lackluster VRAM (aside from the 3060 and 3090), overheating GDDR6X memory, no stock, and pointless SKUs like this 3080 Ti... what is going on at NVIDIA?!

Just when high-resolution, high-refresh-rate gaming started to become a reality for everyone, NVIDIA went full L with the RTX 2000 series. No one wants ray tracing; we want 4K 144 FPS gaming. Look how the RTX 3060 promises RTX 2060 Super performance at 170 watts, while the 2060 Super gave you GTX 1080 performance at 190 watts, and the GTX 1080 was a 180-watt GPU! NO REAL POWER EFFICIENCY IMPROVEMENTS SINCE 2016, AND THEY CHARGE YOU MORE!
I can still remember clearly when consumers and the media were amazed that the GTX 1080 used only a single 8-pin connector to deliver flagship performance. Even the GTX 1080 Ti with its 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card, and Nvidia will not be able to beat it with the current rising MSRPs. The xx60 will reach the xx80's price in the near future, as @RedelZaVedno said here:
It's not meaningless in the long run... just look at the direction MSRPs are headed: GTX 680 = $499 / 780 Ti = $699 / 980 Ti = $649 / 1080 Ti = $699 / 2080 Ti = $999 / 3080 Ti = $1,199... a 140% price hike in 9 years (vs. ~18% inflation over the period). Elevated MSRPs are here to stay even after the mining craze ends.
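
Taking that list at face value, the math is easy to check (a sketch using only the MSRPs and the inflation figure quoted above):

```python
# Flagship MSRP progression quoted above: nominal vs. inflation-adjusted hike.
first_msrp = 499  # GTX 680, 2012
last_msrp = 1199  # RTX 3080 Ti, 2021
inflation = 0.18  # cumulative inflation over the period, as quoted

nominal_hike = last_msrp / first_msrp - 1
real_hike = last_msrp / (first_msrp * (1 + inflation)) - 1
print(f"Nominal price hike: {nominal_hike:.0%}")           # ~140%
print(f"Inflation-adjusted price hike: {real_hike:.0%}")   # ~104%
```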
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.57/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
Joined
Nov 11, 2016
Messages
3,065 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
I can still remember clearly when consumers and the media were amazed that the GTX 1080 used only a single 8-pin connector to deliver flagship performance. Even the GTX 1080 Ti with its 8+6-pin was considered power hungry at the time. I thought we were heading in a good direction with the 20, 30, and 40 series and beyond in terms of power efficiency; apparently not :(

The 1060 was a phenomenal card, and Nvidia will not be able to beat it with the current rising MSRPs. The xx60 will reach the xx80's price in the near future, as @RedelZaVedno said here:

Who cares about maximum power consumption when you can tweak the power limits to your liking? Reducing power consumption will increase efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia/AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you had a 6-phase VRM with 20 W of power loss at 150 W TGP before; now you get a 10+ phase VRM with only 10 W of power loss at 150 W TGP.
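
As a toy calculation of that claim (the loss figures are the hypothetical ones above):

```python
# Toy VRM efficiency comparison using the hypothetical figures above.
def vrm_efficiency(tgp_w: float, loss_w: float) -> float:
    """Fraction of input power delivered to the GPU instead of lost as heat."""
    return tgp_w / (tgp_w + loss_w)

old = vrm_efficiency(150, 20)  # 6-phase design
new = vrm_efficiency(150, 10)  # 10+ phase design
print(f"6-phase: {old:.1%}, 10+ phase: {new:.1%}")  # ~88.2% vs ~93.8%
```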

The 1660 Super was a super fine GPU at $230.
 
Joined
Nov 19, 2019
Messages
103 (0.06/day)
Great review as usual.

Disappointing that $500 extra doesn't even get you better thermal pads than the 3080. That card at MSRP was exciting; this one, not so much. Same number of VRM phases as well, although they repositioned one? Looks like very, very limited availability for the FE too. I guess that was to be expected, but this time around it seems even lower, with Best Buy only selling in person at a limited number of stores.
 
Joined
Apr 30, 2011
Messages
2,652 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
And here is why the use of a 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads:
[Attachment: Hardware Unboxed benchmark chart showing a 13% gap between the 3080 and 6900 XT]

Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to the 13% in the HU review.
 
Joined
Jul 13, 2016
Messages
2,839 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Who cares about maximum power consumption when you can tweak the power limits to your liking? Reducing power consumption will increase efficiency, as demonstrated by the mobile GPUs.

In fact, you should be thankful that Nvidia/AMD keep increasing the maximum power limits on their desktop GPUs, because they have to design better VRMs to accommodate the higher limits, and a better VRM means higher VRM efficiency. Say you had a 6-phase VRM with 20 W of power loss at 150 W TGP before; now you get a 10+ phase VRM with only 10 W of power loss at 150 W TGP.

The 1660 Super was a super fine GPU at $230.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like the AMD users who claimed Vega was power efficient once you undervolt. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box. Customers should not have to fiddle with products after the fact; that's for enthusiasts, if they want to spend the extra effort.
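
For those enthusiasts, here's a minimal sketch of what that fiddling can look like programmatically, using NVIDIA's NVML through the pynvml bindings (the 250 W cap is an arbitrary example, and setting the limit typically requires admin rights):

```python
# Read and (optionally) lower a GPU's power limit via NVML.
# pip install pynvml; assumes an NVIDIA GPU at index 0.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# NVML reports power values in milliwatts.
current = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Current limit: {current / 1000:.0f} W "
      f"(allowed range {lo / 1000:.0f}-{hi / 1000:.0f} W)")

target = 250_000  # arbitrary example: cap the card at 250 W
if lo <= target <= hi:
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target)  # needs admin

pynvml.nvmlShutdown()
```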
 
Joined
Dec 14, 2011
Messages
944 (0.21/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS RTX 3070 Ti TUF Gaming OC Edition
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 MK.2 Low-Profile Rapidfire
Software Microsoft Windows 11 Pro (64-bit)
Is anyone here on this forum a reseller who can actually get these cards at MSRP?
 
Joined
Mar 28, 2020
Messages
1,644 (1.10/day)
Personal opinion, but I believe the Samsung 8 nm process isn't the reason these GPUs are so "power inefficient" (as in, hungry). This is most noticeable on the RTX 3070; I would go as far as saying it's quite lean on power for the awesome amount of performance it provides, and I'm quite eager to see w1zz's review and the impact of GDDR6X on the 3070 Ti's power consumption and frametime stability, given that, for all we know from the rumor mills, they are coming with power limits similar to the vanilla 3070 variety. Being on this node is also positive for yield, as it doesn't have to compete with the numerous other products and orders that require TSMC 7 nm capacity, like AMD's entire product stack. "nm" is just marketing anyway; the actual transistor pitch isn't that small.

The biggest issue to me, so far, is the GDDR6X: it consumes an absolutely insane amount of power. This was measured in the middle of a 3DMark Time Spy Extreme run. Look at this: even at 61% memory controller load, the MVDDC (memory subsystem) is pulling 115 W(!) of the 375 W budget my card has... and there are games and workloads that demand more out of it.

[Screenshot: GPU-Z sensors showing the MVDDC rail at 115 W during 3DMark Time Spy Extreme]

I must say, AMD's Infinity Cache solution to the bandwidth problem is simply ingenious, downright elegant compared to using hot and hungry PAM4 memory.
I feel GDDR6X is a stopgap until a faster GDDR standard arrives, much like GDDR5X, which never had a future beyond Nvidia's Pascal. As a result of pushing such high clock speeds compared to GDDR6, a lot of power is required. I wasn't sure GDDR6X used that much power until I noticed the TGP of the RTX 3070 vs. the 3070 Ti, and in that case it's only got 8x 1 GB GDDR6X modules. When you have 10, 12, or 24 of these hot and power-hungry modules on board, power requirements increase drastically. And I do agree that AMD's Infinity Cache is a great way to get around this power requirement and still achieve better or comparable memory bandwidth.

As for Samsung's 8 nm, while it is certainly more efficient than what it's replacing, I don't necessarily think it's good. It's been shown that Samsung's 7 nm is not as good as TSMC's 7 nm, not to mention this supposed 8 nm is basically Samsung's refined 10 nm. Most of these RTX 3000 cards run at a fairly conservative clock speed, i.e. around 1.8 GHz, to keep power consumption in check. You can push further into the 1.9 GHz range, but that generally requires a +15% power limit. The saving grace here is probably Nvidia's Ampere architecture with ample memory bandwidth, and less so the 8 nm Samsung node, in my opinion.

The vast majority of consumers aren't going to tweak power limits. IMO it's frankly annoying to have another program running in the background and another source of potential issues.

That's not a problem customers should have to solve, either. This is just like the AMD users who claimed Vega was power efficient once you undervolt. That's great and all, but it doesn't mean squat to the vast majority of users. Companies should ship products that hit their target markets out of the box. Customers should not have to fiddle with products after the fact; that's for enthusiasts, if they want to spend the extra effort.
Companies ship products that work. Therefore, they ship with settings they deem "safe", to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the savvy people who will figure out something is not right and will try to fix it, i.e. fiddle with the power limits, etc. People who are not savvy will probably live with it, since while it runs hot, it works.
 
Joined
Feb 23, 2019
Messages
5,631 (2.98/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
And here is why the use of a 5800X is a bottleneck for top-of-the-line GPU reviews nowadays, since some games properly utilise more threads:
[Attachment: Hardware Unboxed benchmark chart showing a 13% gap between the 3080 and 6900 XT]
Surely our @W1zzard tested somewhere else in the game, but the difference between the 3080 and 6900 XT in his review is 0%, compared to the 13% in the HU review.
Dude, watch any YT video with RivaTuner running on a 5950X and 3090. An engine designed around Jaguar cores isn't going to utilize 32 threads:
[Screenshot: RivaTuner overlay on a 5950X + 3090 showing low per-thread CPU utilization]
 
Joined
Jul 13, 2016
Messages
2,839 (1.00/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Companies ship products that work. Therefore, they ship with settings they deem "safe", to make sure the product works according to spec. They can't possibly test every chip that comes in and provide a custom setting each time.

In my opinion, it's the savvy people who will figure out something is not right and will try to fix it, i.e. fiddle with the power limits, etc. People who are not savvy will probably live with it, since while it runs hot, it works.

This is simply not true, given that both AMD (for CPUs and GPUs) and Nvidia have dynamic boost features that give the end user extra performance depending on the specific silicon's quality and temperature. AMD's dynamic boosting for its CPUs in particular does an excellent job, to the point where manual tuning isn't needed and can actually yield less performance than the automatic boosting system.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
The 3080 Ti and 3090 have their place... for 4K-5K gaming.

I will keep my 3080 till the 4070/4080 launch in late 2022, though, or wait for the 2023 refreshes if pricing and availability have not normalized by 2H 2022.

Or maybe I will consider a Radeon 7800 XT/8800 XT, if AMD can keep up the pace.
 