
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
Full chips should be 5376 shaders/384-bit and 3584 shaders/256-bit; in reality, those shaders are some sort of hyperthreading, dual shader operations per clock.

We should evaluate performance per transistor. 28,000 vs. 18,600 should result in roughly 50% better performance, provided the memory bandwidth also grows by 50%.
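For illustration, that ratio as a quick back-of-the-envelope script (using the transistor counts quoted above, in millions; a sanity check, not a benchmark):

```python
# Performance-per-transistor sanity check using the counts quoted above
# (in millions); purely illustrative.
ga102_transistors = 28_000  # Ampere flagship, as quoted
tu102_transistors = 18_600  # Turing flagship, as quoted

ratio = ga102_transistors / tu102_transistors
print(f"{ratio:.2f}x the transistors (~{(ratio - 1) * 100:.0f}% more)")
# -> 1.51x the transistors (~51% more)
```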
 
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Secondly, unless having the full chip delivers noticeably more performance, what value does it have to you if your chip is fully enabled? The Vega 64 Water Cooled edition barely outperformed the V64 or the V56, yet cost 2x more. That is just silly. I mean, you're welcome to your delusions, but don't go pushing them on others here. Cut-down chips are a fantastic way of getting better yields out of intrinsically imperfect manufacturing methods, allowing for far better products to reach end users.
If I may add, cut-down chips may actually even be "better" in some ways. The GTX 970, for example, achieved higher performance per TFLOP than the GTX 980, because it effectively had more scheduling resources and cache per core. I believe we saw something similar with Vega 56 vs. Vega 64 too, where Vega 56 did better than it "should" when overclocked vs. its bigger brother.

LOL. And you actually believe what Jensen said in that reveal?
You are the one challenging the performance figures, so you are the one who has to prove it. Or should I just take your word for it? :rolleyes:

Yes, I would call Ampere catastrophically inefficient garbage.
You are just going to keep digging yourself deeper, aren't you? ;)
Here is another shovel for you:
So if Ampere is catastrophically inefficient, what does that make Polaris, Vega, Vega 20 and Navi then?
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
If I may add, cut-down chips may actually even be "better" in some ways. The GTX 970, for example, achieved higher performance per TFLOP than the GTX 980, because it effectively had more scheduling resources and cache per core. I believe we saw something similar with Vega 56 vs. Vega 64 too, where Vega 56 did better than it "should" when overclocked vs. its bigger brother.


You are the one challenging the performance figures, so you are the one who has to prove it. Or should I just take your word for it? :rolleyes:


You are just going to keep digging yourself deeper, aren't you? ;)
Here is another shovel for you:
So if Ampere is catastrophically inefficient, what does that make Polaris, Vega, Vega 20 and Navi then?
Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit. Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are gonna believe what they want to believe, and buy anything at any price. Trying to use logic and real-life available data with fanboys is like trying to have a logical conversation with a religious fanatic.
 
Joined
May 2, 2017
Messages
7,762 (3.04/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit. Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are gonna believe what they want to believe, and buy anything at any price. Trying to use logic and real-life available data with fanboys is like trying to have a logical conversation with a religious fanatic.
You're part of the problem here, not part of the solution. Saying "Nvidia is full of shit" doesn't do anything to promote the view that one should always wait for independent reviews. All you're doing is riling people up and inciting unnecessary conflict. I don't trust Nvidia's cherry-picked performance numbers either, but I still find myself arguing against you simply due to how you are presenting your "points".
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
You're part of the problem here, not part of the solution. Saying "Nvidia is full of shit" doesn't do anything to promote the view that one should always wait for independent reviews. All you're doing is riling people up and inciting unnecessary conflict. I don't trust Nvidia's cherry-picked performance numbers either, but I still find myself arguing against you simply due to how you are presenting your "points".
I'm sorry. I'll try to never again use simple math if that offends you. :) I'm done with this topic. Peace out. :rockout:
OH. Quick edit: I will be picking up an RTX 3080, because I have connections over at Gainward and I can get one at factory price. :)
 
Joined
Nov 11, 2016
Messages
3,067 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
Don't mistake me for an AMD fanboy. I owned a GTX 1080 Ti and an RTX 2060 Super. I'm just saying that historically Nvidia is full of shit. Intel and AMD as well. I used simple math from available data. Independent reviews will tell the truth, as always. But fanboys are gonna believe what they want to believe, and buy anything at any price. Trying to use logic and real-life available data with fanboys is like trying to have a logical conversation with a religious fanatic.

Kinda funny, I think you are full of BS too. Where do you find numbers that say the 2080 Ti is 10-15% faster than the 1080 Ti? Your ass?
 
Joined
May 2, 2017
Messages
7,762 (3.04/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I'm sorry. I'll try to never again use simple math if that offends you. :) I'm done with this topic. Peace out. :rockout:
OH. Quick edit: I will be picking up an RTX 3080, because I have connections over at Gainward and I can get one at factory price. :)
I will admit to losing interest in anything math-related after middle school, but I certainly never came across any type of math that involved calling anyone "full of shit". That might just be me though. Besides that, trying to calculate gaming performance from the on-paper specs of a new architecture with major changes (such as the doubled ALU count, with who knows what other changes, likely creating bottlenecks that were previously not there and dropping perf/TFLOP) is arguably more dubious than Nvidia's claims. Their numbers are cherry-picked, but at least they have a basis in some tiny corner of reality. Purely theoretical calculations of performance add very little to the discussion, underscoring the fact that we should all be waiting for a wide selection of third-party reviews before making any kind of decision. And, to be honest, we should also be waiting to see what the competition comes up with. RDNA 2 looks very promising, after all.
 
Joined
Sep 3, 2020
Messages
10 (0.01/day)
Processor i5 13600KF
Motherboard Z790 Steel Legend WiFi
Cooling Arctic Freezer 2 420mm
Memory 32GB DDR5 6000mhz
Video Card(s) RTX 3080 FTW3 Ultra EVGA
Storage S70 Blade 1TB
Display(s) S2721DGF
Case Corsair 7000D Airflow
Power Supply Seasonic 1300w Gold
Mouse Logitech G Pro X Superlight
Coming from an RTX 2080, the 3080 seems enticing.
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
Ah, yes, the true sign of a superior intellect: ad hominem attacks, insults and pejoratives. Reported. If you can't present your arguments in at least a somewhat polite way, maybe you shouldn't be posting on forums?
Let's not insult each other here. People disagree with my opinions and I disagree with theirs sometimes. But we can have a polite discussion. :lovetpu:
 

95Viper

Super Moderator
Staff member
Joined
Oct 12, 2008
Messages
12,679 (2.23/day)
Hi there, everyone!
Keep it on topic.
Quit the arguing and try to keep the discussion on the technical side.
No name calling allowed and enough BS has been thrown and discussed.
As the Guidelines state:
Be polite and constructive; if you have nothing nice to say, then don't say anything at all.

Thank You and Have a Great Day
 

Dux

Joined
May 17, 2016
Messages
511 (0.18/day)
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From those screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong. Faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28%: 250W vs. 320W. The performance difference is jumping around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?
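For what it's worth, here is that perf-per-watt arithmetic as a quick script, using only the numbers above (one game, vendor footage; a rough sanity check, nothing more):

```python
# Rough perf/W comparison from the figures in this post (illustrative only).
perf_ratio = 1.40     # ~40% faster in the side-by-side DOOM Eternal scenes
tdp_2080ti = 250      # W
tdp_3080 = 320        # W

power_ratio = tdp_3080 / tdp_2080ti            # 1.28x more board power
perf_per_watt = perf_ratio / power_ratio       # gain at full rated TDP
print(f"Power ratio: {power_ratio:.2f}x")
print(f"Perf/W gain: {perf_per_watt:.2f}x (~{(perf_per_watt - 1) * 100:.0f}%)")
# -> Power ratio: 1.28x, Perf/W gain: 1.09x (~9%), nowhere near 1.9x
```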
 
Joined
May 2, 2017
Messages
7,762 (3.04/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From those screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong. Faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28%: 250W vs. 320W. The performance difference is jumping around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?
That's pretty interesting indeed. Benchmarks will be well worth the read when they arrive.
 
Joined
Nov 20, 2012
Messages
422 (0.10/day)
Location
Hungary
System Name masina
Processor AMD Ryzen 5 3600
Motherboard ASUS TUF B550M
Cooling Scythe Kabuto 3 + Arctic BioniX P120 fan
Memory 16GB (2x8) DDR4-3200 CL16 Crucial Ballistix
Video Card(s) Radeon Pro WX 2100 2GB
Storage 500GB Crucial MX500, 640GB WD Black
Display(s) AOC C24G1
Case SilentiumPC AT6V
Power Supply Seasonic Focus GX 650W
Mouse Logitech G203
Keyboard Cooler Master MasterKeys L PBT
Software Win 10 Pro
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From those screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong. Faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28%: 250W vs. 320W. The performance difference is jumping around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?

Most likely. A cherry-picked RT game with DLSS, or prosumer software. If you look at the perf/W graph they provided, it says 1.9X, but in reality isn't that more like 1.5X?
Realistically, Ampere is 1.35-1.40X faster than Turing in normal rasterization performance.

The problem is that most RT titles currently available are bound by the DXR 1.0 implementation, so they might not be able to show the full RT potential of Ampere over Turing, even though it will still be considerably faster in these titles.
 
Joined
Nov 11, 2016
Messages
3,067 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
Well, here's a video comparing the RTX 2080 Ti and RTX 3080 in DOOM Eternal. From those screens where they showed identical scenes side by side, I calculated an average 40% difference. So my previous math was wrong; I had calculated 30-35%. Well, maybe it wasn't wrong. Faster memory gives more of an advantage at higher resolutions (GDDR6 on the 2080 Ti vs. GDDR6X on the 3080), so the difference could be smaller at lower resolutions.
So this is based on this one game. The TDP difference between the RTX 2080 Ti and 3080 is 28%: 250W vs. 320W. The performance difference is jumping around 40%. So how can Nvidia claim they achieved 1.9X (+90%) performance per watt? Were they referring to RTX performance?

Yeah, the 1.9x perf/Watt figure is derived from here:



Basically, the 2080 Super gets 60 fps at 250W TGP while the 3080 can get 60 fps with only 130W, meaning the 2080 Super uses 1.9X more watts per FPS.
Which is not correct; it should have been FPS divided by power use (perf/Watt), by which measure Ampere is somewhere between 35-45% more efficient (the higher the power usage, the more efficient Ampere becomes relative to Turing; Ampere seems to scale much better with voltage than Turing).
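A small sketch of the two readings of that chart, using the 60 fps point described above (figures are the ones quoted; illustrative only):

```python
# Two ways to read the iso-performance point described above (illustrative).
fps = 60
watts_2080_super = 250   # power needed for 60 fps, per the chart as described
watts_3080 = 130         # power needed for 60 fps, per the chart as described

# Reading used for the headline figure: power draw at the same frame rate.
print(f"Same {fps} fps using {watts_2080_super / watts_3080:.2f}x less power")  # ~1.92x

# Comparing each card at its own full-power operating point instead gives
# the ~1.35-1.45x efficiency gain mentioned above (needs full-load FPS data).
```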
 
Joined
Jul 19, 2016
Messages
476 (0.17/day)
LOL. And you actually believe what Jensen said in that reveal? He also said 2 years ago that the RTX 2080 Ti gives 50% better performance than the 1080 Ti. In reality, it turned out to be more like 10-15%. So it's not hard to beat your previous (Turing) architecture when that brought minimal improvements to performance, RTX excluded of course. The RTX 2080 achieves around 45 fps in Shadow of the Tomb Raider at 4K with the same settings Digital Foundry used in their video. They say the RTX 3080 is 80% faster, so simple math says that is around 80 fps. The RTX 2080 Ti achieves around 60 fps (source: Guru3D). So the performance difference between the RTX 2080 Ti and the 3080 is 30-35%, based on Tomb Raider. RTX 2080 Ti: 12nm manufacturing process, 250W TDP, 13.5 TFLOPS. RTX 3080: 8nm manufacturing process, 320W TDP, 30 TFLOPS. And only a 30-35% performance difference from this power-hungry GPU with double the number of CUDA cores. Yes, I would call Ampere catastrophically inefficient garbage.
Also, where the F is Jensen pulling that data about Ampere having 1.9 times the performance per watt of Turing? It would mean that an RTX 3080 at 250W TDP should have 90% higher performance than the RTX 2080 Ti. But no, it has a 320W TDP and 30-35% higher performance. So the power efficiency of Ampere is only minimally better, watt for watt. The claim about 1.9X efficiency is a straight-out lie.
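For reference, the arithmetic in the quoted post written out as a quick script (the numbers are the ones cited in the quote; a sketch, not a measurement):

```python
# The Shadow of the Tomb Raider estimate from the quoted post (illustrative).
fps_2080 = 45           # RTX 2080 at 4K, as quoted
fps_2080ti = 60         # RTX 2080 Ti at 4K, as quoted (Guru3D)
claimed_gain = 1.80     # "80% faster than the 2080"

fps_3080_est = fps_2080 * claimed_gain          # ~81 fps
vs_2080ti = fps_3080_est / fps_2080ti - 1       # ~0.35
print(f"Implied 3080: ~{fps_3080_est:.0f} fps, ~{vs_2080ti * 100:.0f}% over the 2080 Ti")
```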

I wonder if Jensen believes that bullshit coming out of his mouth?

Exactly. It appears as if people here haven't witnessed an Nvidia/AMD launch before, or have collective ignorance.

How on earth are some of you spewing these marketing numbers from Nvidia, and thus hyperbolic assertions about Ampere, without a single reputable benchmark of any of the cards? You're like the perfect consumer drones in a dystopian future where consumerism has gone into overdrive.

I will say it again: let's come back to these comments in a couple of weeks when W1zzard has put the cards through his benchmark suite. Let's see how these '200% faster' and '2-3X the performance' exclamations hold up when you blow away the smoke and shatter the mirrors to look at actual rasterization performance averages across more than 10 games.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Hey, we still remember Polaris was gonna offer a whopping 2.5x perf/watt over cards like the 290X:


You're right, don't believe everything you read. ;)
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Full chips should be 5376 shaders/384-bit and 3584 shaders/256-bit; in reality, those shaders are some sort of hyperthreading, dual shader operations per clock.

In the past, CUs could do one FP and one INT op per clock.
Ampere CUs can do one FP and one INT op, or two FP ops.
So technically it's twice the TFLOPS, but the same CU.

Saying it is two CUs is like AMD claiming double the number of Bulldozer cores.
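A toy model of that difference (very simplified; the 16-wide pipes per partition and the ~30% INT instruction mix are illustrative assumptions, not measured figures):

```python
# Toy issue model for the Turing vs. Ampere datapath change described above.
# Very simplified; unit counts and the INT mix are illustrative assumptions.
def fp32_per_clock(fp_pipe, shared_pipe, int_fraction, shared_can_do_fp):
    """FP32 ops per clock when one pipe is FP32-only and a second pipe is
    either INT32-only (Turing-style) or INT32-or-FP32 (Ampere-style)."""
    extra_fp = shared_pipe * (1 - int_fraction) if shared_can_do_fp else 0.0
    return fp_pipe + extra_fp

print(round(fp32_per_clock(16, 16, 0.30, shared_can_do_fp=False), 1))  # 16.0 (Turing-style)
print(round(fp32_per_clock(16, 16, 0.30, shared_can_do_fp=True), 1))   # 27.2 (Ampere-style, well short of 2x)
```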
 
Joined
Sep 17, 2019
Messages
450 (0.27/day)
So I have a 750W Bronze PSU. I know I need to upgrade, but to which wattage?
Probably not. You should invest some money in a watt meter to measure the actual wattage being drawn from your outlet.

In simple terms, this is how I set up all of my PSUs for my computers. It seems to work well for me, and it might work for you.

Right now I have an 850W 80+ Gold PSU, rated at 90% efficiency @ 50% load.
So the number I use is 850 x 90% / 2 (the 50% load point) = ~382 watts. I personally want to keep my wattage below this number, as this will prolong the life of the PSU. I have a few that are in the 10-year range ATM, and those are mostly the comps I use to play games on.
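That rule of thumb as a one-liner, for clarity (this is the personal heuristic described above, not an official 80 Plus calculation):

```python
# The 50%-load rule of thumb described above (a personal heuristic,
# not an official 80 Plus formula).
psu_rating_w = 850
efficiency_at_50_pct_load = 0.90

ceiling_w = psu_rating_w * efficiency_at_50_pct_load / 2
print(f"Keep sustained draw below ~{ceiling_w:.0f} W")  # ~382 W
```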

With my watt meter I measure my usage at normal load as well as while I'm playing games on the computer in question. I do this for 2 or more hours, recording the data for that rig.

My normal load level for my current computer is 105 watts. My max level while playing games is 250W, well below the 50% threshold of my PSU. I should have no problems with any future upgrade; however, I'm not the type who buys on a whim. I do my research first, then buy if needed.

My comp is an AMD 3600, 32GB G.Skill Flare X CL14 PC3200 RAM, MSI X570-A Pro MB, VisionTek RX 5700 w/ 10% undervolt, 500GB Samsung 860 EVO SSD, WD 2TB HDD, LG Blu-ray.

Again, this is how I set up my PSUs with regard to wattage usage, and it has worked for me for at least 15 years.


As far as this whole new generation of Video cards?

When Navi comes out, then we can see all of the performance gains over the last generation of cards.

HOWEVER... And this is something that people should be aware of.

1. Most of the world is running on 1080p OR Less.
2. Gaming monitors are mostly set at the 1080p level. Recently, 1440p has become more and more of the sweet spot, as 4K is still too expensive for what you are getting.
3. Video gaming advancements are still 3 to 5 years behind the current technology.
4. Most people right now are tight on money these days.

It makes no difference if your monitor cannot handle the additional power these new cards provide.

I am averaging over 165 fps at 1440p playing Overwatch on a 32-inch, 1440p, 165Hz Pixio PX329 monitor, using an RX 5700 video card that I got new for $280.00 nine months ago. Heh, Grandpa here playing pew pew. Everything right now is working great.

Unless Nvidia (and AMD) can pull off 50% real performance gains over last year's video cards, only those gerbils will be buying them, because the video cards most people currently have on hand are still doing the job.

Unless there is a fantastic price-vs-performance ratio on these new upcoming video cards, I don't think I'll be throwing five to seven hundred dollars at one.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
Judging by the DF results, we have a 55% faster 70-tier card and a 75% faster 80-tier card. So Nvidia pulled a Bulldozer on us. In their desire to impress with 2x FP performance, Nvidia created a GPGPU card that is going to mine well, but not so good for games or for prices. Although, my 1060 mined 1 ETH in one month, and that can buy me a 3070 now, lol. But still, 1.66x more transistors fitted into the same 450mm² die size for the 70-tier card, and it is being hampered by the ridiculous VRAM gimping. What is very disappointing is that 16 Gbps memory is used; there is not much difference here. 66% more transistors, the same memory, only 16% faster; that is garbage. It should be 20 Gbps and 16GB.
 
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Probably not. You should invest some money in a watt meter to measure the actual wattage being drawn from your outlet.
But is that really needed though?
I'm pretty sure 750W is plenty unless he/she is overclocking heavily or using a HEDT CPU (and probably enough even then).
I would be much more concerned with the quality and age of the PSU.

I had one computer with a heavily used ~10-year-old Corsair TX750 80+ Bronze (I believe) which was very unstable; I put in a Seasonic Focus+ Gold 650W, and it improved a lot.

In general I would prefer quality over quantity; I'll take a Seasonic 650W over a crappy 1000W any day. Still, when the budget allows it, I do choose some breathing room to try to keep the typical load around ~50% for long-term stability, as you said.
 
Joined
Dec 26, 2006
Messages
3,532 (0.56/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
HOWEVER... And this is something that people should be aware of.

1. Most of the world is running on 1080p OR Less.
2. Gaming monitors are mostly set at the 1080p level. Recently, 1440p has become more and more of the sweet spot, as 4K is still too expensive for what you are getting.
3. Video gaming advancements are still 3 to 5 years behind the current technology.
4. Most people right now are tight on money these days.

My old RX 480 runs Terraria at 4K at 60 fps :)
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
My old RX 480 runs Terraria at 4K at 60 fps :)
I was just playing Ace Combat 7 using my monitor plugged into my laptop's RTX 2060 at 4K 80 fps max settings; the Vega is as capable, but we all game differently on different games.
 
Joined
Sep 17, 2019
Messages
450 (0.27/day)
My old RX 480 runs Terraria at 4K at 60 fps :)
So does my 1070. And you can even get 4K monitors in the $300+ range.

But there is a big difference between something running at 60Hz and something running at 144Hz+ with an IPS panel at 4K. I've seen the difference and it's nice... but not that nice when the monitor I'm looking at starts at $600; then add a card that can take full advantage of the monitor in question, and that is a lot of money.

That is why I purchased the Pixio 32-inch, 1440p, 165Hz monitor for under $300, though it is a bit of overkill for me as I'm so used to a 27-inch monitor.

Hardware Monitor did a review in late 2018

But it was Level 1 Tech that sold me on the Monitor.

And finally the 27 inch Brand Name monitor that I wanted cost more than this monitor.

Now back to the Nvidia 3000 series of video cards. You know that there is going to be a limited supply of the 3000 series at launch... right? Yeah, I am hearing the same rumors about limited supply issues and, of course, price increases. So if those rumors do come true, it will be just like the 2000 series' limited-supply launch.

I'll just wait until late October/Holiday season to pick up any additional components as needed as well as what AMD has to offer.

But I am interested to see the actual gaming performance over previous generations, as well as how hot these cards will run and how much heat they will generate.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
He said the 3080 is a 4K 60 fps card and way better than the 2080 Ti. That made me question it a little, because I thought the 2080 Ti was a 4K 60 fps card. Am I wrong?
It was. And it still is for older games from around when it was released. But game tech moves on, and newer games may not be playable at 4K on anything but the newest cards.

4K is always a moving finish line, as I have always said. Your 4K card today won't be one for long.
 
Joined
Jun 16, 2013
Messages
1,457 (0.37/day)
Location
Australia
Ordered a factory-OC RX 5700 XT last week...
at a price point (with PP) that suited my budget.
The last time I had an Nvidia card in my then-Intel gaming rig was back in 2008!
There's something about mixing an AMD platform with an NVIDIA product that just doesn't sit well with me today. :fear:
 