
NVIDIA GeForce RTX 2080 Ti Benchmarks Allegedly Leaked- Twice

Joined
Oct 22, 2014
Messages
7,165 (3.88/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E5-2680 10c/20t 2.8GHz @ 3.0GHz
Motherboard Asrock X79 Extreme 11
Cooling Coolermaster 240 RGB A.I.O.
Memory G.Skill 16GB (4x4GB) 2133MHz
Video Card(s) Nvidia GT 710
Storage SanDisk X400 256GB
Display(s) AOC 22" FreeSync 1ms 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Home Premium 64 bit
Ah, the mythical gimping.

Not optimizing new software for older arches is not gimping. Read a dictionary sometime. Gimping is retroactively LOWERING the performance of a previous product, which has been proven time and time again to be completely TRUE with Nvidia....
Fixed that for you.
Gimping is lowering the performance of older hardware with new drivers, compared to older drivers. Perhaps you should read a dictionary sometime.
Optimising new software for old hardware has nothing to do with it; the cards should at least be capable of running at their prior levels of performance, not lower, which has been shown many times on forums like this one.
 
Joined
Feb 18, 2013
Messages
1,703 (0.69/day)
Location
KL, Malaysia
I guess the whole "take it with a grain of salt" thing is getting lost, as everyone is getting salty over the Green Camp "milking" people's wallets when they haven't even pre-ordered one or benched one for themselves. Leaks are becoming the norm here because impatient, trigger-happy people are benching new cards with current drivers, which don't even unlock the card's full potential. As for the whole "boycott Nvidia" over how they set the prices, I doubt it'll hurt their stock when the animation & CGI industry has already jumped on the real-time ray-tracing bandwagon, effectively replacing all their current Maxwell or Pascal based Quadro cards while saving money at the same time. For me, I'll be playing the waiting game until those who own the cards have signed the NDA, gotten the correct drivers & updated the benchmark software, so we can see how it will fare.
 
Joined
Dec 22, 2011
Messages
2,957 (1.03/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
Joined
Sep 17, 2014
Messages
10,234 (5.43/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
Why, are you planning to buy this card for 1080p gaming? :laugh::laugh::laugh::laugh::roll:
You obviously don't game a whole lot if you say this. Resolution is irrelevant; games get heavier, and your 1080 and mine will be struggling at 1080p before long. When Pascal launched, people said the exact same nonsense you typed up here about that 1080, just applied to 1440p. Go figure. A higher res only locks you into buying the most expensive GPUs every single time, while it doesn't even pay off that much in image quality. Who's laughing here? I'll take 1080p any day and get stable, high FPS at max settings.

As for the 2080 Ti and the sad people who need to post everywhere that they pre-ordered one... the joke's on you :D Enjoy those 10 Giga Rays at 30 fps.

25-30% best case seems accurate and as predicted. It also puts the 2080 and 2070 in perspective: those are literally worthless now. It also means the Pascal line-up will remain stagnant in value, making our second-hand sales that much more valuable. Thanks Nvidia ;)
 
Last edited:

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
47,453 (8.60/day)
Location
Australalalalalaia.
System Name Big Fella
Processor Ryzen R7 2700X (stock/XFR OC)
Motherboard Asus B450-i ITX
Cooling Corsair H110 W/ Corsair ML RGB fans
Memory 16GB DDR4 3200 Corsair Vengeance RGB Pro
Video Card(s) MSI GTX 1080 Gaming X (BIOS mod to Gaming Z) w/ Corsair H55 AIO
Storage 1TB Samsung 970 Pro NVME + 1TB Intel 6000 Pro NVME
Display(s) Philips 328m6fjrmb (32" 1440p 144Hz curved) + Sony KD-55X8500F (55" 4K HDR)
Case Fractal Design Nano S
Audio Device(s) Razer Leviathan + Corsair Void pro RGB, Blue Yeti mic
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G903 + PowerPlay mousepad
Keyboard Corsair K65 Rapidfire
Software Windows 10 pro x64 (all systems)
Benchmark Scores Laptops: i7-4510U + 840M 2GB (touchscreen) 275GB SSD + 16GB i7-2630QM + GT 540M + 8GB
You obviously don't game a whole lot if you say this. Resolution is irrelevant; games get heavier, and your 1080 and mine will be struggling at 1080p before long. When Pascal launched, people said the exact same nonsense you typed up here about that 1080, just applied to 1440p. Go figure. A higher res only locks you into buying the most expensive GPUs every single time, while it doesn't even pay off that much in image quality. Who's laughing here? I'll take 1080p any day and get stable, high FPS at max settings.

As for the 2080 Ti and the sad people who need to post everywhere that they pre-ordered one... the joke's on you :D Enjoy those 10 Giga Rays at 30 fps.

25-30% best case seems accurate and as predicted. It also puts the 2080 and 2070 in perspective: those are literally worthless now. It also means the Pascal line-up will remain stagnant in value, making our second-hand sales that much more valuable. Thanks Nvidia ;)
For some games, a higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 2K 144) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
 
Joined
Sep 17, 2014
Messages
10,234 (5.43/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
For some games, a higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 2K 144) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
Fair point. And in close combat you'd lose, because you need more accuracy: higher pixel counts = higher miss chance ;)

There is a reason people lower their res in competitive shooters, far more than there is in raising it so you can camp in the bushes. It feels to me like you're grasping at straws here, sorry...
 

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
47,453 (8.60/day)
Location
Australalalalalaia.
System Name Big Fella
Processor Ryzen R7 2700X (stock/XFR OC)
Motherboard Asus B450-i ITX
Cooling Corsair H110 W/ Corsair ML RGB fans
Memory 16GB DDR4 3200 Corsair Vengeance RGB Pro
Video Card(s) MSI GTX 1080 Gaming X (BIOS mod to Gaming Z) w/ Corsair H55 AIO
Storage 1TB Samsung 970 Pro NVME + 1TB Intel 6000 Pro NVME
Display(s) Philips 328m6fjrmb (32" 1440p 144Hz curved) + Sony KD-55X8500F (55" 4K HDR)
Case Fractal Design Nano S
Audio Device(s) Razer Leviathan + Corsair Void pro RGB, Blue Yeti mic
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G903 + PowerPlay mousepad
Keyboard Corsair K65 Rapidfire
Software Windows 10 pro x64 (all systems)
Benchmark Scores Laptops: i7-4510U + 840M 2GB (touchscreen) 275GB SSD + 16GB i7-2630QM + GT 540M + 8GB
Fair point. And in close combat you'd lose, because you need more accuracy: higher pixel counts = higher miss chance ;)

There is a reason people lower their res in competitive shooters, far more than there is in raising it so you can camp in the bushes. It feels to me like you're grasping at straws here, sorry...
The only advantage I found moving from a 40" 4K to a 32" 1440p is that I needed a lower mouse DPI setting, so I was more accurate with snap movements.
That may have been an issue with my mouse or even a latency issue on the TV (it wasn't exactly made for PC gaming), but it's still relevant.

The main reason people use the lower-res monitors is that that's what those older games were designed for... even on a 40" TV you should have seen how tiny everything was at 4K, and that's the real reason 1440p is more popular - UI scaling isn't prevalent yet; even at the operating system/common apps level it's still immature tech.
 
Joined
Sep 17, 2014
Messages
10,234 (5.43/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
The only advantage I found moving from a 40" 4K to a 32" 1440p is that I needed a lower mouse DPI setting, so I was more accurate with snap movements.

The main reason people use the lower-res monitors is because
You've just given the answer yourself. A lower res means fewer pixels to aim for, making the chance of hitting the right one(s) much higher. You can also lower the DPI, which makes your hand's movement more pronounced and thus more deliberate and accurate. It's simply easier to point at a bigger target than at a smaller one.

This has nothing to do with UI scaling; in competitive shooters nobody looks at the UI, you just know what you have at all times (clip size, reloads etc. are all muscle memory). For immersive games and for strategy and isometric stuff, yes, high res and crappy UI visibility go hand in hand... I can imagine a higher res is nice to have in competitive games such as DOTA, though.
 

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
47,453 (8.60/day)
Location
Australalalalalaia.
System Name Big Fella
Processor Ryzen R7 2700X (stock/XFR OC)
Motherboard Asus B450-i ITX
Cooling Corsair H110 W/ Corsair ML RGB fans
Memory 16GB DDR4 3200 Corsair Vengeance RGB Pro
Video Card(s) MSI GTX 1080 Gaming X (BIOS mod to Gaming Z) w/ Corsair H55 AIO
Storage 1TB Samsung 970 Pro NVME + 1TB Intel 6000 Pro NVME
Display(s) Philips 328m6fjrmb (32" 1440p 144Hz curved) + Sony KD-55X8500F (55" 4K HDR)
Case Fractal Design Nano S
Audio Device(s) Razer Leviathan + Corsair Void pro RGB, Blue Yeti mic
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G903 + PowerPlay mousepad
Keyboard Corsair K65 Rapidfire
Software Windows 10 pro x64 (all systems)
Benchmark Scores Laptops: i7-4510U + 840M 2GB (touchscreen) 275GB SSD + 16GB i7-2630QM + GT 540M + 8GB
RTS games like DOTA are exactly where it's useless, because you've got a locked perspective/field of view - you can't see further or any differently.

I've got 1080p, 1440p and 4K in the room with me and direct experience with all of them, even at LAN parties dealing with retro games. Some games work amazingly no matter the resolution (Call of Duty 2 runs with no issues on Windows 10 at 4K, for example) and others are very broken (Supreme Commander works great at 1440p 144Hz despite its 100fps cap, but the UI is utterly broken at 4K).

No competitive gamer is going to choose a resolution, practise and learn it, and then go to an event with the hardware provided and have all their muscle memory thrown out of whack.
 
Joined
Feb 18, 2017
Messages
454 (0.45/day)
If these results are for real, it doesn't look bad... still way too expensive for now. If people don't pre-order like crazy the price will definitely go down
In terms of prices, it really looks bad in my opinion.

This ^^^ is an example of tech snobbery.
There's nothing wrong with gaming at 1080p.
There is nothing wrong with it. But it becomes wrong at its foundation when you defend it in the context of a $1,050-1,100 GPU.
 
Last edited:
Joined
Apr 12, 2017
Messages
67 (0.07/day)
Processor Haswell-E - i7-5820K @ 4.4GHz
Motherboard ASUS X99S
Cooling Noctua NH-D15S
Memory 16GB Corsair Vengeance LPX 3000MHz
Video Card(s) Palit Super JetStream 980Ti
Storage SSD: 512GB [Crucial MX100] HDD: 34TB [4 x 6TB WD Blue, 2 x 5TB Seagate External]
Display(s) Acer ProDesigner BM320 4K
Case Fractal Design R5
Power Supply Corsair RM750x
Still not a genuine 4K 60fps card then.
 
Joined
Aug 6, 2017
Messages
4,866 (5.86/day)
Location
Poland
Processor i7 5775c @4.3GHz/1.385v/EDRAM @2GHz
Motherboard Z97X Gaming 5
Cooling Noctua D15S
Memory Crucial Ballistix Tactical LP 1600 CL8 @2133 9-9-9-27 1T
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128 (OS)/850 PRO 256+256+ 512,860 EVO 500,XPG SX950U 480,M9Pe(Y) 512 (games)/4TB HDDs (3+1)
Display(s) Acer XB241YU+Dell S2716DG dual monitor setup
Case Full tower
Audio Device(s) W830BT headphones
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Joined
Jul 9, 2015
Messages
1,950 (1.23/day)
System Name My all round PC
Processor i5 750
Motherboard ASUS P7P55D-E
Memory 8GB
Video Card(s) Sapphire 380 OC... sold, waiting for Navi
Storage 256GB Samsung SSD + 2Tb + 1.5Tb
Display(s) Samsung 40" A650 TV
Case Thermaltake Chaser mk-I Tower
Power Supply 425w Enermax MODU 82+
Software Windows 10
Joined
Feb 18, 2017
Messages
454 (0.45/day)
This is fake.
Why? If you want every graphics option enabled except AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games with the same options, 4K 60 may be hard to achieve.
 
Joined
Feb 18, 2013
Messages
1,703 (0.69/day)
Location
KL, Malaysia
Fact: 4K 60fps or higher is still not achievable when game devs keep putting all sorts of "features" into their games, even though they know how low-level APIs like DX12 or Vulkan work. I will still take 1080p/1440p/ultrawide @ 60+fps over 4K. Since when is 4K 60fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60fps?
 
Joined
Aug 6, 2017
Messages
4,866 (5.86/day)
Location
Poland
Processor i7 5775c @4.3GHz/1.385v/EDRAM @2GHz
Motherboard Z97X Gaming 5
Cooling Noctua D15S
Memory Crucial Ballistix Tactical LP 1600 CL8 @2133 9-9-9-27 1T
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128 (OS)/850 PRO 256+256+ 512,860 EVO 500,XPG SX950U 480,M9Pe(Y) 512 (games)/4TB HDDs (3+1)
Display(s) Acer XB241YU+Dell S2716DG dual monitor setup
Case Full tower
Audio Device(s) W830BT headphones
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Why? If you want every graphics option enabled except AA, most of today's games will run at 60 fps on the 2080 Ti for sure. However, in the most demanding ones (and I'm not speaking of garbage ports like Mafia 3, but Crysis 3, The Witcher 3, etc.) and in upcoming games with the same options, 4K 60 may be hard to achieve.
I mean the leaks are fake.
 
Joined
Feb 26, 2016
Messages
125 (0.09/day)
Location
Texas
System Name Berfs1
Processor i7-9750H (turbo clocked)
Motherboard Dell Precision 7540
Cooling Stock w/ ICD thermal compound
Memory G.Skill Ripjaws 16GB (2x8GB) @3200 MHz
Video Card(s) NVIDIA GeForce GTX 980 @1465 MHz/2052 MHz (core/memory)
Storage XPG SX8200 Pro 512GB SSD
Display(s) 1080p @60 Hz 100% sRGB laptop screen
Power Supply Dell 240W charger
Mouse Logitech G403
Keyboard Logitech G910
Software Windows 10 Pro 64 bit
Benchmark Scores XTU - 2218 points CB15 - 1311 CB20 - 3203
I'd be interested to see what the difference is at equal clock speeds.

The 2080 Ti FE comes overclocked out of the box and should run cooler, which means it can sustain much higher clocks than the stock 1080 Ti FE.
Around 33.2%; ((4352/3584*100)-100)/12*18.6
That was based on the difference in cores (4352 vs 3584) and transistors (roughly 18.6 billion vs 12 billion), not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. Also, given that my numbers aren't far off from the released benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.
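Spelling out that back-of-the-envelope estimate as a quick sketch (the core and transistor figures are public spec-sheet numbers; scaling the core uplift by the transistor ratio is purely this poster's heuristic, not an established performance model):

```python
# Rough sketch of the estimate above: scale the CUDA-core uplift of the
# RTX 2080 Ti over the GTX 1080 Ti by the growth in transistor count.
# This is a heuristic only - it ignores clocks, bandwidth and architecture.

cores_1080ti = 3584          # GTX 1080 Ti CUDA cores
cores_2080ti = 4352          # RTX 2080 Ti CUDA cores
transistors_1080ti = 12.0    # billions (GP102, ~11.8B rounded up)
transistors_2080ti = 18.6    # billions (TU102)

core_uplift_pct = (cores_2080ti / cores_1080ti - 1) * 100                 # ~21.4%
scaled_uplift_pct = core_uplift_pct * (transistors_2080ti / transistors_1080ti)

print(f"Core-count uplift:          {core_uplift_pct:.1f}%")
print(f"Scaled by transistor ratio: {scaled_uplift_pct:.1f}%")            # ~33.2%
```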
 
Joined
Jun 28, 2016
Messages
2,710 (2.20/day)
Since when is 4K 60fps the "new standard" of PC gaming when 90% of the userbase is still running 1080p @ >60fps?
It's not a "standard" in the sense that a typical gaming PC already offers it, but it's a standard we should aim for... and possibly stay at.
Most people don't notice much difference when going beyond either 4K or 60fps (nor are they willing to pay for it).

Of course there will be people who want to play on multiple screens or at 144Hz.
And since there's demand, there should be a "halo" product that can do that for a premium. I'd imagine that by 2025 such GPUs will have no problem with 4K@144fps or triple 4K@60fps.

But wouldn't it make sense for the rest of the GPU lineup to concentrate on 4K@60fps and differentiate by image quality (features, effects)?
I mean... Nvidia currently makes 10 different models per generation (some in two RAM variants). Do we really need that? :)
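As a rough illustration of how much more demanding those "halo" targets are than plain 4K@60, here is a small pixel-throughput sketch; raw pixels per second is only a crude proxy for GPU load (it ignores per-pixel shading cost), but it shows the scale of the gap:

```python
# Crude pixel-throughput comparison of the display targets discussed above.

def pixels_per_second(width, height, fps, screens=1):
    return width * height * fps * screens

baseline = pixels_per_second(3840, 2160, 60)  # single 4K @ 60 as the reference
targets = {
    "4K @ 60":        pixels_per_second(3840, 2160, 60),
    "4K @ 144":       pixels_per_second(3840, 2160, 144),
    "triple 4K @ 60": pixels_per_second(3840, 2160, 60, screens=3),
}

for name, pps in targets.items():
    print(f"{name:>15}: {pps / 1e9:5.2f} Gpixel/s "
          f"({pps / baseline:.1f}x the 4K@60 baseline)")
```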
 
Joined
Feb 12, 2015
Messages
1,101 (0.63/day)
Yet Maxwell was faster than Kepler on the same 28nm. Let's not kid ourselves: Nvidia could still have easily doubled performance (or at least come very close to it) if they had dropped the Tensor and RT cores. People aren't spoiled; there have always been huge jumps in performance in the GPU space, that's actually the norm. It's Nvidia that has spoiled itself. Funny how even when they go all out with the biggest dies they can make, they still chose to make it so that it's not the best they can do.

Bingo. It's a bloody 750mm^2 die!!! They could have made this a 6080-SP card with 16GB on a 512-bit bus and 1 TB/s of bandwidth. It would have been easily capable of 4K 144Hz.

But instead we get this inefficient, modest upgrade so they can sell defective compute cards to oblivious gamers.
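For reference, the claimed 1 TB/s follows from basic bus math if you assume roughly 16 Gbps GDDR6 on a 512-bit bus; here is a minimal sketch. The 16 Gbps per-pin rate is an assumption on my part, and the actual RTX 2080 Ti ships with a 352-bit bus at 14 Gbps (~616 GB/s):

```python
# Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gbit/s) / 8.

def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(memory_bandwidth_gbs(512, 16))   # 1024.0 GB/s -> the ~1 TB/s hypothetical
print(memory_bandwidth_gbs(512, 14))   #  896.0 GB/s with slower 14 Gbps chips
print(memory_bandwidth_gbs(352, 14))   #  616.0 GB/s, the actual RTX 2080 Ti
```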
 
Joined
Feb 23, 2018
Messages
135 (0.21/day)
Obviously I'll wait to see an actual comparison from a solid/reliable reviewer like TPU, but if the numbers are accurate, that's incredibly disappointing given the price points - just adding more support for those of us who are planning on skipping this iteration from Nvidia. Awful pricing, and not nearly enough of a performance jump to justify it...

Looking forward to the TPU review though.
 
Joined
Oct 2, 2004
Messages
13,791 (2.50/day)
I would not call 30%+ a "tiny bit".

People got spoiled by Maxwell--->Pascal, completely ignoring the fact that the previous gen was stuck on 28nm for a long time.

Meanwhile I will keep my pre-order. Any 20XX is better than my current GPU, that's for sure.
Leave % out of it, really. I get a severe allergic reaction when people plaster percentages everywhere to make a point and salivate over 10% or 15% differences. Great, 30%. Then you look at the actual framerate difference and it's 13fps (example: 45fps + 30%). 30% sounds like a HUGE difference, but in actual framerate 13fps is a difference, just hardly a game-changing one. And when that's the best-case scenario, many games may see even lower gains, as is evident from the graphs (fake or not). And then you ask yourself: a 1000€ graphics card for a 13fps boost? Really? For someone who shits money, sure, every FPS counts. But for rational people, sorry, just no.
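Putting that argument into numbers, here is a minimal sketch converting the same ~30% relative gain into absolute fps at a few baselines (only the 45 fps case comes from the post; the other baselines are illustrative):

```python
# Same relative uplift, very different absolute frame-rate gains.

uplift = 0.30
for baseline_fps in (30, 45, 60, 100):
    new_fps = baseline_fps * (1 + uplift)
    print(f"{baseline_fps:3d} fps + 30% = {new_fps:5.1f} fps "
          f"(+{new_fps - baseline_fps:.1f} fps absolute)")
```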
 
Joined
Aug 6, 2017
Messages
4,866 (5.86/day)
Location
Poland
Processor i7 5775c @4.3GHz/1.385v/EDRAM @2GHz
Motherboard Z97X Gaming 5
Cooling Noctua D15S
Memory Crucial Ballistix Tactical LP 1600 CL8 @2133 9-9-9-27 1T
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128 (OS)/850 PRO 256+256+ 512,860 EVO 500,XPG SX950U 480,M9Pe(Y) 512 (games)/4TB HDDs (3+1)
Display(s) Acer XB241YU+Dell S2716DG dual monitor setup
Case Full tower
Audio Device(s) W830BT headphones
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
What are you talking about? The lower the fps, the more noticeable the difference. 45 + 13 is 58, turning an unplayable or hard-to-play experience into a pretty playable one again. 100 + 30 fps is not even close to being that impactful. I can't game at 45; 58 is not great but it's fine. 100 and 130 are both super fast, and while I do notice the difference, it's not make-or-break anymore. I went 980 -> 980 Ti -> 1080, I'm used to 20-25%, and as long as it's reflected in minimum fps I can see it very clearly in games.
Leaving out percentage comparisons in favour of direct fps comparisons would be an absolutely awful idea.
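And the counter-argument in frame-time terms, as a minimal sketch: the same 30% uplift saves far more milliseconds per frame at 45 fps than at 100 fps, which is why it feels bigger at the low end:

```python
# Frame-time view of the same 30% uplift at the two baselines from the post.

uplift = 0.30
for baseline_fps in (45, 100):
    before_ms = 1000 / baseline_fps
    after_ms = 1000 / (baseline_fps * (1 + uplift))
    print(f"{baseline_fps} fps -> {baseline_fps * (1 + uplift):.0f} fps: "
          f"frame time {before_ms:.1f} ms -> {after_ms:.1f} ms "
          f"(saves {before_ms - after_ms:.1f} ms per frame)")
```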
 
Last edited:
Joined
Feb 12, 2015
Messages
1,101 (0.63/day)
Around 33.2%; ((4352/3584*100)-100)/12*18.6
That was based on the difference in cores (4352 vs 3584) and transistors (roughly 18.6 billion vs 12 billion), not including clock speed. Of course, since I know I will be told this: yes, it's really just theoretical, but obviously it's pretty damn close. Also, given that my numbers aren't far off from the released benchmark numbers, I'd say yes, if you want max quality, go RTX 2080 Ti. There won't be any better GPUs coming in the next two years anyway, other than perhaps Titan versions, and nothing on the AMD side.
AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
For some games, a higher res (and/or a larger screen) is a competitive advantage.
I was the only 4K gamer (now 2K 144) in a group of 1080p screen users, and I could see and snipe people with iron sights that they couldn't see with 4x scopes.
That's definitely true. I noticed a major disadvantage with iron sights when I gamed on my laptop recently (900p).
 
Joined
Aug 6, 2017
Messages
4,866 (5.86/day)
Location
Poland
Processor i7 5775c @4.3GHz/1.385v/EDRAM @2GHz
Motherboard Z97X Gaming 5
Cooling Noctua D15S
Memory Crucial Ballistix Tactical LP 1600 CL8 @2133 9-9-9-27 1T
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128 (OS)/850 PRO 256+256+ 512,860 EVO 500,XPG SX950U 480,M9Pe(Y) 512 (games)/4TB HDDs (3+1)
Display(s) Acer XB241YU+Dell S2716DG dual monitor setup
Case Full tower
Audio Device(s) W830BT headphones
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
AMD announced they are launching 7nm Vega this year for gamers. Don't count out the possibility that it could blow away Turing...
Source.
 
Joined
Feb 12, 2015
Messages
1,101 (0.63/day)
Google "7nm Radeon 2018" and filter only news from the past month. TBH it's somewhat a mixed bag of rumors and facts. What we DO know is AMD is launching 7nm Vega this year, and most recent rumors point to a Vega coming to gaming as well. However some rumors point to Navi possibly launching this year as well.

The point is we do know AMD is bringing 7nm cards THIS year. But it's up in the air if it is Vega or Navi (or both), and if it will be a full or a paper launch. I just wouldn't bother with overpriced and soon outdated 12nm products.
 