
MSI GeForce RTX 2070 Super Gaming Z Trio

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
20,599 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
MSI's GeForce RTX 2070 Super Gaming Z Trio comes with super-fast 16 Gbps Samsung memory chips which overclock very well, reaching almost 2.3 GHz. The large triple-slot, triple-fan cooler, which has fan-stop, runs extremely quietly and will almost be inaudible during heavy gaming.

Joined
Jul 25, 2017
Messages
53 (0.05/day)
Questions regarding the review.

The 2070S is around 10% faster than a stock 5700 XT, but costs around 130 dollars (28%) more?
How can VESA Adaptive-Sync be a positive only on NVIDIA cards?
And how can RTX ray tracing still be a positive when there are only about 3 games supporting it after 13 months?
Why is DLSS a pro when it's shown to be inferior to AMD Image Sharpening?
The MSI 5700 XT has "Power consumption increased, some efficiency lost" because it used 4-5% more power than a stock 5700 XT. The MSI RTX 2070S uses 4-5% more power than stock, but that's not a negative?

Sorry for the "negative" post, but I am just wondering how the same thing is a pro/con on one product but not a pro/con on a different product?

Cheers.
 
Joined
Jun 2, 2017
Messages
2,541 (2.24/day)
System Name Best AMD Computer
Processor AMD TR4 1920X
Motherboard MSI X399 SLI Plus
Cooling Alphacool Eisbaer 420 x2 Noctua XPX Pro TR4 block
Memory Gskill RIpjaws 4 3000MHZ 48GB
Video Card(s) Sapphire Vega 64 Nitro, Gigabyte Vega 64 Gaming OC
Storage 6 x NVME 480 GB, 2 x SSD 2TB, 5TB HDD, 2 TB HDD, 2x 2TB SSHD
Display(s) Acer 49BQ0k 4K monitor
Case Thermaltake Core X9
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Corsair HX1200!
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 10 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 24955 Time Spy: 13500
I was watching a review of the Radeon VII last night from Tech Deals. One of the things that struck me right away was that he was reviewing it against the Vega 64, 2070S, 2080S, and 2080 Ti, and he had DLSS turned on for all the Nvidia cards. As I understand it, DLSS works by having the card render at 1080p and output the image at 1440p. Would that not skew the numbers?
 
Joined
Mar 14, 2008
Messages
426 (0.09/day)
Location
DK
System Name Main setup
Processor I7 5930K
Motherboard Asus X99 Deluxe
Cooling Water
Memory Kingston 16GB 2400 DDR4
Video Card(s) Asus GTX 1080TI STRIX OC
Storage Samsung 960EVO, Crucial MX300 750GB Limited edition, Kingston SSDnow V+ Samsung 850 EVO
Display(s) ASUS PB287Q 4K
Audio Device(s) Logitech G933
Power Supply Corsair RX750M
Mouse MMO 800000button thing that i cant remember the name of.
Keyboard Logitech G19
Software W10
Great review as usual :) Could it be an idea to have the average FPS on the performance summary charts as well? (I would like that) "VooDoo 5 6000 128MB: -6000% (1 FPS)"
 
Joined
Jun 2, 2017
Messages
2,541 (2.24/day)
I was watching a review of the Radeon VII last night from Tech Deals. One of the things that struck me right away was that he was reviewing it against the Vega 64, 2070S, 2080S, and 2080 Ti, and he had DLSS turned on for all the Nvidia cards. As I understand it, DLSS works by having the card render at 1080p and output the image at 1440p. Would that not skew the numbers?
Sorry, I dove deeper into the review and saw that you did disable DLSS where you could. I guess my next question is: is DLSS available by default in the Nvidia control panel?
 

W1zzard

Administrator
Staff member
average FPS on the performance summary charts
I'm not sure if it's correct math to just average all the FPS. Wouldn't that skew the weighting towards games that run higher FPS and put low FPS games at a disadvantage?
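The skew is easy to see with a quick sketch (Python; the games and FPS numbers are made up purely for illustration):

```python
# Two hypothetical cards: card B doubles performance in the demanding
# game but is a touch slower in the easy one (made-up numbers).
fps_a = {"heavy_game": 30.0, "light_game": 300.0}
fps_b = {"heavy_game": 60.0, "light_game": 290.0}

avg_a = sum(fps_a.values()) / len(fps_a)
avg_b = sum(fps_b.values()) / len(fps_b)

# A plain FPS average says B is only ~6% faster overall, because the
# 300 FPS game dominates the sum and the 2x win at 30 FPS barely counts.
print(avg_a, avg_b, avg_b / avg_a)
```

So yes: a straight arithmetic average of FPS weights the result toward whichever games happen to run fastest.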

is DLSS available by default in the Nvidia control panel?
We disabled DLSS/RTX/RIS/Radeon Chill and similar exclusive technologies for all our testing. Not exactly sure what you are asking. DLSS is an option that appears in games that support it.

VESA Adaptive-Sync
Yeah, it's standard enough nowadays, removing the whole line

MSI 5700 XT has "Power consumption increased, some efficiency lost" because it used 4-5% more power than stock 5700XT.
 
Joined
Jun 2, 2017
Messages
2,541 (2.24/day)
I'm not sure if it's correct math to just average all the FPS. Wouldn't that skew the weighting towards games that run higher FPS and put low FPS games at a disadvantage?


We disabled DLSS/RTX/RIS/Radeon Chill and similar exclusive technologies for all our testing. Not exactly sure what you are asking. DLSS is an option that appears in games that support it.

OK, thanks W1zzard, that was exactly what I was asking.

Yeah, it's standard enough nowadays, removing the whole line


 
Joined
Mar 14, 2008
Messages
426 (0.09/day)
I'm not sure if it's correct math to just average all the FPS. Wouldn't that skew the weighting towards games that run higher FPS and put low FPS games at a disadvantage?
What I'm thinking is that if you are looking to buy a card that will be able to run games at 1440p@120Hz, but you are not looking at one specific game, or the games you play are not tested in the review, then you would have an idea of how this card performs on average across all games tested.

I guess it would be the same as 100% vs 125% (if that was 100 FPS vs 125 FPS average),
but yes, a card that was super great in some games and really bad in others would look good on average, soooooooo :) but is that a thing any more? :)

EDIT: OK, if I take the 1080 Ti vs the Radeon VII, there is a good example of that with BF5 and the performance summary (91% vs 93%), but in BF5 it is 62.6 FPS vs 77.
 

W1zzard

Administrator
Staff member
What I'm thinking is that if you are looking to buy a card that will be able to run games at 1440p@120Hz, but you are not looking at one specific game, or the games you play are not tested in the review, then you would have an idea of how this card performs on average across all games tested.

I guess it would be the same as 100% vs 125% (if that was 100 FPS vs 125 FPS average),
but yes, a card that was super great in some games and really bad in others would look good on average, soooooooo :) but is that a thing any more? :)
Hrmm .. I like the idea, I'll think about how this can be implemented math-wise
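One plausible math-wise approach (a sketch, not TPU's actual method; the FPS numbers are invented): combine per-game performance ratios with a geometric mean, so each game carries equal weight regardless of its absolute FPS.

```python
import math

# Made-up per-game FPS for a baseline card and a candidate card.
baseline = [30.0, 60.0, 300.0]
candidate = [60.0, 66.0, 290.0]

# Per-game performance ratios, combined with a geometric mean:
# a 2x win at 30 FPS counts exactly as much as a 2x win at 300 FPS.
ratios = [c / b for c, b in zip(candidate, baseline)]
geo_mean = math.prod(ratios) ** (1 / len(ratios))
print(round(geo_mean, 3))
```

An arithmetic average of the raw FPS over the same data would tell a very different story, since the 300 FPS game would dominate it.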
 
Joined
Mar 10, 2014
Messages
1,756 (0.76/day)
16 Gbps GDDR6... interesting, so Nvidia really allows AIBs to use faster memory now. Wonder if this will carry over to lesser cards like the GTX 1660 Ti too, now that the GTX 1660S launch is near.

Anyway, great review as always :toast: . Are those two added games any good (Greedfall and The Surge 2)? At least Radeons have pretty abysmal performance in The Surge 2, while judging by the Nvidia numbers the game is not a particularly heavy title graphically. Also nice to see a new Vulkan title added. Which reminds me of Strange Brigade; it should now run better with Vulkan on both IHVs.
 
Joined
May 2, 2012
Messages
14 (0.00/day)
Location
Helsinki, Finland
The power figures for the 5700 XT reference card seem to be out of whack?
Average (166 vs. 219), peak (180 vs. 227), sustained (183 vs. 223)?
 
Joined
Jan 8, 2017
Messages
5,166 (4.04/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
It's outright bizarre that this outperforms a stock 2080 super when overclocked.
 

W1zzard

Administrator
Staff member
Are those two added games any good(Greedfall and The Surge 2)?
I liked them both. Neither is a huge AAA game, but I thought they'd be worth adding because they use game engines different from the norm, and Vulkan on The Surge 2.
 
Joined
Feb 18, 2005
Messages
2,807 (0.50/day)
Location
Ikenai borderline!
Nice card, shame about the price - why isn't that a negative?

And when did triple-slot coolers become the norm? The FE does quite fine with a dual-slot solution - if you're going to give triple-slot cards props for being really cool and quiet, you should also give them a negative about using more slots than some cases will have available.

Finally, cards that ship with stupidly overbuilt cooling solutions but no boost to the default power limit should be penalised, because seriously what's the point?
 
Joined
Sep 9, 2015
Messages
101 (0.06/day)
The point is silent operation - all Nvidia cards are power limited in order to fit into the prescribed performance brackets Nvidia has set.

So a 2070S cannot be faster than a 2080S - no matter how much you OC it. This is so the sales of the higher-priced cards are not cannibalized, and AIB partners are therefore guaranteed to sell the X amount of cards that Nvidia's market research indicates should sell.

The second reason is power efficiency - Nvidia wants to look better on paper now that they are still on an older process node.

Just look how critical W1zzard is of AMD AIB OC cards' power consumption - yet with most Nvidia AIB cards the conclusion always has "power limit too low". 260 watts on a 2070S - no worries here, but a 5700 XT having a wide-open power limit is a bad thing...
 
Joined
Jun 28, 2016
Messages
3,595 (2.44/day)
And how can RTX ray tracing still be a positive when there are only about 3 games supporting it after 13 months?
Because RTRT will become a standard quite soon, and RTX makes you more future-proof than anything else right now.
BTW, not 3:
Why is DLSS a pro when it's shown to be inferior to AMD Image Sharpening?
First of all, this makes no sense to begin with: an Nvidia GPU can't use AMD RIS. So this point is not about DLSS being better than anything else on the planet; it's about having or not having DLSS on board.

Second: AMD RIS and DLSS are doing something very different.
In more general graphics editing terms, DLSS is like denoising. That's why it works well in tandem with RTX. They lead to more realistic graphics. DLSS implementation should be content-aware and not sharpen areas that aren't meant to be sharp.
Some say that it makes the picture less detailed, because people tend to perceive noise (granularity) as sharpness.
That's connected to how some people prefer smartphone photos over large-sensor camera photos. Because it's noise + sharpening.

AMD RIS is just unsharp masking like you would find in graphics editors. It's added at the end of the pipeline. And it makes the picture more crisp all-round.

The big caveat is that - while straightforward sharpening is fairly light load and can be done using normal GPGPU with little lag (Nvidia and Intel do it as well, obviously) - denoising can be quite complex computationally. That's why Nvidia uses fancy tensor cores in RTX cards.

Basically you've rephrased a popular accusation that RTX and DLSS make games look less sharp and darker. Which is true.
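To make the unsharp-masking point above concrete, here is a minimal pure-Python sketch of the technique on a 1-D toy signal standing in for an image row (the function name and numbers are illustrative, not any vendor's implementation):

```python
def unsharp_mask_1d(signal, amount=1.0):
    """Classic unsharp masking: subtract a blurred copy from the
    original and add the difference back, scaled by `amount`."""
    n = len(signal)
    # 3-tap box blur with edge padding stands in for a Gaussian blur.
    padded = [signal[0]] + list(signal) + [signal[-1]]
    blurred = [(padded[i] + padded[i + 1] + padded[i + 2]) / 3 for i in range(n)]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A step edge gains overshoot/undershoot around the transition,
# which the eye reads as extra crispness.
print(unsharp_mask_1d([0.0, 0.0, 1.0, 1.0]))
```

Flat regions pass through unchanged; only the transition is exaggerated, which is why a filter like this makes the whole picture look crisper without adding any real detail.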
 
Joined
Aug 27, 2015
Messages
32 (0.02/day)
Processor Core i5-4440
Motherboard Gigabyte G1.Sniper Z87
Memory 8 GB DDR3-2400 CL11
Video Card(s) GTX 760 2GB
Because RTRT will become a standard quite soon, and RTX makes you more future-proof than anything else right now.
BTW, not 3:

First of all, this makes no sense to begin with: an Nvidia GPU can't use AMD RIS. So this point is not about DLSS being better than anything else on the planet; it's about having or not having DLSS on board.

Second: AMD RIS and DLSS are doing something very different.
In more general graphics editing terms, DLSS is like denoising. That's why it works well in tandem with RTX. They lead to more realistic graphics. DLSS implementation should be content-aware and not sharpen areas that aren't meant to be sharp.
Some say that it makes the picture less detailed, because people tend to perceive noise (granularity) as sharpness.
That's connected to how some people prefer smartphone photos over large-sensor camera photos. Because it's noise + sharpening.

AMD RIS is just unsharp masking like you would find in graphics editors. It's added at the end of the pipeline. And it makes the picture more crisp all-round.

The big caveat is that - while straightforward sharpening is fairly light load and can be done using normal GPGPU with little lag (Nvidia and Intel do it as well, obviously) - denoising can be quite complex computationally. That's why Nvidia uses fancy tensor cores in RTX cards.

Basically you've rephrased a popular accusation that RTX and DLSS make games look less sharp and darker. Which is true.
Image sharpening is not even exclusive to AMD, Nvidia, or Intel. It's just a post-processing effect that can be done on any video card. I have been using sharpening filters to sharpen blurry-looking games for years with a very useful program called ReShade. It gives you much more control and adjustability than AMD's solution, and it can do much more than just image sharpening. Now, after AMD included this post-processing effect in their driver, it's suddenly the best thing in the world.
 
Joined
Jun 28, 2016
Messages
3,595 (2.44/day)
Image sharpening is not even exclusive to AMD, Nvidia, or Intel. It's just a post-processing effect that can be done on any video card. I have been using sharpening filters to sharpen blurry-looking games for years with a very useful program called ReShade. It gives you much more control and adjustability than AMD's solution, and it can do much more than just image sharpening. Now, after AMD included this post-processing effect in their driver, it's suddenly the best thing in the world.
Making products that only start to shine when you tinker and install additional software is exactly what AMD has been doing wrong.
I'm absolutely not against AMD introducing RIS, so you have one less thing to take care of.
And it actually works as intended as well. The image is sharper.

But it's still all down to how people perceive game image quality.
There's a group that wants absolute sharpness, good visibility and high fps. And they just don't understand why RTRT even exists. It makes no sense to them.
AMD is not doing anything explicitly, but they're trying to cater to this group by adding things like RIS.

The problem is: this status quo will collapse in 1-2 years, when AMD adds RTRT on their GPUs.
Suddenly, no matter which brand you choose, you'll be paying a premium for RTRT. So the group that praises AMD today (or rather: criticizes Nvidia) will be left behind. They'll have to either go with the times or turn off a big chunk of their GPU.
 
Joined
Sep 9, 2015
Messages
101 (0.06/day)
W1zzard has never said anything of the sort.
Have you not read any of the AMD AIB reviews? In every single review it is stated in the cons:

"Large increase in power consumption, power efficiency lost"


This is why AIB models have huge coolers, large power targets, and come with higher stock voltages - so people can easily OC them, and this is what many have come to expect from AIB cards.

But in nvidia card reviews - it's the opposite. "power target not increased"

You just can't have your cake and eat it too..
 
Joined
Aug 27, 2015
Messages
32 (0.02/day)
Processor Core i5-4440
Motherboard Gigabyte G1.Sniper Z87
Memory 8 GB DDR3-2400 CL11
Video Card(s) GTX 760 2GB
Have you not read any of the AMD AIB reviews? In every single review it is stated in the cons:

"Large increase in power consumption, power efficiency lost"


This is why AIB models have huge coolers, large power targets, and come with higher stock voltages - so people can easily OC them, and this is what many have come to expect from AIB cards.

But in nvidia card reviews - it's the opposite. "power target not increased"

You just can't have your cake and eat it too..
The problem with those 5700 XT cards is that they don't show much better performance despite drawing much more power. If more power draw does not result in better performance or better overclocking, then what's the benefit of it?

For example, the MSI 2070 Super Gaming Z Trio gives 20 FPS more after overclocking compared to the stock 2070 Super FE.
overclocked-performance.png


But the MSI 5700 XT Gaming X gives only 8 FPS more after overclocking compared to the stock AMD 5700 XT, despite having a much higher power limit.
overclocked-performance (2).png

This is just a waste of energy and generation of heat for nothing. How can that be a positive thing?
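The argument boils down to efficiency, which can be sketched in a few lines (all figures here are hypothetical stand-ins, not the review's measurements):

```python
# Hypothetical numbers illustrating the point: extra power only pays
# off if it buys a proportional amount of performance.
stock_fps, stock_watts = 100.0, 220.0
oc_fps, oc_watts = 108.0, 260.0   # +8% FPS for roughly +18% power

fps_gain = oc_fps / stock_fps - 1.0        # 0.08
power_gain = oc_watts / stock_watts - 1.0  # ~0.18

# FPS per watt actually drops after the overclock.
print(stock_fps / stock_watts, oc_fps / oc_watts)
```

When the power gain outpaces the FPS gain like this, perf-per-watt falls, which is exactly the "efficiency lost" wording from the conclusions.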
 
Joined
Sep 17, 2014
Messages
12,577 (5.93/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
Nice card, shame about the price - why isn't that a negative?

And when did triple-slot coolers become the norm? The FE does quite fine with a dual-slot solution - if you're going to give triple-slot cards props for being really cool and quiet, you should also give them a negative about using more slots than some cases will have available.

Finally, cards that ship with stupidly overbuilt cooling solutions but no boost to the default power limit should be penalised, because seriously what's the point?
Dude, the card runs at 69 °C under full load and is quieter than your average case fan. What's the point?! Thát's the point! 29 dB is pretty damn awesome. Power limit or no, you can crank the core voltage and PT up to whatever it gives and still keep all, or most, of your boost bins too. And still not hear it. I'll take that over the 5% that *might* be in the tank with a higher power limit, any day of the week tbh...

Image sharpening is not even exclusive to AMD, Nvidia, or Intel. It's just a post-processing effect that can be done on any video card. I have been using sharpening filters to sharpen blurry-looking games for years with a very useful program called ReShade. It gives you much more control and adjustability than AMD's solution, and it can do much more than just image sharpening. Now, after AMD included this post-processing effect in their driver, it's suddenly the best thing in the world.
Sharpening is not what DLSS is at all.

DLSS does not produce the typical sharpening artifacts, because it's not sharpening; it's an internal upscale, sampled to the target resolution, so to speak. Kind of like a reversed DSR, aimed at efficiency at the cost of accuracy.
Sharpening is increasing the contrast between edges to make them stand out more; you can do this with a Photoshop filter too.
 
Joined
Jul 25, 2017
Messages
53 (0.05/day)
Because RTRT will become a standard quite soon, and RTX makes you more future-proof than anything else right now.
BTW, not 3:
And how many of these games are actually released yet? I couldn't care less if you showed me a list of 10,000 games that are going to support RTX down the line.
If you announce a new feature and promote it like the best thing since the invention of the wheel, and after 13 months you still only have 3 games, how is that a pro, since you are paying extra for a feature that is not really supported yet?

And RTX doesn't make the 20xx series future-proof? Far from it. When RTX becomes "standard", the 20xx RTX cards will be obsolete unless you own a 2080S/2080 Ti. With RTX 30xx you will probably get 2080 Ti RTX performance in the 3060. So how future-proof are the 2060/2070 going to be in 1-2 years, when they are already struggling to do 1080p/60 FPS in the 3 games that already support RTX?
 