
MSI GeForce RTX 2070 Super Gaming Z Trio

W1zzard (Administrator):
MSI's GeForce RTX 2070 Super Gaming Z Trio comes with super-fast 16 Gbps Samsung memory chips, which overclock very well, reaching almost 2.3 GHz. The large triple-slot, triple-fan cooler, which has idle fan-stop, runs extremely quietly and will be almost inaudible during heavy gaming.

Member (joined Jul 25, 2017):
Questions regarding the review.

The 2070S is around 10% faster than the stock 5700 XT, but costs around $130 (28%) more?
How can VESA Adaptive-Sync only be a positive on NVIDIA cards?
And how can RTX raytracing still be a positive when there are only about three games supporting it after 13 months?
Why is DLSS a pro when it's shown to be inferior to AMD Image Sharpening?
The MSI 5700 XT has "Power consumption increased, some efficiency lost" because it used 4-5% more power than the stock 5700 XT. The MSI RTX 2070S uses 4-5% more power than stock, but it's not a negative?

Sorry for the "negative" post, but I am just wondering how the same thing is a pro/con on one product, but not a pro/con on a different product.
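(As a rough way to frame the price-to-performance question above, a quick sketch; the dollar figures here are invented, chosen only so the gap matches the ~$130 / ~28% cited:)

```python
# Hypothetical prices picked to match the ~$130 (~28%) gap cited above;
# performance normalized so the 2070S is ~10% faster.
cards = {
    "5700 XT":    {"price": 460.0, "perf": 100.0},
    "2070 Super": {"price": 590.0, "perf": 110.0},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price']:.3f} perf per dollar")
# 5700 XT:    0.217 perf per dollar
# 2070 Super: 0.186 perf per dollar
# On raw perf/$ the 5700 XT wins; the premium pays for RTX/DLSS,
# whose value is exactly what is being debated in this thread.
```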

Cheers.
 
Member (joined Jun 2, 2017):
I was watching a review of the Radeon VII last night from Tech Deals. One of the things that struck me right away was that he was reviewing it against the Vega 64, 2070S, 2080S and 2080 Ti, and he had DLSS turned on for all the NVIDIA cards. As I understand it, DLSS works by having the card do the work at 1080p and output the image at 1440p. Would that not skew the numbers?
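(For scale, a quick back-of-the-envelope on the pixel counts involved, just to illustrate the concern:)

```python
# If DLSS renders internally at 1080p and outputs 1440p, the GPU shades
# far fewer pixels per frame than a card rendering 1440p natively.
native_1440p = 2560 * 1440    # 3,686,400 pixels
internal_1080p = 1920 * 1080  # 2,073,600 pixels
print(f"{native_1440p / internal_1080p:.2f}x")  # ~1.78x more pixels natively
```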
 
Member (joined Mar 14, 2008):
Great review as usual :) Could it be an idea to have the average FPS in the performance summary charts as well? (I would like that.) "VooDoo 5 6000 128MB: -6000% (1 FPS)"
 
Member (joined Jun 2, 2017):
Sorry, I dug deeper into the review and saw that you did disable DLSS where you could. I guess my next question is: is DLSS available by default in the NVIDIA control panel?
 

W1zzard (Administrator):
Member (joined Mar 14, 2008) said: "average FPS in the performance summary charts"
I'm not sure if it's correct math to just average all the FPS. Wouldn't that skew the weighting towards games that run higher FPS and put low-FPS games at a disadvantage?
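(A toy illustration of that skew, with invented numbers:)

```python
# Toy numbers (invented) showing how a plain FPS average lets one
# high-FPS game dominate the summary.
card_a = {"esports title": 300, "heavy AAA title": 40}
card_b = {"esports title": 240, "heavy AAA title": 60}

avg_a = sum(card_a.values()) / len(card_a)  # 170.0
avg_b = sum(card_b.values()) / len(card_b)  # 150.0
print(avg_a, avg_b)
# Card A "wins" the average even though it is a third slower in the
# demanding game, which is the one the average was supposed to inform.
```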

Member (joined Jun 2, 2017) said: "Is DLSS available by default in the NVIDIA control panel?"
We disabled DLSS/RTX/RIS/Radeon Chill and similar exclusive technologies for all our testing. Not exactly sure what you are asking. DLSS is an option that appears in games that support it.

Member (joined Jul 25, 2017) said: "How can VESA Adaptive-Sync only be a positive on NVIDIA cards?"
Yeah, it's standard enough nowadays; I'm removing the whole line.

Member (joined Jul 25, 2017) said: "The MSI 5700 XT has 'Power consumption increased, some efficiency lost' because it used 4-5% more power than the stock 5700 XT."
Member (joined Jun 2, 2017):
W1zzard said: "We disabled DLSS/RTX/RIS/Radeon Chill and similar exclusive technologies for all our testing. Not exactly sure what you are asking. DLSS is an option that appears in games that support it."

OK, thanks W1zzard, that was exactly what I was asking.
 
Member (joined Mar 14, 2008):
W1zzard said: "I'm not sure if it's correct math to just average all the FPS. Wouldn't that skew the weighting towards games that run higher FPS and put low-FPS games at a disadvantage?"

What I'm thinking is that if you are looking to buy a card that will be able to run games at 1440p @ 120 Hz, but you are not looking at one specific game, or the games you play are not tested in the review, then you would have an idea of how the card performs on average across all the games tested.

I guess it would be the same as 100% vs. 125% (if that was 100 FPS vs. 125 FPS on average),
but yes, a card that was super great in some games and really bad in others would look good on average, soooooooo :) but is that a thing any more? :)

EDIT: OK, I can see that if I take the 1080 Ti vs. the Radeon VII there is a good example of that with BF5 and the performance summary (91% vs. 93%), but in BF5 it is 62.6 FPS vs. 77.
 

W1zzard (Administrator):
Member (joined Mar 14, 2008) said: "...then you would have an idea of how the card performs on average across all the games tested. [...] but yes, a card that was super great in some games and really bad in others would look good on average"

Hrmm... I like the idea. I'll think about how this can be implemented math-wise.
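(One common approach, sketched here with hypothetical numbers; only the 62.6 / 77.0 FPS pair comes from the BF5 example above, the rest is invented. Normalize each game to a reference card first, then average the ratios, so every game carries equal weight:)

```python
# Sketch: per-game normalization, then a geometric mean of the ratios.
from math import prod

ref_fps  = {"BF5": 62.6, "game 2": 144.0, "game 3": 55.0}  # reference card
test_fps = {"BF5": 77.0, "game 2": 139.0, "game 3": 52.0}  # card under test

ratios = [test_fps[g] / ref_fps[g] for g in ref_fps]
relative = prod(ratios) ** (1 / len(ratios))
print(f"relative performance: {relative * 100:.1f}%")  # ~103.9%

# A plain average of raw FPS could then be printed alongside this in the
# summary chart, which is what the request above asks for.
avg_fps = sum(test_fps.values()) / len(test_fps)
print(f"average FPS: {avg_fps:.1f}")  # ~89.3
```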
 
Member (joined Mar 10, 2014):
16 Gbps GDDR6... interesting, so NVIDIA really allows AIBs to use faster memory now. I wonder if this will carry over to lesser cards like the GTX 1660 Ti too, now that the GTX 1660 Super launch is near.

Anyway, great review as always :toast:. Are those two added games any good (Greedfall and The Surge 2)? At least Radeons have abysmal performance in The Surge 2, while looking at the NVIDIA numbers the game is not a really graphically heavy title. Anyway, I liked seeing a new Vulkan title added. Which reminds me of Strange Brigade; it should now run better with Vulkan for both IHVs.
 
Member (joined May 2, 2012):
The power figures for the 5700 XT reference card seem to be out of whack: average (166 vs. 219), peak (180 vs. 227), sustained (183 vs. 223)?
 
Member (joined Jan 8, 2017):
It's outright bizarre that this outperforms a stock 2080 Super when overclocked.
 

W1zzard (Administrator):
Member (joined Mar 10, 2014) said: "Are those two added games any good (Greedfall and The Surge 2)?"
I liked them both. Neither is a huge AAA game, but I thought they'd be worth adding because they use game engines different from the norm, plus Vulkan on The Surge 2.
 
Member (joined Feb 18, 2005):
Nice card, shame about the price - why isn't that a negative?

And when did triple-slot coolers become the norm? The FE does just fine with a dual-slot solution - if you're going to give triple-slot cards props for being really cool and quiet, you should also give them a negative for using more slots than some cases will have available.

Finally, cards that ship with stupidly overbuilt cooling solutions but no boost to the default power limit should be penalised, because seriously, what's the point?
 
Member (joined Sep 9, 2015):
The point is silent operation - all NVIDIA cards are power-limited in order to fit into the prescribed performance brackets NVIDIA has set.

So a 2070S cannot be faster than a 2080S - no matter how much you OC it. This is so the sales of the higher-priced cards are not cannibalized, and AIB partners are therefore guaranteed to sell the amount of cards that NVIDIA's market research indicates should sell.

The second reason is power efficiency - NVIDIA wants to look better on paper while they are still on an older process node.

Just look how critical W1zzard is of AMD AIB OC cards' power consumption - yet with most NVIDIA AIB cards the conclusion always has "power limit too low". 260 watts on a 2070S - no worries there, but a 5700 XT having a wide-open power limit is a bad thing...
 

Member (joined Feb 18, 2005):
W1zzard has never said anything of the sort.
Member (joined Jun 28, 2016):
Member (joined Jul 25, 2017) said: "And how can RTX raytracing still be a positive when there are only about three games supporting it after 13 months?"
Because RTRT will become a standard quite soon and RTX makes you future-proof more than anything else right now.
BTW, not 3:
Member (joined Jul 25, 2017) said: "Why is DLSS a pro when it's shown to be inferior to AMD Image Sharpening?"
First of all, this makes no sense to begin with: an NVIDIA GPU can't have AMD RIS. So this point is not about DLSS being better than anything else on the planet. It's about having or not having DLSS on board.

Second, AMD RIS and DLSS do something very different.
In more general graphics-editing terms, DLSS is like denoising. That's why it works well in tandem with RTX; they lead to more realistic graphics. A DLSS implementation should be content-aware and not sharpen areas that aren't meant to be sharp.
Some say that it makes the picture less detailed, because people tend to perceive noise (granularity) as sharpness.
That's connected to how some people prefer smartphone photos over large-sensor camera photos: it's noise + sharpening.

AMD RIS is just unsharp masking like you would find in graphics editors. It's applied at the end of the pipeline, and it makes the picture crisper all round.
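(A minimal sketch of unsharp masking, assuming NumPy and SciPy; this is the generic graphics-editor technique being described, not AMD's actual shader:)

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, sigma: float = 1.5,
                 amount: float = 0.8) -> np.ndarray:
    # sharpened = original + amount * (original - blurred)
    img = image.astype(np.float32)
    blurred = gaussian_filter(img, sigma=sigma)
    sharpened = img + amount * (img - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# The high-frequency detail (original minus blurred) is boosted
# uniformly, which is why it crisps everything, noise included.
```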

The big caveat is that, while straightforward sharpening is a fairly light load and can be done using normal GPGPU with little lag (NVIDIA and Intel do it as well, obviously), denoising can be quite complex computationally. That's why NVIDIA uses the fancy tensor cores in RTX cards.

Basically you've rephrased a popular accusation that RTX and DLSS make games look less sharp and darker. Which is true.
 
Member (joined Oct 18, 2019):
Use a credit card.

Buy the 2080 Ti instead. Just pay it back over time.
 

(Attachments: 2080ti resized (2).jpg, 2080ti resized (3).jpg)
Member (joined Aug 27, 2015):
Member (joined Jun 28, 2016) said: "AMD RIS is just unsharp masking like you would find in graphics editors. It's applied at the end of the pipeline, and it makes the picture crisper all round. [...]"
Image sharpening is not even exclusive to AMD or NVIDIA or Intel. It's just a post-processing effect that can be done on any video card. I have been using sharpening filters to sharpen blurry-looking games for years with a very useful program called ReShade, which gives you much more control and adjustability than AMD's solution and can do much more than just image sharpening. Now that AMD has included this post-processing effect in their driver, it's suddenly the best thing in the world.
 

Member (joined Jun 28, 2016):
Member (joined Aug 27, 2015) said: "Image sharpening is not even exclusive to AMD or NVIDIA or Intel. [...] Now that AMD has included this post-processing effect in their driver, it's suddenly the best thing in the world."
Making products that start to shine only when you tinker and install additional software is exactly what AMD has been doing wrong.
I'm absolutely not against AMD introducing RIS, so you have one less thing to take care of.
And it actually works as intended as well: the image is sharper.

But it's still all down to how people perceive game image quality.
There's a group that wants absolute sharpness, good visibility and high FPS. And they just don't understand why RTRT even exists. It makes no sense to them.
AMD is not doing anything explicitly, but they're trying to cater to this group by adding things like RIS.

The problem is that this status quo will collapse in 1-2 years, when AMD adds RTRT to their GPUs.
Suddenly, no matter which brand you choose, you'll be paying a premium for RTRT. So the group that praises AMD today (or rather: criticizes NVIDIA) will be left behind. They'll have to either go with the times or turn a big chunk of their GPU off.
 
Member (joined Sep 9, 2015):
Member (joined Feb 18, 2005) said: "W1zzard has never said anything of the sort."
Have you not read any of the AMD AIB reviews? In every single review it is stated in the cons:

"Large increase in power consumption, power efficiency lost"

This is why AIB models have huge coolers, large power targets and come with higher stock volts - so people can easily OC them, and this is what many have come to expect from AIB cards.

But in NVIDIA card reviews it's the opposite: "power target not increased".

You just can't have your cake and eat it too...
 

Member (joined Aug 27, 2015):
Member (joined Sep 9, 2015) said: "This is why AIB models have huge coolers, large power targets and come with higher stock volts - so people can easily OC them [...] But in NVIDIA card reviews it's the opposite: 'power target not increased'."
The problem with those 5700 XT cards is they don't show much better performance despite having much more power draw. If more power draw does not result in better performance or better overclocking, then what's the benefit of it?

For example, the MSI 2070 Gaming Z Trio gives 20 FPS more after overclocking compared to the stock 2070 Super FE.
(chart: overclocked-performance.png)

But the MSI 5700 XT Gaming X gives only 8 FPS more after overclocking compared to the stock AMD 5700 XT, despite having a much higher power limit.
(chart: overclocked-performance (2).png)

This is just a waste of energy and generates heat for nothing. How can it be a positive thing?
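(To make the efficiency argument concrete, a quick perf-per-watt sketch; the +20 / +8 FPS gains echo the charts above, while all wattages are invented for illustration:)

```python
# Rough perf-per-watt framing with hypothetical numbers.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

# (fps, watts) stock vs. overclocked, invented figures
nv_stock, nv_oc   = (100, 215), (120, 260)
amd_stock, amd_oc = (100, 220), (108, 270)

nv_change  = perf_per_watt(*nv_oc)  / perf_per_watt(*nv_stock)
amd_change = perf_per_watt(*amd_oc) / perf_per_watt(*amd_stock)
print(f"NVIDIA OC efficiency: {nv_change:.2f}x of stock")  # ~0.99x
print(f"AMD OC efficiency:    {amd_change:.2f}x of stock") # ~0.88x
# A big power bump that buys little extra performance shows up
# directly as an efficiency loss, which is the complaint here.
```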
 

Member (joined Sep 17, 2014):
Member (joined Feb 18, 2005) said: "Nice card, shame about the price - why isn't that a negative? And when did triple-slot coolers become the norm? [...] Finally, cards that ship with stupidly overbuilt cooling solutions but no boost to the default power limit should be penalised, because seriously, what's the point?"

Dude, the card gets to 69°C at full load and is quieter than your average case fan. What's the point?! That's the point! 29 dB is pretty damn awesome. Power limit or no, you can crank up the core voltage and PT to whatever it gives and still keep all, or most, of your boost bins too. And still not hear it. I'll take that over the 5% that *might* be in the tank with a higher power limit any day of the week, tbh...

Member (joined Aug 27, 2015) said: "I have been using sharpening filters to sharpen blurry-looking games for years with a very useful program called ReShade [...]"

Sharpening is not what DLSS is at all.

DLSS does not produce the typical sharpening artifacts, because it's not sharpening; it's an internal upscale, sampled to the target resolution, so to speak. Kind of like a reversed DSR aimed at efficiency at the cost of accuracy.
Sharpening is increasing the contrast between edges to make them stand out more; you can do this with a Photoshop filter too.
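(A sketch of the distinction, assuming NumPy/SciPy; the kernel is the classic local-contrast sharpen, and the upscale stand-in is plain interpolation, not DLSS's learned model:)

```python
import numpy as np
from scipy.ndimage import convolve, zoom

# Classic 3x3 local-contrast sharpen (the "Photoshop filter" case):
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=np.float32)

def sharpen(img: np.ndarray) -> np.ndarray:
    # Boosts edge contrast; pixel count is unchanged.
    return convolve(img.astype(np.float32), SHARPEN)

def upscale_stand_in(img: np.ndarray, factor: float = 4 / 3) -> np.ndarray:
    # Stand-in for "render low, resample to target res". DLSS itself uses
    # a learned reconstruction, not simple bilinear interpolation.
    return zoom(img, factor, order=1)
```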
 

Member (joined Jul 25, 2017):
Member (joined Jun 28, 2016) said: "Because RTRT will become a standard quite soon and RTX makes you future-proof more than anything else right now. BTW, not 3: [...]"

And how many of these games are actually released yet? I couldn't care less if you showed me a list with 10,000 games that are going to support RTX down the line.
If you announce a new feature and promote it like the best thing since the invention of the wheel, and after 13 months you still only have 3 games, how is that a pro when you are paying extra for a feature that is not really supported yet?

And RTX doesn't make the 20xx series future-proof? Far from it. When RTX becomes "standard", the 20xx RTX cards will be obsolete unless you own a 2080S/2080 Ti. With the RTX 30xx series you will probably get 2080 Ti RTX performance in the 3060. So how future-proof are the 2060/2070 going to be in 1-2 years when they are already struggling to do 1080p/60 FPS in the 3 games that already support RTX?
 