
NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB

Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The only thing to complain about is the price, and it's fine to let Nvidia know it's too high.
But the price doesn't change the fact that Turing is a great achievement in performance gains, efficiency, etc. A 39% gain over the previous generation is a substantial step forward, and is still more than most generational changes in the past. If AMD had achieved this, no one would complain at all.

What's not to understand?
Nvidia introduces CUDA, it catches flak for using proprietary technologies.
Nvidia introduces G-Sync, it catches flak for using proprietary technologies.
Nvidia introduces GameWorks, it catches flak for using proprietary technologies.
Nvidia introduces Pascal, it catches flak for introducing a mere refinement over Maxwell.
Nvidia introduces Turing, it catches flak for RTX not being in widespread use already.
Is the pattern more obvious now? ;)

AMD introduces Mantle, a proprietary AMD-only alternative to Direct3D and OpenGL with "support" in a handful of games. Even after it turned out that Mantle only offered marginal gains for low-end APUs and would never catch on among developers, AMD is still praised.

I wonder if there is a double standard?
 
Joined
Jan 6, 2013
Messages
349 (0.08/day)
Only DLSS can save Nvidia. They bet on a large die with this generation and they cannot sell it for peanuts. They simply cannot afford to sell a 2080 Ti at 1080 Ti prices, because they would make almost no profit, since the 2080 Ti is on a newer node and its die is much bigger than the 1080 Ti's (because of the RT and tensor cores). If DLSS can add at least 25% more performance, they might sell pretty well.
 

bug

Joined
May 22, 2015
Messages
13,228 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@efikkan Yeah, "price is too high" is an understatement.
But even that is only in part because of greed and/or lack of competition. The price is high because chips this large are that expensive.
Like you said, it's OK to let Nvidia know they went too far, because at the end of the day, justified or not, these prices mean extremely few gamers will be able to afford these cards. If anything, low sales will convey the message better than any forum whining.
 
Joined
Jun 28, 2018
Messages
299 (0.14/day)
I mean Vega was essentially AMD's first time in the high end GPU market because previously they just heavily saturated the mid-tier market, but how did AMD do? Well Sapphire's Nitro+ Vega 64 consistently beats the GTX 1080 FE so I think AMD did pretty good for their first time in the high end market.

Are you serious? Vega was the first high-end card from AMD?

What about Fury X? What about 290X? What about HD7970/Ghz Edition? What about the multiple dual-GPU cards they released?

AMD has been competing in the high-end market for a long time.
 
Joined
Jan 17, 2018
Messages
375 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
The only thing to complain about is the price, and it's fine to let Nvidia know it's too high.
But the price doesn't change the fact that Turing is a great achievement in performance gains, efficiency, etc. A 39% gain over the previous generation is a substantial step forward, and is still more than most generational changes in the past. If AMD had achieved this, no one would complain at all.


Nvidia introduces CUDA, it catches flak for using proprietary technologies.
Nvidia introduces G-Sync, it catches flak for using proprietary technologies.

AMD introduces Mantle, a proprietary AMD-only alternative to Direct3D and OpenGL with "support" in a handful of games. Even after it turned out that Mantle only offered marginal gains for low-end APUs and would never catch on among developers, AMD is still praised.

I wonder if there is a double standard?


So, CUDA I don't know enough about, so I won't comment. However, FreeSync is essentially just AMD's name for Adaptive-Sync, a VESA standard. It's something Nvidia could support, but instead it chooses to continue with G-Sync, which is basically identical except for the added price premium.

Your statement about Mantle is entirely false though. Mantle was not designed to be a proprietary AMD-only API. Vulkan is what we got from AMD's API development, an open-standard API that every single gamer should support (unless they work for Microsoft).
 
Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
I mean Vega was essentially AMD's first time in the high end GPU market because previously they just heavily saturated the mid-tier market, but how did AMD do? Well Sapphire's Nitro+ Vega 64 consistently beats the GTX 1080 FE so I think AMD did pretty good for their first time in the high end market.

Navi is supposed to be even better than Vega with next-gen memory, and I guarantee you that Navi will be cheaper than the new Nvidia cards. Unless you are solely gaming at 4K, the GTX 1080 Ti and the RTX 2080 Ti are complete overkill. I personally think that 1440p is the ideal resolution for gaming, and both the Vega 56 and 64 handle 1440p like a champ.
So the 9800 Pro, 2900 XT, 4870, 4870 X2, 5870, 5970, 6970, 6990, 7970, 290X, 295X2, Fury X, etc., do NONE of those exist anymore?
 
Joined
Jun 10, 2014
Messages
2,902 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
@efikkan Yeah, "price is too high" is an understatement.

But even that is only in part because of greed and/or lack of competition. The price is high because chips this large are that expensive.

Like you said, it's OK to let Nvidia know they went too far, because at the end of the day, justified or not, these prices mean extremely few gamers will be able to afford these cards. If anything, low sales will convey the message better than any forum whining.
We still have to remember that Nvidia is trying to dump the remaining stock of Pascal cards as well. It remains to be seen whether the price stays unchanged in a few months as the Pascal cards run out.

Nvidia should be cautious though, driving up the cost of desktop gaming might hurt the long-term gains we've seen in user base vs. consoles.

Still, we have to remember that AMD have done similar things that have been long forgotten. When Vega launched, it was priced $100 over "MSRP" with "$100 worth of games included", but they quickly changed this after massive criticism.

So, CUDA I don't know enough about, so I won't comment. However, FreeSync is essentially just AMD's name for Adaptive-Sync, a VESA standard. It's something Nvidia could support, but instead it chooses to continue with G-Sync, which is basically identical except for the added price premium.

Your statement about Mantle is entirely false though. Mantle was not designed to be a proprietary AMD-only API. Vulkan is what we got from AMD's API development, an open-standard API that every single gamer should support (unless they work for Microsoft).
G-Sync still relies on adaptive sync, with advanced processing on the monitor side, which FreeSync doesn't have. Whether it's worth it or not is up to you, but they are not 100% comparable.

You're wrong about Mantle. It was a proprietary API designed around GCN. Direct3D and Vulkan inherited some of the function names from it, but not the underlying structure; they are only similar on the surface. Vulkan is built on SPIR-V, which is derived from OpenCL. The next version of Direct3D plans to copy this compiler structure from Vulkan, which is something Mantle lacks completely.
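To illustrate what "built on SPIR-V" means in practice, here's a minimal sketch of how a Vulkan application consumes a shader: the GLSL/HLSL source is compiled offline (for example with glslangValidator) into SPIR-V bytecode, and the driver only ever sees that intermediate representation. The file name and the already-created VkDevice are assumptions for the example, not anything from the discussion above.

```cpp
// Minimal sketch: loading a precompiled SPIR-V binary into Vulkan.
// Assumes a valid VkDevice already exists (instance/device setup omitted)
// and that "shader.frag.spv" was produced offline, e.g.:
//   glslangValidator -V shader.frag -o shader.frag.spv
#include <vulkan/vulkan.h>
#include <cstdint>
#include <fstream>
#include <stdexcept>
#include <vector>

static std::vector<char> readFile(const char* path) {
    std::ifstream f(path, std::ios::ate | std::ios::binary);
    if (!f) throw std::runtime_error("failed to open SPIR-V file");
    const size_t size = static_cast<size_t>(f.tellg());
    std::vector<char> buffer(size);
    f.seekg(0);
    f.read(buffer.data(), static_cast<std::streamsize>(size));
    return buffer;
}

VkShaderModule createShaderModule(VkDevice device, const char* spvPath) {
    const std::vector<char> code = readFile(spvPath);  // raw SPIR-V bytecode
    VkShaderModuleCreateInfo info{};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = code.size();  // size in bytes
    info.pCode    = reinterpret_cast<const uint32_t*>(code.data());

    VkShaderModule module = VK_NULL_HANDLE;
    if (vkCreateShaderModule(device, &info, nullptr, &module) != VK_SUCCESS)
        throw std::runtime_error("vkCreateShaderModule failed");
    return module;  // destroy later with vkDestroyShaderModule(device, module, nullptr)
}
```

The point is the separation: the driver ingests a portable intermediate representation instead of compiling shader source itself, which is the compiler structure the post above says Mantle lacked.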
 
Joined
Apr 17, 2017
Messages
401 (0.16/day)
Processor AMD Ryzen 1700X 8-Core 3.4Ghz
Motherboard ASRock X370 Taichi
Cooling Thermalright Le Grand Macho RT, 2x front Aerocool DS 140mm fans, 1x rear Aerocool DS 140mm fan
Memory 16GB G.Skill Flare X DDR4 3200Mhz
Video Card(s) PowerColor Red Devil Vega 64
Storage Samsung 960 Evo 500GB NVME SSD - Boot drive, Samsung 850 Evo 1TB SSD - Main storage
Display(s) Acer XG270HU Red 27" 1ms 144HZ WQHD 2K FreeSync Gaming Monitor
Case Fractal Design Define R6 USB C-Type Blackout Edition (Non TG)
Audio Device(s) Sennheiser GSP 301 Headset, Sennheiser Game Zero Special Edition Headset, Logitech Z623 System
Power Supply Seasonic AirTouch 850w 80 Plus Gold
Mouse SteelSeries Rival
Keyboard Razer Ornata Chroma
Software Windows 10 Professional 64bit
Are you serious? Vega was the first hig-end card from AMD?

What about Fury X? What about 290X? What about HD7970/Ghz Edition? What about the multiple dual-GPU cards they released?

AMD has been competing in the high-end market for a long time.

So neither the Fury X nor the 290X (or its multiple revisions) is considered high end, though they may be in the upper echelon of mid-tier cards. By today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels. The Fury X's performance is slightly above a GTX 1060 6GB, but it is nowhere near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB, which is a mid-tier card. If these cards had been released 4-6 years ago, that would be a different story, but when they were released they weren't really high end.

The HD7970 is a different story because for its time it was definitely high end, but that was also 7 years ago. AMD hasn't made high-end GPUs for a very long time, so I guess it was a poor choice of wording on my end; what I should have said was that Vega was essentially AMD's return to the high-end GPU market. Would that be more acceptable?
 
Joined
Jan 8, 2017
Messages
8,943 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
CUDA means absolutely nothing to the average consumer.

So neither the Fury X nor the 290X (or its multiple revisions) is considered high end,

Absolutely not true. The 290X was faster than anything else at the time of release; you cannot get more high end than that.

I think you are confusing it with something else.
 
Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
So neither the Fury X nor the 290X (or its multiple revisions) is considered high end, though they may be in the upper echelon of mid-tier cards. By today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels. The Fury X's performance is slightly above a GTX 1060 6GB, but it is nowhere near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB, which is a mid-tier card. If these cards had been released 4-6 years ago, that would be a different story, but when they were released they weren't really high end.

The HD7970 is a different story because for its time it was definitely high end, but that was also 7 years ago. AMD hasn't made high-end GPUs for a very long time, so I guess it was a poor choice of wording on my end; what I should have said was that Vega was essentially AMD's return to the high-end GPU market. Would that be more acceptable?
So Nvidia never competed in the high end either, because the 780 Ti wasn't anywhere near the 1070, right?

I think you just don't understand what the term high-end GPU means.

The 290X was faster than the 780, and traded blows with the 780 Ti. In case you don't remember, the 290X made Nvidia panic, as the 780 Ti was not slated for a consumer release and was rushed out after the 290X turned out to be way better than the 780. The Fury X was trading blows with the 980 Ti after a few months of driver updates. Ergo, those cards are "high end". The 9800 Pro was king of the hill for over a year after its launch, and absolutely embarrassed Nvidia. The 5870 at launch was the fastest GPU available.

Also, it's interesting that you have gone from "AMD never competed in the high end" to "Well, AMD DID compete in the high end, but that was like 7 years ago". Which is it?
 
Joined
Jun 28, 2018
Messages
299 (0.14/day)
So neither the Fury X nor the 290X (or its multiple revisions) is considered high end, though they may be in the upper echelon of mid-tier cards. By today's standards, you don't start hitting the high-end GPU market until you get to GTX 1070 levels. The Fury X's performance is slightly above a GTX 1060 6GB, but it is nowhere near a GTX 1070. The 290X and its family can't even compete with the GTX 1060 6GB, which is a mid-tier card. If these cards had been released 4-6 years ago, that would be a different story, but when they were released they weren't really high end.

The HD7970 is a different story because for its time it was definitely high end, but that was also 7 years ago. AMD hasn't made high-end GPUs for a very long time, so I guess it was a poor choice of wording on my end; what I should have said was that Vega was essentially AMD's return to the high-end GPU market. Would that be more acceptable?

I ask again, are you serious or trolling at this point?
 
Joined
Apr 17, 2017
Messages
401 (0.16/day)
Processor AMD Ryzen 1700X 8-Core 3.4Ghz
Motherboard ASRock X370 Taichi
Cooling Thermalright Le Grand Macho RT, 2x front Aerocool DS 140mm fans, 1x rear Aerocool DS 140mm fan
Memory 16GB G.Skill Flare X DDR4 3200Mhz
Video Card(s) PowerColor Red Devil Vega 64
Storage Samsung 960 Evo 500GB NVME SSD - Boot drive, Samsung 850 Evo 1TB SSD - Main storage
Display(s) Acer XG270HU Red 27" 1ms 144HZ WQHD 2K FreeSync Gaming Monitor
Case Fractal Design Define R6 USB C-Type Blackout Edition (Non TG)
Audio Device(s) Sennheiser GSP 301 Headset, Sennheiser Game Zero Special Edition Headset, Logitech Z623 System
Power Supply Seasonic AirTouch 850w 80 Plus Gold
Mouse SteelSeries Rival
Keyboard Razer Ornata Chroma
Software Windows 10 Professional 64bit
CUDA means absolutely nothing to the average consumer.



Absolutely not true. The 290X was faster than anything else at the time of release; you cannot get more high end than that.

I think you are confusing it with something else.

I mean, the 290X was not faster than the GTX 780 Ti, and they were released within a month of each other. After the GTX 780 Ti, Nvidia went on to the 900 series, and how did AMD respond? By rehashing the 290X as the 390X.
 
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
If these cards had been released 4-6 years ago, that would be a different story, but when they were released they weren't really high end.

Did I miss the Fury or 290X release the other day? I could have sworn I read something about them many, many moons ago.
 
Joined
Apr 17, 2017
Messages
401 (0.16/day)
Processor AMD Ryzen 1700X 8-Core 3.4Ghz
Motherboard ASRock X370 Taichi
Cooling Thermalright Le Grand Macho RT, 2x front Aerocool DS 140mm fans, 1x rear Aerocool DS 140mm fan
Memory 16GB G.Skill Flare X DDR4 3200Mhz
Video Card(s) PowerColor Red Devil Vega 64
Storage Samsung 960 Evo 500GB NVME SSD - Boot drive, Samsung 850 Evo 1TB SSD - Main storage
Display(s) Acer XG270HU Red 27" 1ms 144HZ WQHD 2K FreeSync Gaming Monitor
Case Fractal Design Define R6 USB C-Type Blackout Edition (Non TG)
Audio Device(s) Sennheiser GSP 301 Headset, Sennheiser Game Zero Special Edition Headset, Logitech Z623 System
Power Supply Seasonic AirTouch 850w 80 Plus Gold
Mouse SteelSeries Rival
Keyboard Razer Ornata Chroma
Software Windows 10 Professional 64bit
So Nvidia never competed in the high end either, because the 780 Ti wasn't anywhere near the 1070, right?

I think you just don't understand what the term high-end GPU means.

The 290X was faster than the 780, and traded blows with the 780 Ti. In case you don't remember, the 290X made Nvidia panic, as the 780 Ti was not slated for a consumer release and was rushed out after the 290X turned out to be way better than the 780. The Fury X was trading blows with the 980 Ti after a few months of driver updates. Ergo, those cards are "high end". The 9800 Pro was king of the hill for over a year after its launch, and absolutely embarrassed Nvidia. The 5870 at launch was the fastest GPU available.

Also, it's interesting that you have gone from "AMD never competed in the high end" to "Well, AMD DID compete in the high end, but that was like 7 years ago". Which is it?

The Fury X got absolutely stomped on by the GTX 980 Ti... it wasn't really trading blows.

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498

The last AMD card that I had was the R9 390X, which was released around the same time as the GTX 980 Ti, but Nvidia's card was a lot better in terms of performance. The Fury X got smoked, and the 290X/390X got smoked. I sold my R9 390X and got a GTX 1070, yet I am about to sell my GTX 1070 because I just got a Vega 64. Either way you spin it, it boils down to this: AMD hasn't been competitive in the high-end GPU market for 3+ years or so.
 
Joined
Jan 8, 2017
Messages
8,943 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I mean, the 290X was not faster than the GTX 780 Ti, and they were released within a month of each other. After the GTX 780 Ti, Nvidia went on to the 900 series, and how did AMD respond? By rehashing the 290X as the 390X.

That has nothing to do with your claim that the 290X wasn't high end, which is undoubtedly not true.

The Fury X got absolutely stomped on by the GTX 980 Ti... it wasn't really trading blows.

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498

Userbenchmark, ehm... OK. Though have a look at TPU's own rankings:

[attachment: TPU relative performance chart]

7%. We certainly have different definitions of "stomped"; still a high-end card nonetheless.
 
Joined
Apr 17, 2017
Messages
401 (0.16/day)
Processor AMD Ryzen 1700X 8-Core 3.4Ghz
Motherboard ASRock X370 Taichi
Cooling Thermalright Le Grand Macho RT, 2x front Aerocool DS 140mm fans, 1x rear Aerocool DS 140mm fan
Memory 16GB G.Skill Flare X DDR4 3200Mhz
Video Card(s) PowerColor Red Devil Vega 64
Storage Samsung 960 Evo 500GB NVME SSD - Boot drive, Samsung 850 Evo 1TB SSD - Main storage
Display(s) Acer XG270HU Red 27" 1ms 144HZ WQHD 2K FreeSync Gaming Monitor
Case Fractal Design Define R6 USB C-Type Blackout Edition (Non TG)
Audio Device(s) Sennheiser GSP 301 Headset, Sennheiser Game Zero Special Edition Headset, Logitech Z623 System
Power Supply Seasonic AirTouch 850w 80 Plus Gold
Mouse SteelSeries Rival
Keyboard Razer Ornata Chroma
Software Windows 10 Professional 64bit
That has nothing to do with your claim that the 290X wasn't high end, which is undoubtedly not true.



Userbenchmark, ehm... OK. Though have a look at TPU's own rankings:

[attachment: TPU relative performance chart]

7%. We certainly have different definitions of "stomped"; still a high-end card nonetheless.

Instead of showing me some useless benchmark, why don't you look at this and compare and contrast?

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-AMD-R9-Fury-X/3439vs3498

The funny thing is that I am an AMD fan, but until Vega they hadn't really been relevant in the high-end GPU market. Why buy a Fury X when I could buy a GTX 980 Ti that is not only cheaper but completely outperforms it?
 
Joined
Jan 17, 2018
Messages
375 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
G-Sync still relies on adaptive sync, with advanced processing on the monitor side, which FreeSync doesn't have. Whether it's worth it or not is up to you, but they are not 100% comparable.

You're wrong about Mantle. It was a proprietary API designed around GCN. Direct3D and Vulkan inherited some of the function names from it, but not the underlying structure; they are only similar on the surface. Vulkan is built on SPIR-V, which is derived from OpenCL. The next version of Direct3D plans to copy this compiler structure from Vulkan, which is something Mantle lacks completely.

G-Sync and FreeSync are absolutely directly comparable. They do the same basic thing, synchronizing a monitor's refresh rate with the frames the GPU is producing; that's it. There are multiple videos, both objective and subjective, that show the hardware Nvidia adds simply isn't needed. Simply put, Nvidia doesn't need to use G-Sync modules.

As for Mantle, we know AMD gave their Mantle code to Khronos. Given where AMD was in the market in 2014, it's unlikely they thought they could get away with Mantle being a proprietary AMD-only API. Mantle was essentially the awakening of low-level APIs, and it would make more sense to praise AMD for developing Mantle than to criticize them for it.

AMD simply doesn't get a lot of flak because they support many open standards (Adaptive-Sync, OpenCL, etc.), and maybe, given their market position, they need to; the same cannot be said about Nvidia (G-Sync, CUDA, etc.).
 
Joined
Jan 8, 2017
Messages
8,943 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Thanks for spamming the same link, didn't notice it the first time. Still quite irrelevant, but have it your way: the 980 Ti stomped the Fury X.
Joined
Oct 10, 2009
Messages
929 (0.17/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
Performs well, but Nvidia went overboard with the prices. Also, I can't help but wonder how much better these would have been on 10 nm. Lower power usage and higher clocks? Ten to twenty percent more performance?
 
Joined
Nov 2, 2013
Messages
461 (0.12/day)
System Name Auriga
Processor Ryzen 7950X3D w/ aquacomputer cuplex kryos NEXT with VISION - acrylic/nickel
Motherboard Asus ROG Strix X670E-E Gaming WiFi
Cooling Alphacool Res/D5 Combo •• Corsair XR7 480mm + Black Ice Nemesis 360GTS radiators •• 7xNF-A12 chromax
Memory 2x 32GB G.Skill Trident Z5 Neo RGB @ 6200MHz, 30-40-40-28, 1.35V (F5-6000J3040G32GX2-TZ5NR)
Video Card(s) MSI RTX 4090 Suprim Liquid X w/ Bykski waterblock
Storage 2TB WD Black SN850X ••• 2TB Corsair M510 ••• 40TB QNAP NAS via SFP+ NIC
Display(s) Alienware AW3423DWF (3440x1440, 10-bit @ 139Hz)
Case Thermaltake Core P8
Power Supply Corsair AX1600i
Mouse Razer Viper V2 Pro (FPS games) + Logitech MX Master 2S (everything else)
Keyboard Keycult No2 rev 1 w/Amber Alps and TX stabilizers on a steel plate. DCS 9009 WYSE keycaps
Software W10 X64 Pro
Benchmark Scores https://valid.x86.fr/c3rxw7
So, AdoredTV was right, or was he not?
Adored was wrong. This is WORSE; he predicted a +40% increase over the GTX 1080 Ti, but obviously that was leaked from Nvidia's "testing guidelines".

Adored was not the only one who predicted this, everyone did. Everyone was saying the 2080 Ti would be 35-40% faster than the 1080 Ti. This review showed it was 38% faster; other reviewers got 35% or so. Seems spot on to me. The 1080p and 1440p benches don't count, as many of them were bottlenecked by the CPU, and in the case of Doom a lot of the cards were hitting the 200 FPS hard limit. If you were picking either of these up for 1080p, you need to get your head examined. The 2080 Ti still makes sense for ultrawide monitors (performance-wise) if you absolutely must get 100+ FPS in every game with everything maxed out.

Only the 2080 Ti is amazing; the regular 2080 has no place in the market at this moment. It trades blows with the 1080 Ti, which you can get for far less money.

Are we comparing prices of used 1080 Tis? In the US, new 1080 Tis cost about $700, while a 2080 can be picked up for as little as $750, and custom models with beefy coolers start at $800 (links in prices). So for $50-$100 extra you get a card that's a little faster, BUT most importantly it does not choke in properly optimized DX12 titles and async compute, which has been a problem for Nvidia for a while now. In productivity benchmarks, both Turing cards destroy everything. If I were getting into high-end GPUs now, I would buy a 2080 over a 1080 Ti. If you're already spending $700+, what's another $50 for a card that won't get owned by $500 AMD cards anytime async is involved?


And you can't blame Nvidia for these prices either. We, the consumers, are not obligated to buy anything. However, Nvidia's board partners still have millions of Pascal cards they have to move, and these high RTX prices will allow them to do so. If the 2080 Ti were the same price as the 1080 Ti, a lot of the partners would take massive hits to their profits.
 
Joined
Apr 17, 2017
Messages
401 (0.16/day)
Processor AMD Ryzen 1700X 8-Core 3.4Ghz
Motherboard ASRock X370 Taichi
Cooling Thermalright Le Grand Macho RT, 2x front Aerocool DS 140mm fans, 1x rear Aerocool DS 140mm fan
Memory 16GB G.Skill Flare X DDR4 3200Mhz
Video Card(s) PowerColor Red Devil Vega 64
Storage Samsung 960 Evo 500GB NVME SSD - Boot drive, Samsung 850 Evo 1TB SSD - Main storage
Display(s) Acer XG270HU Red 27" 1ms 144HZ WQHD 2K FreeSync Gaming Monitor
Case Fractal Design Define R6 USB C-Type Blackout Edition (Non TG)
Audio Device(s) Sennheiser GSP 301 Headset, Sennheiser Game Zero Special Edition Headset, Logitech Z623 System
Power Supply Seasonic AirTouch 850w 80 Plus Gold
Mouse SteelSeries Rival
Keyboard Razer Ornata Chroma
Software Windows 10 Professional 64bit
Thanks for spamming the same link, didn't notice it the first time. Still quite irrelevant, but have it your way: the 980 Ti stomped the Fury X.

It's completely relevant because it shows that the 980 Ti beats the Fury X (sometimes significantly) in almost every performance category. The Fury X can't even run The Witcher 3 on max settings at 1080p at a stable 60 FPS, but the 980 Ti can.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
HAHAHAHA oh wow. That is just pathetic.

To top that off, we have the clueless hordes here crying that a $1,200 card (2080 Ti) that's twice as fast as a $600 card (Vega 64) is overpriced. Who wants to bet half of them will own a Turing card in 6 months' time?


Man, the Red Guards are out in droves this time.

When RTG GPU delivers: OMFG world peace happily ever after

When RTG GPU sucks: They have smaller budget, underdog, FineWine(tm) etc etc

When NV GPU delivers: We demand 1000fps at 8k for $200, while still complaining about “big bad green”

When NV GPU sucks: Hell yeah death the devil green!

Meanwhile in reality when you check their system profiles, most of them rock a Nvidia “evil” GPU

All you haters should go buy a Vega64 to support RTG today.
 
Joined
Feb 8, 2017
Messages
198 (0.08/day)
Seems like your numbers are quite a ways off; for example, Hardware Unboxed, on the same games and resolutions, found the RTX 2080 much closer to the GTX 1080 Ti and even losing in two games.

Again, maybe you should recheck your results, as they seem off compared to other, more reputable reviewers.

The cards themselves are impressive, no doubt about it, pure performance-wise. Value-wise they are really bad; in fact, for the price increase they effectively deliver worse performance per dollar. Both the RTX 2080 and RTX 2080 Ti offer worse value than the GTX 1080 and GTX 1080 Ti, respectively.
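To put a rough number on that "worse value" point, here's a quick back-of-the-envelope sketch. It assumes the launch prices mentioned in this thread (about $699 for the 1080 Ti and $1,199 for the 2080 Ti Founders Edition) and the roughly 38% average uplift quoted from the review; the exact figures are illustrative assumptions, not review data.

```cpp
// Rough performance-per-dollar comparison, GTX 1080 Ti vs RTX 2080 Ti.
// Inputs are assumptions: ~38% average uplift and launch prices cited in this thread.
#include <cstdio>

int main() {
    const double perf1080Ti  = 1.00;    // relative performance baseline
    const double perf2080Ti  = 1.38;    // ~38% faster on average (assumed)
    const double price1080Ti = 699.0;   // USD launch price (assumed)
    const double price2080Ti = 1199.0;  // USD Founders Edition price (assumed)

    const double value1080Ti = perf1080Ti / price1080Ti;  // performance per dollar
    const double value2080Ti = perf2080Ti / price2080Ti;

    std::printf("perf/$ change vs 1080 Ti: %+.0f%%\n",
                (value2080Ti / value1080Ti - 1.0) * 100.0);
    return 0;
}
```

With those numbers the 2080 Ti comes out to roughly 20% less performance per dollar than the 1080 Ti had at launch, which is exactly what people mean when they say the value went backwards this generation.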
 
Joined
Nov 2, 2013
Messages
461 (0.12/day)
System Name Auriga
Processor Ryzen 7950X3D w/ aquacomputer cuplex kryos NEXT with VISION - acrylic/nickel
Motherboard Asus ROG Strix X670E-E Gaming WiFi
Cooling Alphacool Res/D5 Combo •• Corsair XR7 480mm + Black Ice Nemesis 360GTS radiators •• 7xNF-A12 chromax
Memory 2x 32GB G.Skill Trident Z5 Neo RGB @ 6200MHz, 30-40-40-28, 1.35V (F5-6000J3040G32GX2-TZ5NR)
Video Card(s) MSI RTX 4090 Suprim Liquid X w/ Bykski waterblock
Storage 2TB WD Black SN850X ••• 2TB Corsair M510 ••• 40TB QNAP NAS via SFP+ NIC
Display(s) Alienware AW3423DWF (3440x1440, 10-bit @ 139Hz)
Case Thermaltake Core P8
Power Supply Corsair AX1600i
Mouse Razer Viper V2 Pro (FPS games) + Logitech MX Master 2S (everything else)
Keyboard Keycult No2 rev 1 w/Amber Alps and TX stabilizers on a steel plate. DCS 9009 WYSE keycaps
Software W10 X64 Pro
Benchmark Scores https://valid.x86.fr/c3rxw7
Man, the Red Guards are out in droves this time.

When RTG GPU delivers: OMFG world peace happily ever after

When RTG GPU sucks: They have smaller budget, underdog, FineWine(tm) etc etc

When NV GPU delivers: We demand 1000fps at 8k for $200, while still complaining about “big bad green”

When NV GPU sucks: Hell yeah death the devil green!

Meanwhile in reality when you check their system profiles, most of them rock a Nvidia “evil” GPU

All you haters should go buy a Vega64 to support RTG today.

I wish I could like this 1,000 times. I've owned more AMD hardware than any of the morons calling me an Nvidia shill. I put a Vega 64 into my VR rig over a 1080 not because it's faster (it is slower), but because I wanted to support the little guy. Meanwhile, they're all salivating over used 1080 Ti mining cards on eBay.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Seems like your numbers are quite a ways off; for example, Hardware Unboxed, on the same games and resolutions, found the RTX 2080 much closer to the GTX 1080 Ti and even losing in two games.

Again, maybe you should recheck your results, as they seem off compared to other, more reputable reviewers.

The cards themselves are impressive, no doubt about it, pure performance-wise. Value-wise they are really bad; in fact, for the price increase they effectively deliver worse performance per dollar. Both the RTX 2080 and RTX 2080 Ti offer worse value than the GTX 1080 and GTX 1080 Ti, respectively.


You call a well-established reviewer and review site with 10+ years of history less reputable than some self-proclaimed "YouTuber"? WTF are you smoking? Seriously?
 