
AMD Big Navi GPU Features Infinity Cache?

AleksandarK

Staff member
Joined
Aug 19, 2017
Messages
715 (0.61/day)
As we near the launch of AMD's highly hyped, next-generation RDNA 2 GPU codenamed "Big Navi", more details are emerging. Rumors suggest that this card will supposedly be called the AMD Radeon RX 6900 and that it will be AMD's top offering. Using a 256-bit bus with 16 GB of GDDR6 memory, the GPU will not use any type of HBM memory, which has historically been rather pricey. Instead, it looks like AMD will compensate for the narrower bus with a new technology it has developed. Thanks to new findings on the Justia Trademarks website by @momomo_us, we have information about the alleged "Infinity Cache" technology the new GPU uses.

VideoCardz reports that the internal name for this technology is not Infinity Cache; however, it seems that AMD could have changed it recently. What exactly does it do, you might wonder? Well, that is a bit of a mystery for now. It could be a new cache technology that allows L1 cache sharing across the GPU's cores, or some interconnect between the caches found across the whole GPU. This information should be taken with a grain of salt, as we have yet to see what this technology does and how it works when AMD announces its new GPU on October 28th.



 
Joined
May 13, 2015
Messages
200 (0.10/day)
Processor AMD Ryzen 3800X / AMD 8350
Motherboard ASRock X570 Phantom Gaming X / Gigabyte 990FXA-UD5 Revision 3.0
Cooling Stock / Corsair H100
Memory 32GB / 24GB
Video Card(s) AMD Radeon 290X (Toggling until 6950XT)
Storage C:\ 1TB SSD, D:\ RAID-1 1TB SSD, 2x4TB-RAID-1
Display(s) Samsung U32E850R
Case be quiet! Dark Base Pro 900 Black rev. 2 / Fractal Design
Power Supply EVGA Supernova 1300G2 / EVGA Supernova 850G+
Mouse Logitech M-U0007
Keyboard Logitech G110 / Logitech G110
I've been stuck on a 290X for a few years now, and I can't wait to get the 6900XT, or the 6900XTX if they make a liquid-cooled version. Now that AMD has beaten back the anti-capitalist crony Intel and made enough money to really push R&D:
  • The drivers are rumored to be solid for this release.
  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.
  • It's not going to be a watt-sucking heat-producing beast.
  • I'll finally stop running out of video memory (browsers use GPU memory).
 
Joined
Jun 24, 2020
Messages
49 (0.39/day)
1 GB of cache = going from a 512-bit to a 128-bit bus?

Wow.

How about 6 GB of cache, then we might not need a bus at all?
 
Joined
Sep 17, 2014
Messages
13,594 (6.09/day)
Location
The Kitchen Table
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
Good comedy, this

Fans desperately searching for some argument to say 256-bit GDDR6 will do anything more than hopefully get even with a 2080 Ti.

History repeats.

Bandwidth is bandwidth, and cache is not new. Also... elephant in the room... Nvidia has needed an expanded L2 cache since Turing to cater to its new shader setup with RT/tensor cores in it... yeah, I really wonder what magic Navi is going to have with a similar change in cache sizes... surely they won't copy what Nvidia has done before them like they always have, right?! Surely this isn't history repeating, right? Right?!

:lovetpu:



I've been stuck on a 290X for a few years now, and I can't wait to get the 6900XT, or the 6900XTX if they make a liquid-cooled version. Now that AMD has beaten back the anti-capitalist crony Intel and made enough money to really push R&D:
  • The drivers are rumored to be solid for this release.
  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.
  • It's not going to be a watt-sucking heat-producing beast.
  • I'll finally stop running out of video memory (browsers use GPU memory).
Let's revisit those assumptions post launch ;) That'll be fun, too. I'll take a bet... drivers will need hotfixing, which will likely come pretty late or create new issues along the way (note: Nvidia has fallen prey to this just as well, which alone should say enough); things will be out of stock shortly after launch, it's going to suck an easy 250-300 W just as well, and yes, you do get 16 GB on the top model.

If I'm wrong, I'll buy it :p
 
Joined
Apr 29, 2018
Messages
53 (0.06/day)
Good comedy, this

Fans desperately searching for some argument to say 256-bit GDDR6 will do anything more than hopefully get even with a 2080 Ti.

History repeats.

Bandwidth is bandwidth, and cache is not new. Also... elephant in the room... Nvidia has needed an expanded L2 cache since Turing to cater to its new shader setup with RT/tensor cores in it... yeah, I really wonder what magic Navi is going to have with a similar change in cache sizes... surely they won't copy what Nvidia has done before them like they always have, right?! Surely this isn't history repeating, right? Right?!

:lovetpu:




Let's revisit those assumptions post launch ;) That'll be fun, too. I'll take a bet... drivers will need hotfixing, which will likely come pretty late or create new issues along the way (note: Nvidia has fallen prey to this just as well, which alone should say enough); things will be out of stock shortly after launch, it's going to suck an easy 250-300 W just as well, and yes, you do get 16 GB on the top model.

If I'm wrong, I'll buy it :p
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
16,510 (3.08/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 2600x
Motherboard Asrock B450M-HDV
Cooling AMD Wraith Spire I think
Memory 2 x 8GB G-skill Aegis 3000 or somesuch
Video Card(s) Powercolor Radeon HD 7850 2GB
Storage Kingston A400 240GB | WD Blue 1TB x 2 | Toshiba P300 2TB
Display(s) BenQ GL2450HT
Case Antec dumpster find
Audio Device(s) Line6 UX1 + slightly repaired Sony DR-ZX302
Power Supply Fractal Design Effekt 400W
Mouse Logitech G602
Keyboard Dell Sk3205
Software Windows 10 Pro
Benchmark Scores I once had +100 dorfs in DF, so yeah pretty great
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.
It's less about being stupid and more about managing expectations. High tier AMD cards have burned people in the past because they expected too much. The only sensible thing to do is to wait for reviews.
 
Joined
Sep 6, 2013
Messages
1,714 (0.66/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Athlon 3000G / A6 7400K
Motherboard ASRock Z97 Extreme6 / MSI X470 Gaming Plus Max / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / AMD stock / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB G.Skill Aegis DDR4-3200MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ Vega 3 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / 19'' monitor + projector
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Seasonic 400W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
I don't think cache can replace bandwidth, especially when games ask for more and more VRAM. I might be looking at it the wrong way and the next example could be wrong, but hybrid HDDs NEVER performed like real SSDs.

I am keeping my expectations really low after reading about that 256-bit data bus.
 
Joined
May 2, 2017
Messages
2,989 (2.34/day)
Processor AMD Ryzen 5 1600X
Motherboard Biostar X370GTN
Cooling Custom CPU+GPU water loop
Memory 16GB G.Skill TridentZ DDR4-3200 C16
Video Card(s) AMD R9 Fury X
Storage 500GB 960 Evo (OS ++), 500GB 850 Evo (Games)
Display(s) Dell U2711
Case NZXT H200i
Power Supply EVGA Supernova G2 750W
Mouse Logitech G602
Keyboard Lenovo Compact Keyboard with Trackpoint
Software Windows 10 Pro
Regardless of the veracity of this, there is definitely something weird about the rumored specifications for these GPUs. 256-bit and 192-bit bus widths for a high-end GPU in 2020 with no new tricks to counteract this would be a significant bottleneck. And AMD obviously knows this. They do, after all, design GPUs for a living. They have the resources to, say, make a 512-bit test chip + PCB and benchmark it with varying numbers of memory controllers enabled, identifying when and how bottlenecks appear. And while 512-bit buses aren't really commercially viable (huge, hot, expensive, and at that point HBM is a better alternative at likely the same price), 384-bit buses are. So if they've chosen to go 256-bit for their highest end GPU, there has to be some reason for it.
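For reference, peak GDDR6 bandwidth follows directly from bus width and per-pin data rate; a quick sketch (the 16 Gbps data rate below is an illustrative assumption, since actual memory speeds were unconfirmed at the time):

```python
def gddr6_bandwidth(bus_bits: int, data_rate_gbps: float = 16.0) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin Gbps.
    The 16 Gbps default is an assumption, not a confirmed spec."""
    return bus_bits / 8 * data_rate_gbps

for bus in (192, 256, 384, 512):
    print(f"{bus}-bit: {gddr6_bandwidth(bus):.0f} GB/s")
# a 256-bit bus works out to 512 GB/s, versus 768 GB/s for 384-bit
```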
 
Joined
Nov 11, 2016
Messages
603 (0.42/day)
System Name The de-ploughminator
Processor I7 8700K @ 5.1Ghz
Motherboard Gigabyte Z370 Gaming 5
Cooling Custom Watercooling
Memory 4x8GB G.Skill Trident Neo 3600mhz 15-15-15-30
Video Card(s) RTX 2080 Ti + Heatkiller IV wb
Storage Plextor 512GB nvme SSD
Display(s) LG 34GN850-B
Case Lian Li 011D Dynamic
Audio Device(s) Creative AE-5
Power Supply Corsair RM1000
You have to be a special kind of stupid to think their top card will only match the 2080 Ti, considering the 2080 Ti is 50% faster than the 5700 XT. It does not take a genius to realize that doubling the cores of the 5700 XT, increasing IPC, and running higher clocks would result in a MUCH higher gain than 50%. FFS, even the Xbox Series X has a GPU as fast as or faster than the 2080 Super, and the 6900 XT will be a hell of a lot bigger GPU.
Let's say the 6900 XT is 20-30% faster than the 2080 Ti in "specific" rasterization workloads that don't require massive bandwidth, but slower than the 2080 Ti in ray-tracing workloads; does that make the 6900 XT the faster GPU?
"But you don't need ray tracing" is not an excuse for a >500 USD GPU.
Before you say that there are other API alternatives for ray tracing, not having dedicated RT cores will just hammer performance; just look at Crysis Remastered as an example (the game can leverage the RT cores).

 
Joined
Jan 8, 2017
Messages
5,893 (4.24/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Deepcool Gammaxx L240 V2
Memory 16GB - Corsair Vengeance LPX - 3333 Mhz CL16
Video Card(s) OEM Dell GTX 1080 with Kraken G12 + Water 3.0 Performer C
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Deepcool Matrexx 70
Power Supply GPS-750C
Fans desperately searching for some argument to say 256-bit GDDR6 will do anything more than hopefully get even with a 2080 Ti.
I've noticed you are quite dead set on saying some pretty inflammatory and, to be honest, quite stupid things as of late. What's the matter?

A 2080 Ti has 134% the performance of a 5700 XT. The new flagship is said to have twice the shaders, likely higher clock speeds, and improved IPC. Only a pretty avid fanboy of a certain color would think that such a GPU could muster only some 30% higher performance with all that. GPUs scale very well; you can expect it to land between 170% and 190% of the performance of a 5700 XT.
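As a back-of-the-envelope check, those factors compose multiplicatively; the specific multipliers below are assumptions drawn from this discussion, not measured values:

```python
# Rough composition of the rumored scaling factors; every input is an assumption.
shaders = 2.0      # twice the shaders of the 5700 XT (rumored)
clocks = 1.10      # assumed ~10% higher clocks
efficiency = 0.85  # assumed sub-linear scaling with shader count
relative = shaders * clocks * efficiency
print(f"~{relative:.2f}x the 5700 XT")  # lands inside the 1.70-1.90x range above
```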

Bandwidth is bandwidth, and cache is not new.
Caches aren't new, but caches as big as the ones rumored are. I should also point out that bandwidth and the memory hierarchy are completely hidden from the GPU cores; in other words, whether it's reading at 100 GB/s from DRAM or at 1 TB/s from a cache, the core doesn't care, it's just operating on memory at an address.

Rendering is also an iterative process where you go over the same data many times a second; if you can keep, for example, megabytes of vertex data in some fast memory close to the cores, that's a massive win.

GPUs hide memory bottlenecks very well by scheduling hundreds of threads. Another thing you might have missed is that, over time, the ratio of DRAM GB/s per GPU core has been getting lower and lower. And somehow performance keeps increasing; how the hell does that work if "bandwidth is bandwidth"?

Clearly, there are ways of increasing the efficiency of these GPUs so that they need less DRAM bandwidth to achieve the same performance, and this is another one of them. By your logic, we should have had GPUs with tens of TB/s by now, because otherwise performance couldn't have gone up.

  • There will actually be stock because unlike Nvidia they're not trying to artificially drive up prices.
They won't have much stock; most wafers are going to consoles.

  • It's not going to be a watt-sucking heat-producing beast.
While performance/watt must have increased massively, perhaps even beyond Ampere, the highest-end card will still be north of 250 W.
 
Joined
Mar 13, 2012
Messages
249 (0.08/day)
I don't think cache can replace bandwidth, especially when games ask for more and more VRAM. I might be looking at it the wrong way and the next example could be wrong, but hybrid HDDs NEVER performed like real SSDs.

I am keeping my expectations really low after reading about that 256-bit data bus.
Why do you think we have caches in CPUs, GPUs, SSDs, and more?

Because it works, and it does replace bandwidth: information that the GPU uses repeatedly is stored in and fetched from cache, and thus does not have to travel over the memory bus each time. The memory bandwidth saved by the cache can instead be used for other data. So a 256-bit bus with a large, very effective cache equals MORE effective MEMORY BANDWIDTH. Nvidia already uses this approach on all of its cards.
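That trade-off can be put in a toy model: if a fraction of memory reads hits an on-die cache, only the misses travel over the bus, so the bandwidth the cores effectively see is amplified by 1/(1 − hit rate). The DRAM figure and hit rates below are illustrative assumptions, not measured numbers:

```python
def effective_bandwidth(dram_gb_s: float, hit_rate: float) -> float:
    """Toy model: only cache misses consume bus bandwidth, so the cores see
    dram_gb_s / (1 - hit_rate) effective GB/s (ignoring write traffic and
    any cache-side bandwidth limit)."""
    assert 0.0 <= hit_rate < 1.0
    return dram_gb_s / (1.0 - hit_rate)

dram = 512.0  # 256-bit GDDR6 @ 16 Gbps, an assumed configuration
for h in (0.0, 0.25, 0.5):
    print(f"hit rate {h:.0%}: {effective_bandwidth(dram, h):.0f} GB/s effective")
# at a 50% hit rate, a 256-bit bus would behave like a 512-bit one
```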
 
Joined
Feb 3, 2017
Messages
2,646 (1.94/day)
Processor i5-8400
Motherboard ASUS ROG STRIX Z370-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-3200 CL16
Video Card(s) Gainward GeForce RTX 2080 Phoenix
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Logitech G700
Keyboard Corsair K60
A 2080 Ti has 134% the performance of a 5700 XT.
At 1080p. At 1440p it's 142%, and at 2160p it's 152%.
More notably, though, the 3080 is twice as fast.
 
At 1080p. At 1440p it's 142%, and at 2160p it's 152%.
You're probably right; I went off the comparison tool thingy when browsing different GPUs, and that one says the 2080 Ti has 134% the performance of a 5700 XT.

Based on TPU review data: "Performance Summary" at 1920x1080, 4K for 2080 Ti and faster.
 
Joined
Feb 11, 2009
Messages
2,781 (0.65/day)
System Name Cyberline
Processor Intel Core i7 2600k
Motherboard Asus P8P67 LE Rev 3.0
Cooling Tuniq Tower 120
Memory Corsair (4x2) 8gb 1600mhz
Video Card(s) AMD RX480
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb
Display(s) Philips 32inch LPF5605H (television)
Case antec 600
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
It always pains me to see people overhyping products; it can pretty much only lead to disappointment.
That said, let's not forget this GPU was pretty much made with the help of Sony and Microsoft, because their consoles use RDNA2. That is a lot of (smart) people working on a product, so I do have faith that it will be good.

And personally, I care little for "beating" Nvidia in "performance".
If it delivers good frames while going easy on the power consumption and while costing, finally again, a reasonable amount of money and not the obscene prices being asked as of late, it's a winner in my book.

Heck, I would REALLY love it if we had a new RX 460/470/480 moment, where all games could be lifted up and everyone could upgrade and get with the times.

This would also be really good for the evolution/implementation of ray tracing; the industry can only really make use of it if the world can use it.
 
Joined
Jun 27, 2019
Messages
351 (0.72/day)
Location
Mid EU
Processor Ryzen 5 1600x @Stock
Motherboard ASUS ROG STRIX B350-F
Cooling Be quiet! Pure Rock Slim 'CPU', 3x Raijintek Auras 12 +3x Cooler Master MF120L non led case fans
Memory 2x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Gigabyte RX 570 Gaming 4G @stock clock/950 mV Undervolt
Storage 1 TB WD Blue, 3 TB Toshiba P300, 120 GB WD Green 2.5 SSD
Display(s) 29" 2560x1080 / LG 29WK600-W
Case In Win 101c Black
Audio Device(s) Onboard + Kingston HyperX Cloud Stinger
Power Supply Cooler Master 650W MWE Gold
Mouse Motospeed V20
Keyboard Genius Scorpion K10
Software Windows 10 Pro
And personally, I care little for "beating" Nvidia in "performance".
If it delivers good frames while going easy on the power consumption and while costing, finally again, a reasonable amount of money and not the obscene prices being asked as of late, it's a winner in my book.

Heck, I would REALLY love it if we had a new RX 460/470/480 moment, where all games could be lifted up and everyone could upgrade and get with the times.

This would also be really good for the evolution/implementation of ray tracing; the industry can only really make use of it if the world can use it.
Yup, this is what I would also love to see and what I mainly care about when upgrading.

Those RX cards were a godsend for me; it was a solid upgrade from my previous card without breaking the bank/my wallet.

Looking at the prices lately, most likely my only option will be the second-hand market again if I want the same performance uplift as last time (I went from a GTX 950 to an RX 570).
 

M2B

Joined
Jun 2, 2017
Messages
277 (0.22/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
For the sake of comparison, the RTX 2080 Ti has exactly twice as many shaders as the RTX 2060 Super with very similar real-world clocks and performs about 63.3% better at 4K, according to TPU's average framerate in 20+ games.
Based on Xbox Series X performance scaling over the X1X, it doesn't seem like RDNA2 has much in the way of IPC improvements over RDNA.
So at similar clocks I expect the top-end 80-CU RDNA2 part to be 55-65% faster than the 5700 XT, depending on the resolution (assuming there is no bandwidth bottleneck).
But as we all know, RDNA2 will have noticeably higher clocks than RDNA1. I expect the average clocks of the 80-CU part to be in the 2-2.1 GHz range, a decent 10-13% above the 5700 XT. Assuming semi-linear scaling, this clock boost alone puts RDNA2 10-12% above RDNA1; now, with the addition of that massive shader count increase, it's probably reasonable to expect the top-end RDNA2 to be 75-85% faster than the 5700 XT, as Vya Domus predicted.
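That estimate can be sketched directly; both multipliers are the post's own assumptions, not measurements:

```python
# Rough sketch of the estimate above; both factors are assumed, not measured.
shader_scaling = 1.633  # 2x shaders gave the 2080 Ti ~63.3% over the 2060 Super at 4K
clock_uplift = 1.12     # assumed ~10-13% higher clocks than the 5700 XT
relative = shader_scaling * clock_uplift
print(f"~{relative:.2f}x the 5700 XT")  # consistent with the 75-85% range quoted
```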

Expecting flagship RDNA2 to be only as fast as a 3070/2080 Ti is not realistic; it will probably beat them both comfortably.
 
Joined
Apr 12, 2013
Messages
3,665 (1.33/day)
When did the X1X have an RDNA-based GPU? :wtf:

Also, don't extrapolate RDNA2 performance from console numbers. They're not exactly comparable; it's more like comparing cashews to figs.
 
Joined
Apr 12, 2013
Messages
3,665 (1.33/day)
Based on Xbox Series X performance scaling over the X1X, it doesn't seem like RDNA2 has much in the way of IPC improvements over RDNA.
You said this; how can it be interpreted any differently?
 

M2B

You said this; how can it be interpreted any differently?
I agree, that part of my comment was a bit confusing, but I didn't mean the X1X has RDNA; just that the real-world performance increase didn't suggest higher IPC than RDNA1 to me, based on how RDNA performs compared to the console.
 
Joined
Aug 5, 2019
Messages
429 (0.95/day)
System Name Neon Master
Processor AMD Ryzen 3900x
Motherboard x570 Aorus Master F30
Cooling H115i RGB Platinum 280mm/Case Fans NF-A14 x4
Memory G.Skill Neo 32GB @3600 MT/s C16
Storage MP600 2TB
Display(s) LG 27GL850-b
Case Phanteks Evolv X
Audio Device(s) DT 770 Pro
Power Supply Seasonic Prime Ultra Titanium 1000w
Mouse Scimitar Pro
Keyboard K95 Platinum
You say Infinity Cache, I hear "we have chiplets on GPUs now".
 
Joined
Jan 11, 2005
Messages
1,385 (0.24/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
me love to read comments!

 

bug

Joined
May 22, 2015
Messages
8,291 (4.17/day)
Processor Intel i5-6600k (AMD Ryzen5 3600 in a box, waiting for a mobo)
Motherboard ASRock Z170 Extreme7+
Cooling Arctic Cooling Freezer i11
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V (@3200)
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 3TB Seagate
Display(s) HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Ok, who the hell calls Navi 2 "Big Navi"?
Big Navi was a pipe dream of AMD loyalists left wanting a first-gen Navi high-end card.
 