
The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Nvidia requires more power on the top-end, max-tuned cards
Uh, why does NV suddenly require (much) more power for its top-end GPUs? Are we back to Fermi times?
 
Joined
Mar 8, 2013
Messages
60 (0.01/day)
I understand the desire to have a single connector for a graphics card instead of a bundle of them, especially for more power-hungry hardware. What I don't get is why on earth Nvidia feels the need to draw up their own bloody connector.

There already exists a 12-circuit header housing (literally from the exact same connector series that the ATX 24-pin, EPS 8-pin and PCIe 6-pin come from; specifically, this part here: https://www.molex.com/molex/products/part-detail/crimp_housings/0469921210). I don't even buy the argument about insufficient power handling capability. Standard Mini-Fit Jr crimp terminals and connector housings are rated up to 9A per circuit, so for the 12-circuit part (six 12V/ground pairs), you're looking at 9A × 12V × 6 = 648W of power, which is around the same as the rough '600W' figure quoted in the article. You don't need a new connector along with a new crimp terminal design and non-standard keying for this. THE EXISTING ATX 12-PIN CONNECTOR WILL DO THE JOB JUST FINE. Not to mention, you can get Mini-Fit Jr terminals that are rated for even higher amperage (13A for copper/tin crimp terminals when used with 18AWG or thicker conductors, IIRC, in which case the standard ATX 12-pin cable will carry even more power). This is literally just Nvidia trying to reinvent the wheel for no apparent reason.
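To make that arithmetic explicit, here's a minimal sketch (assuming the published 9A/13A Mini-Fit Jr terminal ratings and a six 12V + six ground pin-out):

```python
# Back-of-the-envelope capacity of a 12-circuit Mini-Fit Jr housing,
# assuming Molex's published terminal ratings (9A standard, 13A HCS with
# 18AWG or thicker wire) and six 12V/ground circuit pairs.
VOLTAGE = 12.0  # volts on the supply pins
PAIRS = 6       # six supply/return pairs in a 12-circuit housing

for label, amps in [("standard 9A terminals", 9.0), ("HCS 13A terminals", 13.0)]:
    watts = amps * VOLTAGE * PAIRS
    print(f"{label}: {watts:.0f} W")  # 648 W and 936 W respectively
```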

As someone who crimps their own PSU cables, this is a hecking pain in the butt, and I hope whoever at Nvidia came up with this idea gets 7 years of bad luck for it. Just use the industry-standard part that already exists, FFS.

/rant
 
Joined
May 22, 2010
Messages
344 (0.07/day)
Processor R7-7700X
Motherboard Gigabyte X670 Aorus Elite AX
Cooling Scythe Fuma 2 rev B
Memory no name DDR5-5200
Video Card(s) Some 3080 10GB
Storage dual Intel DC P4610 1.6TB
Display(s) Gigabyte G34MQ + Dell 2708WFP
Case Lian-Li Lancool III black no rgb
Power Supply CM UCP 750W
Software Win 10 Pro x64


The 3dfx actually came with its own power brick to be plugged into the wall to feed the GPU.

Apart from that, weird decision. 2x8 pins should be more than enough for most GPUs, unless this thing is supposed to feed GPUs in enterprise markets or machines.

Lots of wires are overrated as well. A single yellow wire on a well-built PSU could easily push 10A, up to 13A.
That 3dfx board drew a fraction of the power of a modern GPU; that's why it could use an external brick. There was no PCIe power connector and no standard for internal board power connectors, so the most they could use was a Molex connector, which isn't very reliable or high-power.
As I've said, an external power brick for a modern GPU would be a PC PSU in an external case (thus costing upwards of 100 USD), as it would require 500W of output. Plus, you'd need a multitude of thick cables and a connector occupying an entire slot on the back (there's no room for the power connector plus the regular video outputs), and that's another fan making noise and getting clogged.

2x8 pins may be barely enough for Ampere, but not for the future, and maybe not even for the 3080 Ti (or whatever name it comes out under); there are already 2080 Ti cards with 3x8 pins.

Yes, a wire can push 10+A, but you have to take into account the voltage drop and heating in the wire, the contact resistance of the connector, and how flexible the cable remains. It's not that simple (otherwise we'd use one big thick cable instead of 2x8 pins, for example).
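To illustrate, a rough sketch; the 18AWG resistance, run length and contact resistance figures here are assumed, ballpark values, not measurements:

```python
# Rough voltage-drop and heating estimate for a single 12V circuit,
# assuming 18AWG copper (~21 mOhm/m, a typical published figure), a 60 cm
# PSU-to-GPU run (supply + return = 1.2 m of wire), and an assumed contact
# resistance per crimped terminal. All inputs are illustrative only.
WIRE_R_PER_M = 0.021  # ohm per metre, 18AWG copper (approx.)
RUN_LENGTH_M = 0.6    # one-way cable length
CONTACT_R = 0.005     # ohm per terminal interface (assumed)

def losses(current_a: float) -> tuple[float, float]:
    # Round trip through the wire, plus two mated contacts in the path.
    r_total = WIRE_R_PER_M * RUN_LENGTH_M * 2 + CONTACT_R * 2
    return current_a * r_total, current_a**2 * r_total

for amps in (5, 10, 13):
    v_drop, heat = losses(amps)
    print(f"{amps:>2} A: ~{v_drop:.2f} V drop, ~{heat:.1f} W dissipated in cable/contacts")
```

At 13A that's roughly half a volt gone and ~6W of heat in one circuit, which is why you spread the load across pins instead of pushing single wires to their limit.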
 
Joined
Apr 24, 2007
Messages
264 (0.04/day)
Location
Bay Area, CA
I understand the desire to have a single connector for a graphics card instead of a bundle of them, especially for more power-hungry hardware. What I don't get is why on earth Nvidia feels the need to draw up their own bloody connector.

There already exists a 12-circuit header housing (literally from the exact same connector series that the ATX 24-pin, EPS 8-pin and PCIe 6-pin come from; specifically, this part here: https://www.molex.com/molex/products/part-detail/crimp_housings/0469921210). I don't even buy the argument about insufficient power handling capability. Standard Mini-Fit Jr crimp terminals and connector housings are rated up to 9A per circuit, so for the 12-circuit part (six 12V/ground pairs), you're looking at 9A × 12V × 6 = 648W of power, which is around the same as the rough '600W' figure quoted in the article. You don't need a new connector along with a new crimp terminal design and non-standard keying for this. THE EXISTING ATX 12-PIN CONNECTOR WILL DO THE JOB JUST FINE. Not to mention, you can get Mini-Fit Jr terminals that are rated for even higher amperage (13A for copper/tin crimp terminals when used with 18AWG or thicker conductors, IIRC, in which case the standard ATX 12-pin cable will carry even more power). This is literally just Nvidia trying to reinvent the wheel for no apparent reason.

As someone who crimps their own PSU cables, this is a hecking pain in the butt, and I hope whoever at Nvidia came up with this idea gets 7 years of bad luck for it. Just use the industry-standard part that already exists, FFS.

/rant

That Molex part is a Mini-Fit. The one in the drawing is a Micro-Fit. Look at the dimensions. And it's not "made up". It exists. Has for a long time. I can buy one from Digikey, Mouser, etc. from four or five different manufacturers.

Anyway... This isn't a news story. This is someone downloading a drawing from a connector supplier and posting it as "news". Real news would be seeing the connector on the card itself. Am I right?

Now look what's popping up in my Google ads!

[attached screenshot: an ad listing the connector]


See... Not a "made up" connector.
 
Joined
Mar 8, 2013
Messages
60 (0.01/day)
That Molex part is a Mini-Fit. The one in the drawing is a Micro-Fit. Look at the dimensions. And it's not "made up". It exists. Has for a long time. I can buy one from Digikey, Mouser, etc. from four or five different manufacturers.

Anyway... This isn't a news story. This is someone downloading a drawing from a connector supplier and posting it as "news". Real news would be seeing the connector on the card itself. Am I right?

Now look what's popping up in my Google ads!

[attached screenshot: an ad listing the connector]

See... Not a "made up" connector.

Thanks for the reference; I didn't recognise the keying of the connector housing, and the wording of the article made me think they had designed a fancy new connector housing. That invalidates my previous point. What's the technical improvement in moving from Mini-Fit Jr to Micro-Fit, then? Most of the other connectors commonly used in ATX PCs are Mini-Fit Jr, so what do you gain by going to Micro-Fit? Is it really that much better in terms of mechanical strength or electrical/current handling?
 
Joined
Apr 24, 2007
Messages
264 (0.04/day)
Location
Bay Area, CA
Thanks for the reference; I didn't recognise the keying of the connector housing, and the wording of the article made me think they had designed a fancy new connector housing. That invalidates my previous point. What's the technical improvement in moving from Mini-Fit Jr to Micro-Fit, then? Most of the other connectors commonly used in ATX PCs are Mini-Fit Jr, so what do you gain by going to Micro-Fit? Is it really that much better in terms of mechanical strength or electrical/current handling?

Smaller size. Nothing more (Mini-Fit Jr is a 4.2mm-pitch family, while Micro-Fit 3.0 is a 3.0mm pitch, so a 12-circuit header takes up noticeably less board edge). If this is real, then they're only doing it because they ran out of PCB space.
 
Joined
Oct 17, 2019
Messages
40 (0.02/day)
Processor Intel Core i5-8600
Motherboard MSI Z370-A Pro
Memory G. Skill Ripjaws V DDR4 3200 16 GB (8x2)
Video Card(s) nVIDIA GeForce RTX 2080 Ti FE
Storage NVMe Samsung 970 Pro 1TB, Ultrastar DC HA210 2TB
Display(s) LG 27UD58-B
Power Supply Seasonic Prime TX-650
Can somebody explain this stupid decision?
 
Joined
Feb 20, 2020
Messages
9,340 (6.12/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
Most PSUs have six 8-pin VGA ports on them, so I don't see this as an issue; needing a new PSU is just speculation when, at worst, you'd most likely just need a new adapter or cable.
A 30-series Titan RTX may need more, but those aren't exactly mainstream cards either.

Guess we'll find out sooner or later :)
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Pissing off TSMC and going with Samsung has backfired.
That's the short way of putting it; basically, Nvidia is anticipating higher-TDP graphics cards in the near future.

This might be because the improvement from moving to Samsung 8nm is smaller than they expected, or simply because they intend to leave less performance on the table and place their cards further toward the right side of the voltage/frequency curve, sacrificing some efficiency for more raw power.

To be fair, we know nothing of the TDPs from team red; they might've gone up too, in spite of using TSMC's 7nm EUV.
 
Joined
Jul 19, 2016
Messages
476 (0.17/day)
The spec allows up to 600W, so an Ampere 3080 Ti might draw 400W+!! This seems like Nvidia dropping the ball by going with Samsung's 8nm fab process. Or Ampere as an arch is a bit of a turd. Likely a bit of both.

RDNA2 on TSMC's 7nm seems to be quite formidable.

To be fair, we know nothing of the TDPs from team red; they might've gone up too, in spite of using TSMC's 7nm EUV.

We know a little. The PS5's RDNA2 GPU runs at 2.2GHz in a ~100W package, offering near-2080 performance.

A 350W RDNA2 GPU is going to give a 2080 Ti a run for its money.
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
We know a little. The PS5's RDNA2 GPU runs at 2.2GHz in a ~100W package, offering near-2080 performance.
That's only 36 CUs, and the TDP is not 100W; the reason the PS5 is so big is to allow better ventilation. So it is indeed more efficient than RDNA1, but it's unclear by exactly how much.
A 350W RDNA2 GPU is going to give a 2080 Ti a run for its money.
It had better do more than that; it will launch at the same time as Ampere, and the 2080 Ti is a generation old now. It has to give the 3080 a run for its money, be it at 350W or 500W.
 
Joined
Jul 19, 2016
Messages
476 (0.17/day)
That's only 36 CUs, and the TDP is not 100W; the reason the PS5 is so big is to allow better ventilation. So it is indeed more efficient than RDNA1, but it's unclear by exactly how much.

It had better do more than that; it will launch at the same time as Ampere, and the 2080 Ti is a generation old now. It has to give the 3080 a run for its money, be it at 350W or 500W.

I'm talking about the TDP of the PS5 GPU cores only, not the entire system. 100W was a guess.

And yes, the latest rumours from sources I trust state that the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money, as stated, and challenge the 3080 Ti.
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
I'm talking about the TDP of the PS5 GPU cores only, not the entire system. 100W was a guess.
Yes, I reckoned you were guesstimating, but I think you're being over-optimistic. I imagine SmartShift in these APUs gives more oomph to the GPU part, so I think the TDP is at or above 200W. I'm guesstimating, too :)
And yes, the latest rumours from sources I trust state that the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money, as stated, and challenge the 3080 Ti.
Here's a recap from Tom that points out that we will see high TDPs from Nvidia this fall:
 
Joined
Dec 31, 2009
Messages
19,366 (3.70/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Hi,
Most PSUs have six 8-pin VGA ports on them, so I don't see this as an issue; needing a new PSU is just speculation when, at worst, you'd most likely just need a new adapter or cable.
A 30-series Titan RTX may need more, but those aren't exactly mainstream cards either.

Guess we'll find out sooner or later :)
Most have two 6+2-pin connectors... some (typically a bit higher-power, ~700W+) have four. A few, above the 1kW range, have six. Most do NOT have six, but two or four.

And yes, the latest rumours from sources I trust state that the top Big Navi will be around 40% faster than a 2080 Ti, so it will give that a run for its money, as stated, and challenge the 3080 Ti.
So, essentially, what you are saying is that an RDNA2 card will be 'close enough' to a 300-400W Ampere card to give it a "run for its money"? What does that mean, exactly? Like 10% behind? I'm curious how you came up with that conclusion...

What I see is this: the 2080 Ti (FE) currently leads the 5700 XT (Nitro+) by 42% (1440p). With all the rumors about high power use, even with a significant node shrink (versus AMD, who are tweaking) and a new architecture, do you still think that is true?

I mean, I hope you're right, but from what we've seen so far, I don't understand how that kind of math works out. You're assuming that Ampere, with a die shrink and a new arch, will only be around the same amount faster as the 2080 Ti was over the 1080 Ti (~25-30%), while a new arch and a node tweak will gain AMD upwards of 70% performance? What am I missing here?

If NV Ampere comes in at 300W+, I don't see RDNA2 coming close (split the difference between the 2080 Ti and the Ampere flagship).
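Here's the arithmetic I'm working from, as a sketch: the 1.42 baseline is the relative-performance figure above, while both generational gains are the rumoured numbers under debate, not measurements:

```python
# Chained relative-performance arithmetic behind this argument.
# 1.42 = 2080 Ti FE vs 5700 XT Nitro+ at 1440p (quoted above);
# the generational gains are rumours, not data.
BASELINE_2080TI = 1.42  # 5700 XT = 1.00

ampere_gain = 0.28      # assumed ~25-30% over the 2080 Ti
big_navi_gain = 0.40    # rumoured ~40% over the 2080 Ti

ampere = BASELINE_2080TI * (1 + ampere_gain)      # ~1.82x a 5700 XT
big_navi = BASELINE_2080TI * (1 + big_navi_gain)  # ~1.99x a 5700 XT

print(f"Ampere flagship: {ampere:.2f}x a 5700 XT")
print(f"Big Navi:        {big_navi:.2f}x a 5700 XT "
      f"(+{(big_navi - 1) * 100:.0f}% over AMD's own 5700 XT)")
```

That rumour implies AMD nearly doubling its own current flagship in one generation, which is exactly the part I find hard to believe.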
 
Joined
Feb 20, 2019
Messages
7,305 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Here's a recap from Tom that points out that we will see high TDPs from Nvidia this fall:
Tom dropped off the radar for me these last few months; I guess he fell off my feed. Still, I believe he's credible and smart, with a reasonable track record, and he is really certain that 1st-gen Ampere will be an overheating, power-hungry dumpster fire on Samsung 8nm.

Even if he's only half-right, that doesn't bode well for the 3000-series.

It will also mean that Nvidia plays dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). In the case of this generation, I suspect that means proprietary RTX-specific extensions being heavily pushed, rather than DX12's DXR API. Gawd, I hope I'm wrong...
 
Joined
Aug 6, 2009
Messages
1,162 (0.22/day)
Location
Chicago, Illinois
Tom dropped off the radar for me these last few months; I guess he fell off my feed. Still, I believe he's credible and smart, with a reasonable track record, and he is really certain that 1st-gen Ampere will be an overheating, power-hungry dumpster fire on Samsung 8nm.

Even if he's only half-right, that doesn't bode well for the 3000-series.

It will also mean that Nvidia plays dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). In the case of this generation, I suspect that means proprietary RTX-specific extensions being heavily pushed, rather than DX12's DXR API. Gawd, I hope I'm wrong...

Lol inferior hardware. Inferior hardware that's better. Wut?
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
It will also mean that Nvidia plays dirty (using DLSS and black-box dev tools that hinder AMD/Intel performance to get the unfair advantage their inferior hardware needs). In the case of this generation, I suspect that means proprietary RTX-specific extensions being heavily pushed, rather than DX12's DXR API. Gawd, I hope I'm wrong...
I haven't seen the edit.

That's always the marketing battle to be fought, and AMD has been very bad at it in the past. And Nvidia are masters at this game; just ask 3dfx, S3 and Kyro...

But this time there are AMD APUs in all the consoles that matter, and the whole Radeon marketing team has been changed. So I reckon they might have something resembling a strategy.
 
Joined
Feb 20, 2019
Messages
7,305 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Lol inferior hardware. Inferior hardware that's better. Wut?
It's based on the video. Tom's guess is that Big Navi will perform 50-60% better than a 2080 Ti, whilst his same evidence points towards Nvidia needing insane power draw and cooling just to hit 40% more than a 2080 Ti.

If he's right, it means AMD will have the superior hardware (a 50-60% uplift against a 40% uplift over the same 2080 Ti baseline works out to roughly 7-14% faster).

I hope you realise this is a speculation thread, though, and there's no hard evidence on Ampere or Big Navi yet.

Me? I'm sitting on the fence and waiting for real-world testing and independent reviews. I'm old enough to have seen this game between AMD/ATi and Nvidia play out over and over again for 20+ years. It would not be the first time that either company had intentionally leaked misleading performance numbers to throw off the other team.
 
Joined
Aug 6, 2009
Messages
1,162 (0.22/day)
Location
Chicago, Illinois
When was the last time AMD had better hardware than Nvidia? From a performance standpoint.
 
Joined
May 15, 2020
Messages
697 (0.48/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
The best way to find the answer to your own question is to go to this page:
and use the relative performance graph to find top-of-the-line cards launched in the same year. I found this for you:
 
Joined
Feb 20, 2019
Messages
7,305 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
When was the last time AMD had better hardware than Nvidia? From a performance standpoint.
If you're asking seriously, it depends what you mean:

In terms of performance per dollar or performance per transistor?

Currently they do. AMD's Navi 10 is a reasonable competitor for TU106, and it is faster and cheaper than either the 2060S or the 2070 with a lower transistor count. It can't do raytracing, but arguably TU106 is too weak to do it anyway. I've been wholly disappointed by my 2060S's raytracing performance, even with DLSS trying desperately to hide the fact it's only rendering at 720p. Heck, my 2060S can barely run Quake II or Minecraft ;)
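As a rough sketch of what I mean (the transistor counts are the published die figures for Navi 10 and TU106; the relative-performance index and launch prices are approximate and purely illustrative):

```python
# Rough perf-per-transistor and perf-per-dollar comparison. Transistor
# counts are the published die figures (Navi 10 ~10.3B, TU106 ~10.8B);
# the relative-performance index and launch prices are approximate.
cards = {
    #                  (rel. perf, transistors in B, launch USD)
    "RX 5700 XT":      (1.05, 10.3, 399),  # assumed ~5% ahead of a 2060S
    "RTX 2060 Super":  (1.00, 10.8, 399),
}

for name, (perf, xtors, usd) in cards.items():
    print(f"{name:>14}: {perf / xtors:.3f} perf per B transistors, "
          f"{perf / usd * 100:.2f} perf per $100")
```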

In terms of halo/flagships?
  • In 2008, Terascale architecture (HD 4000 series) ended a few years of rubbish from ATi/AMD and was better than the 9800GTX in every way.
  • In 2010, Fermi (GTX 480) was a disaster that memes were born from.
  • In 2012, Kepler (GTX 680) had an edge over the first iteration of GCN (HD 7970) because DX11 was still dominant. As DX12 games appeared, Kepler fell apart badly.
  • In 2014, Kepler on steroids (GTX 780 Ti and Titan) tried to make up the difference, but AMD just made Hawaii (290X), which was an HD 7970 on steroids, to match.
Nvidia has pretty much held the flagship position since Maxwell (900-series), and has generally offered better performance/watt and performance/transistor, even before you consider that they made consumer versions of their huge enterprise silicon (980 Ti, 1080 Ti, 2080 Ti). The Radeon VII was a poor attempt to do the same, and it wasn't a very good product even if you ignore the price: it was just Vega's failings, clocked a bit higher and with more VRAM than games (even 4K games) really needed.

So yeah, if you don't remember that the performance crown has been traded back and forth a lot over the years, then you need to take off your special Jensen-Huang spectacles and revisit some nostalgic YouTube comparisons of real games frequently running better on AMD/ATi hardware. Edit: or just look at BoBoOOZ's links.

I don't take sides. If Nvidia puts out rubbish, I'll diss it.
If AMD puts out rubbish, I'll diss that too.
I just like great hardware, ideally at a reasonable price and power consumption.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
When was the last time AMD had better hardware than Nvidia? From a performance standpoint.
The 7970 clawed performance back, and the original HD 5870 had its contemporary beat.
And later this year :p

Moore's Law Is Dead might as well have quoted me verbatim, though I can't remember where on here I called it.

And if the many rumours are as true as usual, i.e. bits are right but 50% is balls, then it still doesn't look rosy for Nvidia this generation regardless.
 