
Rumor: AMD Ryzen Threadripper PRO Lineup Leaked

Joined
May 28, 2020
Messages
267 (2.10/day)
System Name Main PC
Processor AMD Ryzen 7 3700X
Motherboard ASUS X570 Crosshair VIII Hero (Wi-Fi)
Cooling EKWB X570 VIII Hero Monoblock, Corsair HYDRO X 2080 Ti block & XD5 pump/res, Heatkiller IV SB block
Memory 2x 2x16GB 3200-14-14-14-34 G.Skill Trident RGB (OC: 3600-14-14-14-28)
Video Card(s) PNY GeForce RTX 2080 Ti OC
Storage Corsair Force LE SSD 500GB, Fusion IoDrive 2 1TB, Huawei HSSD 2TB
Display(s) 1x Dell U2913WM 2560x1080 75Hz, 1x BENQ XL2411Z 1920x1080 100Hz, 1x Acer XB1 xb271hu 2560x1440 144Hz
Case Phanteks P600S
Audio Device(s) Sennheiser HD599, Blue Yeti
Power Supply EVGA SuperNova G2 750W
Mouse Logitech G502 Lightspeed
Keyboard Corsair Strafe RGB MK2
Software Windows 10 Pro 2004
Benchmark Scores Cinebench R20: 5086 MC, 498 SC; CPU-Z 17.01.64: 5788.9 MC, 521.5 SC
Yes, it would be interesting if there are, somewhere, data-centre rooms cooled directly by the outside atmosphere. Say Norway, southern Chile, Argentina, or somewhere high in Switzerland.
The electricity cost for ventilation would be virtually zero.
I actually work in a data centre in Northern Norway. The problem with this solution is that it can't run year-round. Unless you go way, way, way north (or south, I suppose), you still get a reasonably hot summer, so environmental cooling would only work for about three quarters of the year. That leaves either three months of downtime (not going to happen) or traditional cooling (you can't swap cooling methods every year; too much work and too much downtime).

For us, most of our servers use a watercooling system fed with cold water from a local spring. The servers heat that water up, and the hot water then warms the air inside the buildings as well as the water we use. This way we reduce heating costs for both the buildings themselves and the water we use for coffee and such. It's also essentially free cooling, since water is virtually free here; beyond the initial piping there's no bill after that. The rest of the servers use traditional high-airflow server fans, so ventilation still costs a non-zero amount.
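As a back-of-the-envelope sketch of the heat-recovery loop described above (illustrative numbers, not the data centre's actual figures), the spring-water flow needed to absorb a given server heat load follows from Q = ṁ·c·ΔT:

```python
# Hypothetical heat-recovery calculation: how much spring water is needed
# to carry away a given server heat load for a chosen temperature rise?

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), specific heat capacity of water

def water_flow_for_heat_load(heat_load_w: float, delta_t_k: float) -> float:
    """Return the mass flow rate (kg/s) such that
    Q = m_dot * c_p * delta_T absorbs the given heat load."""
    return heat_load_w / (WATER_SPECIFIC_HEAT * delta_t_k)

# Example: a 100 kW row of racks, water warmed from 8 C to 38 C (30 K rise)
flow = water_flow_for_heat_load(100_000, 30)
print(f"{flow:.2f} kg/s")  # roughly 0.8 kg/s, i.e. about 48 L/min
```

The same warmed water then offsets the building's heating bill, which is where the double saving comes from.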
 
Joined
Dec 31, 2009
Messages
18,515 (4.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
This lineup is confusing. I swear it feels like all the good AMD touts is falling to pieces now that they're competitive in the market... :(



Yes, it would be interesting if there are, somewhere, data-centre rooms cooled directly by the outside atmosphere. Say Norway, southern Chile, Argentina, or somewhere high in Switzerland.
The electricity cost for ventilation would be virtually zero.
I did this for a living... data center management. Those exist ("free cooling" exists, I should say) and dragon is spot on. ;)

What I've seen is that cooling systems for a data center work in conjunction with the outside air. When it's cool enough, more outside air is brought in and conditioned (humidity, etc.), and the other cooling is shut down/turned down.

You pay for amperage consumption in a rack. So let's say I rent 1U of space and my limit is 1A for my server. That means my server cannot draw more than 1A at 220V or so. A CPU with a fairly high core count and a reasonable TDP gets my work done in less power, so to speak.
sure... if you're renting server space. If you own/are a data center...
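The per-amp billing described in the quote above can be sketched numerically (all numbers hypothetical): a 1A limit on a 220V feed caps the server at roughly 220W, so a CPU's work-per-watt directly determines work-per-billed-amp.

```python
# Sketch of a per-amp rack budget (hypothetical numbers): the contracted
# current limit times the feed voltage gives the total wattage available
# to the whole server, platform overhead included.

def power_budget_w(amps: float, volts: float) -> float:
    """Maximum sustained draw allowed by the contracted current limit."""
    return amps * volts

def cores_within_budget(budget_w: float, watts_per_core: float,
                        platform_overhead_w: float) -> int:
    """How many cores fit once board/drive/fan overhead is subtracted."""
    return int((budget_w - platform_overhead_w) // watts_per_core)

budget = power_budget_w(1.0, 220.0)          # 220 W total
print(cores_within_budget(budget, 4.0, 60))  # 4 W/core, 60 W overhead -> 40 cores
```

This is why a high-core-count part with a modest per-core power draw is attractive under rented-amperage pricing.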
 
Joined
Nov 23, 2010
Messages
188 (0.05/day)
  • 64 cores
  • 8 channel DDR4
  • 2TB Max memory
  • 128 lanes PCIe 4.0
Wow... final nail in the coffin for Intel until they can come up with their next-gen platform.
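For scale, the 8-channel spec in the list above implies a large theoretical memory bandwidth. Assuming DDR4-3200 (the leak does not state the speed), each 64-bit channel moves 8 bytes per transfer:

```python
# Peak-bandwidth estimate implied by the leaked 8-channel spec, assuming
# DDR4-3200 (an assumption; the leak does not give the supported speed).

def ddr4_peak_bandwidth_gbs(channels: int, mt_per_s: int) -> float:
    """Theoretical peak in GB/s: channels * transfers/s * 8 bytes/transfer."""
    return channels * mt_per_s * 1e6 * 8 / 1e9

print(ddr4_peak_bandwidth_gbs(8, 3200))  # 204.8 GB/s
```

That is double the bandwidth of the quad-channel sTRX4 Threadripper parts at the same memory speed.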
 
Joined
Dec 26, 2006
Messages
369 (0.07/day)
System Name Just another PC
Processor Ryzen 1700
Motherboard Gigabyte GA-AX370-K3
Cooling Noctua NH-C12P SE14
Memory DDR4-2133 2x16GB
Video Card(s) XFX RX480 8GB
Storage Samy 960 EVO 500GB m.2, 500GB SSD & a 2TB spinner
Display(s) LG 27UL550-W
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220
Power Supply EVGA Supernova G2 550W
Mouse Mionix Naos 8200
Keyboard Corsair with browns
Software W10 Pro x64 v1809
Benchmark Scores It can run the interwebs
I actually work in a data centre in Northern Norway. The problem with this solution is that it can't run year-round. Unless you go way, way, way north (or south, I suppose), you still get a reasonably hot summer, so environmental cooling would only work for about three quarters of the year. That leaves either three months of downtime (not going to happen) or traditional cooling (you can't swap cooling methods every year; too much work and too much downtime). For us, most of our servers use a watercooling system fed with cold water from a local spring. The servers heat that water up, and the hot water then warms the air inside the buildings as well as the water we use. This way we reduce heating costs for both the buildings themselves and the water we use for coffee and such. It's also essentially free cooling, since water is virtually free here; beyond the initial piping there's no bill after that. The rest of the servers use traditional high-airflow server fans, so ventilation still costs a non-zero amount.
As a mechanical engineer, I'd love to see a P&ID for that system :)
 
Joined
Jul 9, 2015
Messages
2,436 (1.27/day)
System Name My all round PC
Processor i5 750
Motherboard ASUS P7P55D-E
Memory 8GB
Video Card(s) Sapphire 380 OC... sold, waiting for Navi
Storage 256GB Samsung SSD + 2Tb + 1.5Tb
Display(s) Samsung 40" A650 TV
Case Thermaltake Chaser mk-I Tower
Power Supply 425w Enermax MODU 82+
Software Windows 10
It takes lots of air-conditioning power to move heat around. The easiest way forward is to simply generate less heat.
It's not so much about the electricity bill.
E.g. Twitter openly stated: hey, we can cram more compute power into existing data centers with EPYC, so f*ck Intel.
Imagine the costs of expanding the buildings/new construction, etc.
 
Joined
Apr 24, 2020
Messages
394 (2.45/day)
It's not so much about the electricity bill.
E.g. Twitter openly stated: hey, we can cram more compute power into existing data centers with EPYC, so f*ck Intel.
Imagine the costs of expanding the buildings/new construction, etc.
Buildings and rooms are constructed to support a limited amount of power (really, amps). Depending on your company, air-conditioning costs may or may not be included... but the reality is that all of it fits under the power budget (air conditioning can only grow as large as the electricity supply allows).

Power delivery is one of the major construction costs for a building. Buildings themselves are surprisingly cheap: with mega-malls dying, there's plenty of real estate to buy these days, and even then it's often cheaper to construct warehouses in the middle of nowhere. But for data centers, such space is worthless because there's simply not enough power delivery to support the computers and air conditioning. There's a reason a huge number of data centers are located near power plants, hydroelectric dams, and the like: moving your computers closer to power-generation sources yields non-trivial savings.

"More compute power within existing data centers" almost certainly means "more compute per unit of power/electricity". It's hard to imagine any other interpretation.
 
Joined
Dec 29, 2010
Messages
1,393 (0.39/day)
I don't understand the fuss and complaints. These Pro chips fill a GAPING hole AMD has in the high-end workstation space. Threadripper is HEDT, not really workstation, with its 256GB RAM limitation.
 
Joined
May 29, 2013
Messages
45 (0.02/day)
280W TDP for the 12 and 16 core CPUs? I highly doubt that. I don't see AMD pushing higher than 140W on those two CPUs.
 
Joined
Dec 14, 2013
Messages
1,573 (0.63/day)
Location
Alabama
Processor Ryzen 2700X
Motherboard X470 Tachi Ultimate
Cooling Scythe Big Shuriken 3
Memory C.R.S.
Video Card(s) Radeon VII
Software Win 7
Benchmark Scores Never high enough
Athlon - entry
Ryzen - mainstream, mid-range
Threadripper - high-end
Threadripper Pro - professional, high-end
EPYC - server, government, serious money...
Fixed that for you.

Things are about right for the lineup, but I can see why some may be like W?T?F? over it.
 
Joined
Dec 31, 2009
Messages
18,515 (4.71/day)
I don't understand the fuss and complaints. These Pro chips fill a GAPING hole AMD has in the high-end workstation space. Threadripper is HEDT, not really workstation, with its 256GB RAM limitation.
Really? A gaping hole?! Pretty sure their lineup covers about everything with mainstream, HEDT, and server. Few workstations are going to use 256GB of RAM... pretty much anything above that is server (Xeon/EPYC) land.

Workstation is HEDT. It's the line in the sand before server.
 
Joined
Oct 9, 2010
Messages
25 (0.01/day)
System Name Game PC
Processor i7 970 @ 4.2GHz
Motherboard Asus Rampage III Extreme
Cooling Water rad one 420x140 rad two 280x140
Memory 6GB 1600 memory, it's the one component that does not really benefit from speed
Video Card(s) 2x 5870 matrix 2GB 1x the old GeForce GTX 275 for PhysX
Storage 60GB Vertex, 60GB vertex II, 320GB WD Caviar SE, +23TB on server
Display(s) 3x 26" Asus VW266H 1920x1200 for a Eyefinity setup
Case Corsair 800D
Audio Device(s) Xonar DX
Power Supply HX850
Software WIndows 7
Few workstations are going to use 256GB of RAM... pretty much anything above that is server (Xeon/EPYC) land.
If you're editing 8K RAW video, you run out of RAM real fast at 120GB/min of data. In scientific research, too, you have huge data sets; my daughter is doing a geographical study and has run up against the 512GB limit of her 2990WX machine many times.

I'm not going to say the majority of people will run into that limit, but plenty do.
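Taking the quoted ~120GB/min ingest rate at face value, it's easy to see how quickly each RAM tier is exhausted:

```python
# How long until RAM fills at the ~120 GB/min rate quoted above for
# 8K RAW footage (capacities in GB, rate in GB/min).

def minutes_until_full(ram_gb: float, rate_gb_per_min: float) -> float:
    """Minutes of footage that fit before memory is exhausted."""
    return ram_gb / rate_gb_per_min

for ram in (256, 512, 2048):
    print(f"{ram} GB lasts {minutes_until_full(ram, 120):.1f} min of footage")
```

So a 256GB HEDT box holds barely two minutes of footage in memory, while a 2TB-capable workstation holds about seventeen, which is the gap the Pro parts target.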

280W TDP for the 12 and 16 core CPUs? I highly doubt that. I don't see AMD pushing higher than 140W on those two CPUs.
These are basically dual 3600XT or 3800XT CPUs at 95W and 105W TDP; times two, a 190W or 210W TDP for these parts would not be unreasonable, and 300W+ with a good OC would not be strange.
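The chiplet-doubling estimate above is simple arithmetic: take the TDP of the matching single-CCD Ryzen part and double it (a rough figure that ignores the separate I/O die's own draw).

```python
# Rough TDP estimate for a two-CCD part by doubling the matching
# single-CCD Ryzen part's rated TDP, as argued in the post above.
# Ignores the sTRX4 I/O die, so treat it as a ballpark, not a spec.

RYZEN_TDP_W = {"3600XT": 95, "3800XT": 105}

def doubled_tdp(part: str) -> int:
    """Two CCDs of the given part's class, at the same per-CCD power."""
    return 2 * RYZEN_TDP_W[part]

print(doubled_tdp("3600XT"), doubled_tdp("3800XT"))  # 190 210
```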
 
Joined
Dec 31, 2009
Messages
18,515 (4.71/day)
If you're editing 8K RAW video, you run out of RAM real fast at 120GB/min of data. In scientific research, too, you have huge data sets; my daughter is doing a geographical study and has run up against the 512GB limit of her 2990WX machine many times.

I'm not going to say the majority of people will run into that limit, but plenty do.


These are basically dual 3600XT or 3800XT CPUs at 95W and 105W TDP; times two, a 190W or 210W TDP for these parts would not be unreasonable, and 300W+ with a good OC would not be strange.
Sounds Epyc, what she does. ;)

Few do... it isn't a "gaping" hole, is my point.
 
Joined
Oct 9, 2010
Messages
25 (0.01/day)
Few do... it isnt a "gaping" hole is my point.
No, I think it's a really smart idea, especially the 12 and 16 core parts. They fill a gap in the TR platform for people who need more PCIe lanes than AM4 has, but don't need, and don't want to pay $1,400+ for, a 24 core CPU.

By now they have high yields of dies that can clock much higher than EPYC needs, so why not charge a premium for parts that would otherwise go into AM4 or sTRX4 products? Now they can ask more and also give Intel a bit more pain with a cheap-to-make part.

Someone like me, depending on prices, will be looking at the 12 or 16 core parts for a video encoding/gaming rig. On the other end of the spectrum you've got people who need a lot of power and are willing to pay a premium for higher clock speeds and more memory bandwidth than EPYC solutions offer.
 
Joined
Dec 29, 2010
Messages
1,393 (0.39/day)
Really? A gaping hole?! Pretty sure their lineup covers about everything with mainstream, HEDT, and server. Few workstations are going to use 256GB of RAM... pretty much anything above that is server (Xeon/EPYC) land.

Workstation is HEDT. It's the line in the sand before server.
You need to stop acting like your opinion is definitive. Workstations can be equipped with up to 2TB of memory. This is a gaping hole for AMD because they don't have a solution for it without going to low-frequency EPYC parts. Take the Mac Pro, for example: AMD can't touch it due to TR's extremely limited RAM capacity, even though the CPUs rofl-stomp the Intel parts all day long.
 
Joined
Dec 31, 2009
Messages
18,515 (4.71/day)
You need to stop acting like your opinion is definitive. Workstations can be equipped with up to 2TB of memory. This is a gaping hole for AMD because they don't have a solution for it without going to low-frequency EPYC parts. Take the Mac Pro, for example: AMD can't touch it due to TR's extremely limited RAM capacity, even though the CPUs rofl-stomp the Intel parts all day long.
Lol, I'm not saying my word is gospel... I just don't agree with the severity with which you say there is a hole, nor with defining their RAM capacity as 'extremely limited'. It's that simple.
 