
Dell Workstation Owners Club

Wow, nice deal on one of those! I'd love to have one myself, but this year is kicking my ass financially and it's barely started. :(
 
Yeah, so far I'm roughly 1,280.00 CAD invested. Mind you, the GPU cost me 850.00 CAD second hand, but it still has 11 months of store warranty and 4 years of manufacturer warranty, so it was worth the purchase since they usually sell new for 1,395.00 CAD plus tax.

If you're in Canada, more specifically in Alberta, there are places in both Edmonton and Calgary that sell used enterprise equipment fairly cheap. The inventory is always changing, though.

I also sold my previous gaming desktop to fund this project. It might have had a better CPU in terms of single-thread performance (i9-11900KF), but I need the core count for multithreaded workloads (VMs). Once they come down in price, I'll be upgrading the CPUs again to higher core count, higher clocked Gold CPUs like the Gold 6250: 4.5 GHz max turbo, 3.9 GHz base, and 8 cores/16 threads vs. my current 6/12.

I do wish it were the 7920, though, because it would be nice to get dual 120 mm AIOs mounted on the CPUs for lower thermals. I'll be keeping an eye out for a barebones unit to swap everything into, and then sell the 7820 to someone for a fair price.
 
Ouch, that's a pricey GPU! I'm in a small town in West Virginia, USA, so there aren't many places other than pawn shops for used computer stuff here...
 
I have no experience with this yet, but historically registry edits are needed to get the Tesla MPU recognized as a GPU (even though the MPU is physically a GPU). I got the Tesla P100 because Nvidia released a gaming driver for that card which eliminates that step. I think you just need to find the right forum and ask the right questions.
I did run across a forum post where someone mentioned he was able to get the K80 (or possibly the M40, IIRC) to boot by adding some lines to the GRUB file, but I'm having a hell of a time finding it again. Anyway, thanks for the reply. That system you posted with the P100 looks pretty dope; I've always been a fan of those older-style workstation cases.
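The lines I keep seeing suggested for big-BAR Tesla cards on boards without Above 4G Decoding are the PCI reallocation flags on the kernel command line. No idea if that's what the post I lost actually used, so treat this as something to experiment with rather than a known fix:

    # /etc/default/grub - ask the kernel to reassign PCI resources at boot
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pci=realloc"
    # apply the change and reboot
    sudo update-grub

Some people also add pci=assign-busses, but results seem to be very board-specific.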
 
Does anyone know if the T5600 has the "Above 4G Decoding" option in its BIOS, or if it's called something different? I am trying to boot with a Tesla K80 into Ubuntu; I have sufficient power, the proper 8-pin cable, and cooling. I have been at this off and on for a while now. While down the rabbit hole I read one forum post where someone was able to get it to boot, but of course they don't say how. If anyone knows the black magic required to get this combo to work, I would love to know. :confused:

I doubt it. My T3600 (same generation) doesn't, and I have several newer generation Precisions (a T5810, 2 T5820s, and an R3930) and none of them have it either. I'm tempted to try hacking a BIOS and flashing it (I want resizable BAR) but I don't know if I want to bother using one of those flashing devices.
 
I actually just tracked down a post on pg. 115 where masterdeejay modded the T5810 BIOS: "Dell T5810 workstation modded bios with turbo hack and rebar enabled". I have a CH341A programmer, but he posted something about cards being limited to 8GB of VRAM?? I may not have understood that part fully. IDK, I have been trying for months to put together a workstation/server (at a reasonable price) that will easily support both Maxwell and Pascal Tesla cards.
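If I do end up flashing with the CH341A, the usual flashrom routine is to dump the stock image a couple of times and compare the dumps before writing anything. This is just the generic pattern (modded_t5810.bin is a placeholder name), assuming flashrom recognizes the flash chip on the board:

    # read the existing BIOS twice and make sure the dumps match
    sudo flashrom -p ch341a_spi -r backup1.bin
    sudo flashrom -p ch341a_spi -r backup2.bin
    diff backup1.bin backup2.bin   # no output means the reads were consistent
    # write the modded image; flashrom verifies it after writing
    sudo flashrom -p ch341a_spi -w modded_t5810.bin

That way there's always a known-good dump to flash back if the mod doesn't boot.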
 
So wait, is it a BIOS limitation where GPUs that have more than 8GB won't work right?? Or is it just reduced speeds or whatever??
 
I don't think it's a VRAM issue. The Quadro and Tesla cards that have more than 8GB all work fine (or at least they're supposed to) in the T5810. It's got to be something else.
 
prolly speeds or something...
 
So wait, is it a BIOS limitation where GPUs that have more than 8GB won't work right?? Or is it just reduced speeds or whatever??
The Tesla is an MPU, so the limit might not apply to it, but it may be real for GPUs. Not sure how a 16GB Tesla used as a GPU would play out.
 
I doubt it. My T3600 (same generation) doesn't, and I have several newer generation Precisions (a T5810, 2 T5820s, and an R3930) and none of them have it either. I'm tempted to try hacking a BIOS and flashing it (I want resizable BAR) but I don't know if I want to bother using one of those flashing devices.
I think Dell calls it "Memory Map IO above 4GB". It's found under "System Setup Options" on the T5610 and later.
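Once it's enabled, you can sanity-check from Linux that the card's large BAR actually landed above the 4GB boundary: any 64-bit prefetchable region at an address above ffffffff means the decode is working. The 03:00.0 below is just a placeholder; substitute your card's bus address:

    # list the card's memory regions; addresses longer than 8 hex digits are above 4GB
    sudo lspci -vv -s 03:00.0 | grep -i "Region"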
 

You're right - just found it (already enabled) on one of my 5820s. I'll have to check my others when I get home from work. Thanks!

Now if only Dell would add rebar support - they still sell the 5820 brand new, after all...
 
My T5500 has failed... after much ado (swapping CPUs, RAM, GPUs, PSU, ...), I've ruled out everything except the motherboard.


So I got my next T5500; this one has a single CPU. So far it works, but I've run into a strange thing.
After removing the obsolete FX 1800 GPU and putting in an M5000, two things happen:
  1. no BIOS screen is shown
  2. after automatic login, the screen goes black... the only way to get the picture back is to unplug and replug the DisplayPort cable on the GPU.
Has anybody had a similar problem on a Dell?
 
Have you updated the BIOS?
Is the new motherboard revision the same as the original?
Both use BIOS version A18 now, which is the latest from Dell.
I thought it was that, so I upgraded from A8 to A10 to A18 on the new board.

No, the two boards are not the same revision. I can check that later...
 
Then they should be behaving identically. You might have missed another hardware problem somewhere.
EDIT: This suggestion presumes the BIOS settings have been set identically and did not automatically revert to defaults, which they can under certain situations.
 
Is there a way to set up the BIOS settings from within Windows, so the machine can be booted with different settings?
 
The 2nd CPU riser card/socket is known to have issues on the T5500. Whether that can take down the motherboard, IDK.
I've had to use a GPU driver removal program in the past when swapping GPUs, usually when going from AMD to Nvidia. Did you try booting into Safe Mode to see if basic VGA output is working?
 
Rebooted it with the old card, got into the BIOS, and checked that everything is OK.

Now it boots OK, but only to a 1024x768 display... will check if it keeps happening.

Damn, I don't know what is going on here?! :confused:
 
Perhaps the card edge connectors were not completely seated in the slot? Maybe they need cleaning?

Either way, glad you got it back up and running! :toast:
 
My guess is that you're only getting the BIOS video resolution. I would try removing and reinstalling the video drivers. The driver software package may be the same version for both cards, but the settings for each GPU can vary. I would look for a GPU driver removal tool.
 
So wait, is it a BIOS limitation where GPUs that have more than 8GB won't work right?? Or is it just reduced speeds or whatever??

I think Dell calls it "Memory Map IO above 4GB". It's found under "System Setup Options" on the T5610 and later.

The Tesla is an MPU, so the limit might not apply to it, but it may be real for GPUs. Not sure how a 16GB Tesla used as a GPU would play out.
I'm a bit late, but I've been exploring the xCuri0 / ReBarUEFI project and am wondering if anyone has experience using it with a T5600/T3600, etc.

Additionally, I'm trying to move away from the cloud for my machine learning projects and am seeking a cost-effective platform suitable for 2x Tesla M40 GPUs. I already have a cooling solution for the cards, but any platform suggestions or experience with similar setups would be greatly appreciated.
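For reference, the way I've been checking how much BAR1 aperture a platform actually hands each card (which is where Above 4G decoding / ReBAR comes into play) is just the generic nvidia-smi query; nothing board-specific about it:

    # "BAR1 Memory Usage -> Total" shows the aperture size the BIOS assigned to the card
    nvidia-smi -q -d MEMORY | grep -A 3 "BAR1"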
 
One thing to consider is whether older CPUs have all of the instruction sets needed for newer programs; that would be a deal breaker. MPU drivers for other apps may need math instruction sets that gamers don't have to consider. MPUs and technical GPUs tend to have ECC RAM on the card, and it's a given that these workstations will support that.

Since my experience is based on older computers, what I'm going to say here is an opinion based on past practice and observation. These are the things I would look for on newer computers, just to a higher degree based on newer standards.
GPUs use system RAM to store things like textures; the amount reserved is usually equal to the GPU's RAM capacity. They reserve memory addresses from the top down, and they can access memory beyond the RAM address capacity of the OS; the chipset limit is what actually applies.
Example 1 - Running 32-bit Win7 there was a 3.5GB OS RAM limit. That worked fine with the typical 256/512MB GPUs of the time, but a 2GB GTX 750 was a problem. Adding 2GB of RAM (going to 6GB) let the GPU and the OS each have their own range of memory addresses (from opposite ends), resolving the so-called GPU bottleneck. (The computer itself had an actual 8GB RAM capacity; 4GB per the Dell spec sheet.)

Example 2 - On another system with an 8GB RAM capacity I had the same GPU "bottleneck" with 64-bit Win7 and a 4GB GTX 1050. Instead of backing down to a 32-bit OS, I upgraded the GPU to a 3GB GTX 1060 and, based on the same principle, got a huge performance boost. It seems 5GB is enough for 64-bit Win7. I did run the full 8GB instead of 6GB.

Based on these observations I suspect a couple of things:
1 - The BIOS setting for high GPU resolution keeps GPUs with over 8GB of capacity from reserving more than 8GB of system memory unless it is activated. It's a guess; I don't have any parts to test this.
2 - The MPU can access much more system memory than a GPU (4x in the case of the 16GB Tesla P100); it's in the spec sheet. The GPU driver may restore the normal 16GB "limit". Another guess.
3 - My opinion: since you mention dual Tesla MPUs, you might need the 2-CPU system to get enough RAM capacity for what you want to do. That's if you don't need a newer CPU for other reasons.

On the T3600/T5600 Dell went to a proprietary PSU. That alone would probably make the T5600 the better choice, since it avoids hacking the bigger PSU into the T3600. You may need the 2nd CPU power cable for the 2nd Tesla MPU anyway, and you need to understand the Dell multi-rail PSU to do this sort of thing.
 
My T3600 has a 635W PSU. They offered a lower wattage one as well, but I bought the one with the 635W unit.
 
According to Dell's T3600/5600 manuals, their BIOS settings provide no option to activate MMIO above 4GB.
Then there is the motherboard firmware being spread across two chips (same with the later T3610/5610), which leads to headaches.
Search the level1/winraid forums for T3610 BIOS modifications for examples.

TPU member "masterdeejay" reported having successfully added Resizable BAR to the T5810's (X99/C612 chipset) BIOS.

AFAIK, the 3600/3610/5600/5610/7600/7610 PSUs are interchangeable between the 36xx and 56xx workstations, whilst one can't benefit from the 76xx 1300W PSU's additional graphics card PCIe connectors.

Any reason to enable ReBAR when the Tesla M40 doesn't support it?
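For anyone unsure about a given card, lspci will print a Resizable BAR capability block (with the supported sizes) only if the GPU advertises it; if nothing shows up, a ReBAR-modded BIOS won't buy you anything for that card. Replace 03:00.0 with your card's bus address:

    # prints the Resizable BAR capability and its supported sizes, if the card has one
    sudo lspci -vvv -s 03:00.0 | grep -A 4 -i "resizable bar"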
 