
NVIDIA Tesla K40 "Atlas" Compute Card Detailed

btarunr

Editor & Senior Moderator
NVIDIA is readying its next big single-GPU compute accelerator, the Tesla K40, codenamed "Atlas." A company slide leaked to the web by Chinese publication ByCare reveals its specifications. The card is based on the new GK180 silicon. We've never heard of this one before, but going by the limited specifications at hand, it doesn't look too different on paper from the GK110. It features 2,880 CUDA cores.

The card itself offers over 4 TFLOP/s of peak single-precision floating-point performance, and over 1.4 TFLOP/s double-precision. It ships with 12 GB of GDDR5 memory, double that of the Tesla K20X, with a memory bandwidth of 288 GB/s. The card appears to feature a dynamic overclocking feature, which works on ANSYS and AMBER workloads. The chip is configured to take advantage of the PCI-Express gen 3.0 system bus. The card will be available in two form-factors, add-on card and SXM, with maximum power draw rated at 235 W or 245 W, respectively.



 
Sounds like a later-revision GK110 judging by the specs, if NVIDIA can fit a fully enabled GPU, 12 GB of ECC GDDR5, and a likely clock bump into 235-245 W... although even with no clock bump it would still be over 4 TFLOPS FP32 / 1.4 TFLOPS FP64 (732 MHz × 2 ops/clock × 2880 = 4,216 GFLOPS FP32 / 1,405 GFLOPS FP64).
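If anyone wants to redo that back-of-the-envelope arithmetic, here's a quick Python sketch. The 732 MHz clock is taken from the post above, and the 1:3 FP64 rate is the standard GK110 ratio, both assumptions rather than confirmed K40 specs:

```python
# Sanity-check of the throughput numbers quoted above (assumed: 732 MHz
# base clock, fully enabled 2,880-core die, GK110-style 1:3 FP64 rate).
clock_hz = 732e6        # assumed base clock from the post
cuda_cores = 2880       # fully enabled GK110-class die
flops_per_clock = 2     # one fused multiply-add counts as 2 FLOPs

fp32_tflops = clock_hz * flops_per_clock * cuda_cores / 1e12
fp64_tflops = fp32_tflops / 3  # GK110 runs FP64 at one third the FP32 rate

print(f"FP32: {fp32_tflops:.2f} TFLOP/s")  # ~4.22
print(f"FP64: {fp64_tflops:.2f} TFLOP/s")  # ~1.41
```

That lands comfortably over the "4 TFLOP/s single / 1.4 TFLOP/s double" figures on the slide even without a clock bump.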
 
NVIDIA's new series??

Guys, any news about the NVIDIA GTX 800 series? Release date or any specs?
 
Guys, any news about the NVIDIA GTX 800 series? Release date or any specs?

No. Maxwell could arrive as early as the end of Q1 2014, or later.
 
12 GB of memory, that thing will be a beast! I'm interested in a 4K setup, so I'm looking for a single-card solution to power it; maybe this will be the one.
 
GK180 literally makes no sense from the code name scheme.


12 GB of memory, that thing will be a beast! I'm interested in a 4K setup, so I'm looking for a single-card solution to power it; maybe this will be the one.
This is a Tesla; you won't be powering any 4K setups with it.
 
GK180 literally makes no sense from the code name scheme.



This is a Tesla; you won't be powering any 4K setups with it.

^ this.

This is pure compute - it has no real-world consumer benefit to gamers. And you don't need 12 GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4 GB of memory will be more than enough.
 
No. Maxwell could arrive as early as the end of Q1 2014, or later.

I have a GTX 670 DCII and was planning to upgrade to a 770 DCII, but I don't know whether to go for it or not, because the difference between those cards is only about 10 frames.
Help me :D
 
And you don't need 12 GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4 GB of memory will be more than enough.

Well now, 4 GB of VRAM for 4K is like 1 GB of VRAM for 1080p. In new games, or heavily modded ones, even 2 GB of VRAM is no longer enough for 1080p. Meaning that for 4K gaming you need at least 8 GB of VRAM...
 
Well now, 4 GB of VRAM for 4K is like 1 GB of VRAM for 1080p. In new games, or heavily modded ones, even 2 GB of VRAM is no longer enough for 1080p. Meaning that for 4K gaming you need at least 8 GB of VRAM...

AMD doesn't think so, and they brought us Eyefinity; argue with them. This is from their R9 290X promo:

In fact, I'm sure if I could summon the mighty W1zzard he'll say why you're very wrong about 8GB for 4K.

[Image: slide from AMD's R9 290X promo]
 
Wow, what? prima vera, did you think VRAM requirements scale linearly with resolution? A ton of the used space is textures.

Let's take BF3 or BF4 for example: on ultra, it's probably going past 1.5 GB before any resolution is added to the equation.

You can calculate how much space a frame takes up: at 32-bit colour that's 4 bytes per pixel, about 8.3 MB for a 1080p frame.

By the way, 4K is just 4x 1080p, and we've been doing 3x 1080p for years with Eyefinity/Surround, so it's not that massive of a jump (in processing usage).
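For anyone who wants to redo the frame-size arithmetic, a minimal sketch assuming the usual 32-bit colour (4 bytes per pixel), which is where the ~8.3 MB figure comes from:

```python
# Frame-buffer size at 32-bit colour (4 bytes per pixel) -- the figure
# the post above was reaching for.
def frame_bytes(width, height, bytes_per_pixel=4):
    """Raw size of one colour buffer in bytes."""
    return width * height * bytes_per_pixel

MB = 1e6
print(f"1080p frame: {frame_bytes(1920, 1080) / MB:.1f} MB")  # 8.3 MB
print(f"4K frame:    {frame_bytes(3840, 2160) / MB:.1f} MB")  # 33.2 MB
```

So a single 4K frame is roughly 33 MB, exactly 4x the 1080p figure, which is tiny next to the texture budget.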
 
I have seen Arma 3 take up to 1.6 GB with no mods, and games like Skyrim even more, so I can see 8 GB being used easily.

The TPU OSD with GPU-Z will give all the info needed in-game.
 
Wow, what? prima vera, did you think VRAM requirements scale linearly with resolution? A ton of the used space is textures.

Let's take BF3 or BF4 for example: on ultra, it's probably going past 1.5 GB before any resolution is added to the equation.

You can calculate how much space a frame takes up: at 32-bit colour that's 4 bytes per pixel, about 8.3 MB for a 1080p frame.

By the way, 4K is just 4x 1080p, and we've been doing 3x 1080p for years with Eyefinity/Surround, so it's not that massive of a jump (in processing usage).
The frame buffer alone is small, that's true. But you have to process many more pixels and apply those textures to them. If you use any form of antialiasing, you'll surely need more memory, and of course mods increase the need further.
 
Going from 1080p to 4K needs practically no additional memory. A single 4K frame needs 3840x2160 pixels at 32-bit colour, roughly 32 MB. Add in multiple frames, back-buffers, over-draw, Z-stencilling etc., and you can probably deliver the same experience in 4K with just an extra 256 MB.

But that means nothing else changes. You are using the same texture maps as the 1080p image. So you get a greater FOV, or a bigger picture, but the same quality as 1080p.

If you want to bump up the quality of the picture to UHD+, then you are going to need new texture maps. That means more RAM on the GPU, but also a much greater install size for the game (bumping up textures 2-4x).

So yes, a 2 GB card is enough for 4K using legacy 1080p textures, but you'd need 8 GB for UHD texture resources.
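A rough sketch of the render-target arithmetic above; the buffer counts (triple-buffered colour plus one depth/stencil buffer) are illustrative assumptions, not something from the post:

```python
# Rough extra VRAM needed by the render targets alone when stepping up
# from 1080p to 4K. The buffer counts (triple-buffered colour plus one
# 32-bit depth/stencil buffer) are illustrative assumptions.
def render_target_bytes(width, height, colour_buffers=3, depth_buffers=1,
                        bytes_per_pixel=4):
    return width * height * bytes_per_pixel * (colour_buffers + depth_buffers)

MB = 1e6
extra = (render_target_bytes(3840, 2160) - render_target_bytes(1920, 1080)) / MB
print(f"Extra render-target memory at 4K: ~{extra:.0f} MB")  # ~100 MB
```

Even with those buffers counted, the difference is on the order of 100 MB, well inside the "extra 256 MB" headroom mentioned above.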
 
Going from 1080p to 4K needs practically no additional memory. A single 4K frame needs 3840x2160 pixels at 32-bit colour, roughly 32 MB. Add in multiple frames, back-buffers, over-draw, Z-stencilling etc., and you can probably deliver the same experience in 4K with just an extra 256 MB.

But that means nothing else changes. You are using the same texture maps as the 1080p image. So you get a greater FOV, or a bigger picture, but the same quality as 1080p.

If you want to bump up the quality of the picture to UHD+, then you are going to need new texture maps. That means more RAM on the GPU, but also a much greater install size for the game (bumping up textures 2-4x).

So yes, a 2 GB card is enough for 4K using legacy 1080p textures, but you'd need 8 GB for UHD texture resources.

Awesome answer :toast:

It sounds like it makes total sense. In other words, in the future a game like BF5 with 4K-resolution texture maps would take upwards of 100 GB of drive space for all the locations and ultra-HD texture maps? Whereas everything programmed now has a set 'density' of texture, and as such at 4K resolution will just be scaled up?

So that makes everybody right!
 
The biggest problem in creating a 4k setup is the monitor itself. I wouldn't mind spending ~$1000 for a couple of 290s, but $3500 for a monitor is just crazy.
 
^ this.

This is pure compute - it has no real-world consumer benefit to gamers. And you don't need 12 GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4 GB of memory will be more than enough.

I challenge that claim when you are running native 4096x maps and Skyrim with texture and ENB mods.
 
I dunno... Mass Effect 3 with 4K textures at 1080p easily uses 2 GB of VRAM, up from ~500 MB with default textures... I haven't played ME3 at 4K resolution, but with 4K textures and some SMAA, I think it can take up to 4 GB of VRAM easily. Same with Skyrim on 4K textures...
 
^ I'm not exactly sure what's right anymore lol.

Lemonsoda says the textures themselves shouldn't take any extra memory even when moving to 4K, and that's true. The frame buffer gets bigger, but that's a paltry 32 MB per frame. Even with triple buffering there should be plenty of VRAM to spare. Kinda makes sense, but then there are other things to consider...

If the primitives in a scene become larger dimensionally, then shader operations on those primitives will take up more RAM, no matter that the texture data remains the same size. Each of the stages in the DX11 pipeline has to store data for each primitive, not just the final rendered scene. That means 4x the number of pixels for each triangle, and in a complex scene with lots of triangles that suddenly became bigger, it could easily mean more than 4x the total VRAM required.
 
No. Maxwell could arrive as early as the end of Q1 2014, or later.

Not at all; the 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its shoddier brothers are probably NVIDIA's answer to the R9 290X. Well, that and bribing anyone they can.
 
Not at all; the 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its shoddier brothers are probably NVIDIA's answer to the R9 290X. Well, that and bribing anyone they can.

My post was completely true, unless it comes out early in Q1 2014. I said "end of Q1 2014 or later."

Besides, where is the source for your 2015 release date? Don't say SemiAccurate. Other places, just as unprovable, say Q1 2014.
 
I'm afraid this card is going to be $1,000+.

So not very interesting to me.
 
Not at all; the 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its shoddier brothers are probably NVIDIA's answer to the R9 290X. Well, that and bribing anyone they can.
C'mon, where are the neg stats to go with the general neg? STOP SHORTCHANGING ME!
Are you so negative on AMD that you'll jump on any possible neg stat?
:rolleyes:
 
C'mon, where are the neg stats to go with the general neg? STOP SHORTCHANGING ME!

:rolleyes:

No neg stats, just opinion and rumour :p

For a start, TSMC is ramping expenditure on 20 nm; in layman's terms that means "shut up already, we're on it, I've got the stuff on the way and we have it sussed, honest... err, about those yields, though."

They're not shouting very loudly about TSV, 2.5D, or the memory cube though, are they?
 