Monday, October 7th 2013

NVIDIA Tesla K40 "Atlas" Compute Card Detailed

NVIDIA is readying its next big single-GPU compute accelerator, the Tesla K40, codenamed "Atlas." A company slide leaked to the web by Chinese publication ByCare reveals its specifications. The card is based on the new GK180 silicon. We've never heard of this chip before, but going by the limited specifications at hand, it doesn't look too different on paper from the GK110. It features 2,880 CUDA cores.

The card itself offers over 4 TFLOP/s of peak single-precision floating-point performance, and over 1.4 TFLOP/s double-precision. It ships with 12 GB of GDDR5 memory, double that of the Tesla K20X, with a memory bandwidth of 288 GB/s. The card appears to feature a dynamic overclocking feature, which works on ANSYS and AMBER workloads. The chip takes advantage of the PCI-Express gen 3.0 system bus. The card will be available in two form-factors, add-on card and SXM, with maximum power draw rated at 235 W and 245 W, respectively.
Sources: ByCare, WCCFTech

25 Comments on NVIDIA Tesla K40 "Atlas" Compute Card Detailed

#1
HumanSmoke
Sounds like a later-revision GK110, judging by the specs, if NVIDIA can fit a fully enabled GPU, 12GB of ECC GDDR5, and a likely clock bump into 235-245W... although even with no clock bump it would still be over 4 TFLOPS FP32 / 1.4 TFLOPS FP64 (732 MHz * 2 ops/clock * 2,880 cores = 4,216 GFLOPS FP32 / 1,405 GFLOPS FP64).
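For anyone wanting to check those back-of-the-envelope numbers, here's a quick sketch, assuming the rumoured 732 MHz clock, 2 FLOPs per CUDA core per clock (one fused multiply-add), and GK110's FP64 units running at 1/3 the FP32 rate:

```python
# Peak throughput estimate for a GK110-class GPU (rumoured Tesla K40 specs)
cuda_cores = 2880
clock_hz = 732e6        # rumoured clock, 732 MHz
ops_per_clock = 2       # one fused multiply-add = 2 FLOPs per core per clock

fp32_tflops = cuda_cores * clock_hz * ops_per_clock / 1e12
fp64_tflops = fp32_tflops / 3   # GK110 FP64 rate is 1/3 of FP32

print(f"FP32: {fp32_tflops:.2f} TFLOP/s")  # ~4.22 TFLOP/s
print(f"FP64: {fp64_tflops:.2f} TFLOP/s")  # ~1.41 TFLOP/s
```

That lines up with the "over 4 TFLOP/s / over 1.4 TFLOP/s" figures in the slide.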
Posted on Reply
#2
mitil
NVIDIA new series??

Guys, any news about the NVIDIA GTX 800 series? Release date or any specs?
Posted on Reply
#3
the54thvoid
Intoxicated Moderator
mitilGuys, any news about the NVIDIA GTX 800 series? Release date or any specs?
No. Maxwell could be as early as end 1st quarter 2014 or later.
Posted on Reply
#4
dom99
12GB of memory, that thing will be a beast! I'm interested in a 4k setup so am looking for a single card solution to power it, maybe this will be the one.
Posted on Reply
#5
esrever
GK180 literally makes no sense from the code name scheme.
dom9912GB of memory, that thing will be a beast! I'm interested in a 4k setup so am looking for a single card solution to power it, maybe this will be the one.
This is a Tesla; you won't be powering any 4K setups with it.
Posted on Reply
#6
the54thvoid
Intoxicated Moderator
esreverGK180 literally makes no sense from the code name scheme.

This is a tesla, you won't be powering any 4k setups with it.
^ this.

This is pure compute - it has no real-world consumer benefit to gamers. And you don't need 12GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4GB of memory will be more than enough.
Posted on Reply
#7
mitil
the54thvoidNo. Maxwell could be as early as end 1st quarter 2014 or later.
I have a GTX 670 DCII and was planning to upgrade to a 770 DCII, but I don't know whether to go for it, because the difference between those cards is only about 10 frames.
Help me :D
Posted on Reply
#8
Prima.Vera
the54thvoidAnd you don't need 12GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4GB of memory will be more than enough.
Well now, 4GB of VRAM for 4K is like 1GB of VRAM for 1080p. For new games, or heavily modded ones, even 2GB of VRAM is not enough anymore at 1080p. Meaning that for 4K gaming you need at least 8GB of VRAM...
Posted on Reply
#9
the54thvoid
Intoxicated Moderator
Prima.VeraWell now, 4GB of VRAM for 4K is like 1GB of VRAM for 1080p. For new games, or heavily modded ones, even 2GB of VRAM is not enough anymore at 1080p. Meaning that for 4K gaming you need at least 8GB of VRAM...
AMD doesn't think so, and they brought us Eyefinity; argue with them. This is from their R9 290X promo.

In fact, I'm sure if I could summon the mighty W1zzard he'll say why you're very wrong about 8GB for 4K.

Posted on Reply
#10
kn00tcn
Wow, what? Prima.Vera, did you think VRAM requirements scale linearly with resolution? A ton of the used space is textures.

Let's take BF3 or 4 for example: on ultra, it's probably going past 1.5GB before any resolution is added to the equation.

You can calculate how much space a frame takes up (32-bit colour, 4 bytes per pixel, about 8.3MB for a 1080p frame).

By the way, 4K is just 4x 1080p, and we've been doing 3x 1080p for years with Eyefinity/Surround, so it's not that massive of a jump (in processing usage).
Posted on Reply
#11
AsRock
TPU addict
I have seen Arma 3 take up to 1.6GB with no mods, and games like Skyrim even more, so I can see 8GB easily.

TPU's OSD with GPU-Z will give all the info needed in-game.
Posted on Reply
#12
Filiprino
kn00tcnWow, what? Prima.Vera, did you think VRAM requirements scale linearly with resolution? A ton of the used space is textures.

Let's take BF3 or 4 for example: on ultra, it's probably going past 1.5GB before any resolution is added to the equation.

You can calculate how much space a frame takes up (32-bit colour, 4 bytes per pixel, about 8.3MB for a 1080p frame).

By the way, 4K is just 4x 1080p, and we've been doing 3x 1080p for years with Eyefinity/Surround, so it's not that massive of a jump (in processing usage).
The frame buffer alone is small, that's true. But you have to shade many more pixels and apply those textures to them. If you use any form of antialiasing, then sure, you'll need more memory, and of course mods increase the memory needed even further.
Posted on Reply
#13
lemonadesoda
Going from 1080p to 4K needs practically no additional memory. A single 4K frame at 32-bit colour needs about 32MB. Add in multiple frames, back-buffers, over-draw, Z-stencilling etc. and you can probably deliver the same experience in 4K with just an extra 256MB.

But that means nothing else changes. You are using the same texture maps as the 1080p image. So you can get greater FOV or you get a bigger picture but same quality as 1080p.

If you want to bump up the quality of the picture to UHD+, then you are going to need new texture maps. That means more RAM on the GPU, but also much greater install size of the game. (Bumping up textures 2-4x).

So yes, a 2GB card is enough for 4K using legacy 1080p textures, but you'd want 8GB or more for UHD texture resources.
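The texture-map point is where the memory really goes; a rough sketch, assuming uncompressed RGBA textures (a simplification, since real games use block-compressed formats like DXT/BCn) and a full mipmap chain:

```python
def texture_mb(side, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM for one square RGBA texture, optionally with mipmaps."""
    base = side * side * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level
    return base * (4 / 3 if mipmaps else 1) / 1e6

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mb(side):.1f} MB")
# Each doubling of texture resolution costs ~4x the memory, which is why
# UHD texture packs balloon both VRAM use and a game's install size.
```

That quadrupling per resolution step is exactly the "2-4x" bump in texture assets described above.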
Posted on Reply
#14
the54thvoid
Intoxicated Moderator
lemonadesodaGoing from 1080p to 4K needs practically no additional memory. A single 4K frame at 32-bit colour needs about 32MB. Add in multiple frames, back-buffers, over-draw, Z-stencilling etc. and you can probably deliver the same experience in 4K with just an extra 256MB.

But that means nothing else changes. You are using the same texture maps as the 1080p image. So you can get greater FOV or you get a bigger picture but same quality as 1080p.

If you want to bump up the quality of the picture to UHD+, then you are going to need new texture maps. That means more RAM on the GPU, but also much greater install size of the game. (Bumping up textures 2-4x).

So yes, a 2GB card is enough for 4K using legacy 1080p textures, but you'd want 8GB or more for UHD texture resources.
Awesome answer :toast:

It sounds like it makes total sense. In other words, in the future a game like BF5 with 4K-resolution texture maps would take upwards of 100GB of drive space for all the locations and ultra-HD texture maps? Whereas everything programmed now has a set 'density' of texture, and as such at 4K resolution will just be scaled up?

So that makes everybody right!
Posted on Reply
#15
hardcore_gamer
The biggest problem in creating a 4k setup is the monitor itself. I wouldn't mind spending ~$1000 for a couple of 290s, but $3500 for a monitor is just crazy.
Posted on Reply
#16
dom99
hardcore_gamerThe biggest problem in creating a 4k setup is the monitor itself. I wouldn't mind spending ~$1000 for a couple of 290s, but $3500 for a monitor is just crazy.
I have seen a particularly cheap 4K TV/monitor here

www.amazon.com/dp/B00BXF7I9M/?tag=tec06d-20

I'm waiting for it to come to the UK, but for the price it seems like an affordable 4K setup; they also make a cheaper 39-inch version.
Posted on Reply
#17
newconroer
the54thvoid^ this.

This is pure compute - it has no real-world consumer benefit to gamers. And you don't need 12GB of memory for 4K. In fact, the R9 290X is being touted by AMD as the proper 4K-ready card; in other words, 4GB of memory will be more than enough.
I challenge that claim when you are running native 4096x maps and Skyrim with texture and ENB mods.
Posted on Reply
#18
Prima.Vera
I dunno... Mass Effect 3 with 4K textures at 1080p easily uses 2GB of VRAM, up from ~500MB with default textures... I haven't played ME3 at 4K resolution, but with 4K textures and some SMAA, I think it can easily take up to 4GB of VRAM. Same with Skyrim on 4K textures...
Posted on Reply
#19
jihadjoe
^ I'm not exactly sure what's right anymore lol.

Lemonadesoda says the textures themselves shouldn't take any extra memory when moving to 4K, and that's true. The frame buffer gets bigger, but that's a paltry 32MB per frame. Even with triple buffering there should be plenty of VRAM to spare. Kinda makes sense, but then there are other things to consider...

If the primitives in a scene become larger dimensionally, then shader operations on those primitives will take up more RAM, even though the texture data remains the same size. Each of the stages in the DX11 pipeline has to store data for each primitive, not just the final rendered scene. That means 4x the number of pixels for each triangle, and in a complex scene with lots of triangles that suddenly became bigger, it could easily mean more than 4x the total VRAM required.
Posted on Reply
#20
TheoneandonlyMrK
the54thvoidNo. Maxwell could be as early as end 1st quarter 2014 or later.
Not at all. The 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its cut-down siblings are probably NVIDIA's answer to the R9 290X; well, that and bribing anyone they can.
Posted on Reply
#21
the54thvoid
Intoxicated Moderator
theoneandonlymrkNot at all. The 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its cut-down siblings are probably NVIDIA's answer to the R9 290X; well, that and bribing anyone they can.
My post was completely true, unless it comes out early in Q1 2014. I said, "end 1st quarter 2014 or later."

Besides, where is the source for your 2015 release date? Don't say SemiAccurate. Other places, just as unprovable, say Q1 2014.
Posted on Reply
#22
PopcornMachine
I'm afraid this card is going to be $1,000+.

So not very interesting to me.
Posted on Reply
#23
HumanSmoke
theoneandonlymrkNot at all. The 800 series will be clock-bumped rebrands, as Maxwell is not expected before 2015 in most circles (Q4 2014 if we and NVIDIA are lucky), hence the compute card you're seeing. Its cut-down siblings are probably NVIDIA's answer to the R9 290X; well, that and bribing anyone they can.
C'mon, where's the neg stats to go with general neg? STOP SHORTCHANGING ME!
theoneandonlymrkAre you so neg on Amd you will jump on any possible neg stat.
:rolleyes:
Posted on Reply
#24
TheoneandonlyMrK
HumanSmokeC'mon, where's the neg stats to go with general neg? STOP SHORTCHANGING ME!

:rolleyes:
No neg stats, just opinion and rumour :p

For a start, TSMC is ramping expenditure on 20nm; in layman's terms, that means "shut up already, we're on it, I've got the stuff on the way and we have it sussed, honest... err, about yields, though."

They're not shouting about TSVs or 2.5D or memory cubes very loudly though, are they?
Posted on Reply
#25
HumanSmoke
theoneandonlymrkNo neg stats, just opinion and rumour :p
For a start, TSMC is ramping expenditure on 20nm; in layman's terms, that means "shut up already, we're on it, I've got the stuff on the way and we have it sussed, honest... err, about yields, though."
They're not shouting about TSVs or 2.5D or memory cubes very loudly though, are they?
Absence of evidence is not evidence of absence.

If you're expecting daily updates on process fabbing, you're shit out of luck, since process isn't sexy enough to warrant front-page press on mainstream sites - but here's the piece about TSMC's award of BDA's platform partnership, and the design flows for 16nm FinFET (20nm HKMG), to tide you over.
Posted on Reply