Monday, December 19th 2011

NVIDIA Kepler To Do Away with Hotclocks

Since the days of NVIDIA's very first DirectX 10 GPUs, the company has used separate clock domains for the shaders and the rest of the GPU (the geometry domain). Over the past few generations, the shader clock has been set at twice the geometry clock. 3DCenter.org has learned that with the next-generation "Kepler" family of GPUs, NVIDIA will do away with this "hotclock" principle: the heavy number-crunching parts of the GPU, the CUDA cores, will run at the same clock speed as the rest of the chip.

3DCenter also reports that NVIDIA will use higher core clocks overall. The clock speed of the GK104, for example, is expected to be set "well above 1 GHz", yielding compute power "clearly over 2 TFLOPs" (3DCenter's words). It looks like NVIDIA, too, will have some significant architectural changes up its sleeve with Kepler.

Source: 3DCenter.org
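The "over 2 TFLOPs" figure follows from simple arithmetic: each CUDA core can retire one fused multiply-add (2 FLOPs) per shader-clock cycle. A minimal sketch of that estimate, comparing a hotclocked Fermi part (GTX 580: 512 cores, 772 MHz core / 1544 MHz shader) against a purely hypothetical unified-clock Kepler configuration; the Kepler core count and clock below are illustrative assumptions, not confirmed specs:

```python
def tflops(cuda_cores: int, shader_clock_ghz: float) -> float:
    """Peak FP32 throughput: cores x 2 FLOPs (one FMA) per cycle."""
    return cuda_cores * 2 * shader_clock_ghz / 1000.0

# Fermi with hotclock: shaders run at 2x the 0.772 GHz core clock.
fermi = tflops(512, 2 * 0.772)   # ~1.58 TFLOPS

# Hypothetical unified-clock Kepler (assumed 1024 cores at 1.1 GHz).
kepler = tflops(1024, 1.1)       # ~2.25 TFLOPS -- "clearly over 2"

print(f"Fermi: {fermi:.2f} TFLOPS, Kepler (assumed): {kepler:.2f} TFLOPS")
```

The point of the comparison: dropping the 2x hotclock halves per-core throughput at a given base clock, so a unified-clock design has to make it up with more cores and a higher overall clock.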

18 Comments on NVIDIA Kepler To Do Away with Hotclocks

#1
Damn_Smooth
I wish these weren't so far out; it would be nice if they were available soon.

Damn bta, you're on a roll. :toast:
#2
ViperXTR
Same clocks? So it will now run on just two clocks (core and memory), with no option to tune the third clock (just like the Radeon HD series)?
#3
DarkOCean
So the shader clock = core clock, just like on AMD cards? Are they making the same thing now, but with more cores, or what?
#4
BrooksyX
I hope this comes out soon so it gives AMD some competition and drives prices down.
#5
CrAsHnBuRnXp
Before the release of the 8xxx series of cards by NVIDIA, there were no separate shader clocks. It was always just core and memory.
#6
ViperXTR
The GeForce 7 had internal variable clocks though, and they can be exposed in some GPU utilities (particularly RivaTuner with advanced tweaking). But yes, the G80 era ushered in the separation of the rasterizing engine from the shader engine.
#7
CrAsHnBuRnXp
by: ViperXTR
The GeForce 7 had internal variable clocks though, and they can be exposed in some GPU utilities (particularly RivaTuner with advanced tweaking). But yes, the G80 era ushered in the separation of the rasterizing engine from the shader engine.
I remember unlocking the 6800GS AGP back in the day. They went by pipelines back then. :laugh:
#8
ViperXTR
Ah yes, pipeline unlocking was the talk back then, but there was a setting on the GeForce 7 series cards for more than just that (monitoring multiple clock domains). I forgot where I found it though; it's been so long.
#9
CrAsHnBuRnXp
Back before I got the 6800GS on AGP, I had an FX5200, and it SUCKED so hardcore. I didn't have the money to upgrade my machine to a PCI-E board back then. I want to say it was in January or February that I remember them announcing a 6800GS for AGP, and I managed to get the money. I was so stoked. It was the best thing you could get your hands on for AGP. Then out came the news about unlocking the card with RivaTuner. Those were the days.

Here's my specs from back in the day:
PSU: Sunbeam 450 Watt (Came with Case)
Case: Sunbeam Transformer Full Tower (Black) w/ Blue LED's
Motherboard: Gigabyte GA-K8NSC-939 F8 BIOS
Processor: AMD Athlon64 3200+ @2.6GHz
Memory: OCZ 2GB Kit DDR400 PC3200 Gold Gamer eXtreme XTC Edition Dual Channel Memory @ 2-3-3-8
Hard Drive: 3 Maxtor Drives; 1x40GB, 1x30GB, 1x300GB SATA
Video Card: BFG nVidia GeForce 6800GS OC (412/1.12) w/ 16 pipes (unlocked)
Monitor: SAMSUNG 19 inch SyncMaster 930B
Sound Card: SoundBlaster Live! 24-bit
Speakers/Headphones: unknown/Phillips SBC HP250
Keyboard: Logitech Cordless EX110
Mouse: Logitech Cordless EX110
Mouse Surface: Mouse Pad
Operating System: Microsoft Windows XP Professional w/SP2
#10
ViperXTR
I had a 5900XT and a 6600GT back then. :D Had to flash my 6600GT to enable temp monitoring.

I wonder if Kepler will still allow domain clock monitoring (and maybe even OC'ing different domains).
#11
arnoo1
And there goes NVIDIA's performance; a low shader clock can never be good. I'd rather see a lower core clock and a higher shader clock to reduce the TDP. Damn you, NVIDIA. :p
If this is true, I guess we'll have to wait and see how it performs.
#12
Jon A. Silvers
They'll come close, ±10%. They merged the core and shader clocks into one, same architecture...
In the end, the worst scenario for AMD would be a comparison like the 6xxx vs. 5xx: just a slightly bigger step in performance gain than the current generation made over the previous one.
#13
thebluebumblebee
So, F@H performance is going to be about the same as the current generation? Twice as many shaders running at half the speed. We'll have to see how well AMD's APP Acceleration on the 7xxx series works.
#14
MxPhenom 216
Corsair Fanboy
by: Jon A. Silvers
They'll come close, ±10%. They merged the core and shader clocks into one, same architecture...
In the end, the worst scenario for AMD would be a comparison like the 6xxx vs. 5xx: just a slightly bigger step in performance gain than the current generation made over the previous one.
What? Did you even read? At the end of the OP it says that Kepler will have significant architectural changes. It won't be exactly the same as Fermi.
#15
HalfAHertz
Hm, I don't really see this as a good thing...
#16
Jon A. Silvers
by: MxPhenom 216
What? Did you even read? At the end of the OP it says that Kepler will have significant architectural changes. It won't be exactly the same as Fermi.
It will be more efficient: less power consumption, better compute power and tessellation... but it won't have a great impact on raw performance per core per clock at the same shader count. They can't make it 1.5x faster at the same clock and shader count with just an architecture change; maybe 1.1x, but that's it. Still, they will make a very interesting product.
Sadly, they won't release it by the time D3 and MassEff3 arrive, and then I'll have to buy a GTX 570 or a Radeon 6970, 78xx, or, if budget allows, a 7950.
#17
mediasorcerer
Good one. There's no doubt it will be a fine series. Thank God for AMD & NVIDIA; without either one, our choices would be so much more limited and prices so much higher.
#18
de.das.dude
Pro Indian Modder
NVIDIA has chosen to adopt AMD's design!

AMD FTW!