
ATI's Stream enhances Folding@Home performance by over 3000%.

zekrahminator

A while ago, NVIDIA released Gelato, its entry into the idea of stream computing. Stream computing is a way to run CPU-heavy applications on a GPU, which is theoretically a great fit given the many shader and vertex processors a GPU carries. While NVIDIA's Gelato was an early proponent of the concept, ATI seems to have nearly perfected it. Their version of stream computing can increase a program's performance to between 10 and 40 times its original level, effectively making a single graphics card perform like a rack of servers. Of course, this only holds if a program is coded to utilize the extra processing power. If it isn't, the program will spend too much time communicating between CPU and GPU, very little work will get done, and performance will actually drop.
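(A note on what "coded to utilize the extra processing power" looks like in practice: the usual pattern is to express the hot loop as a small kernel that runs once per data element across all of the GPU's processors. The sketch below is purely illustrative and hypothetical, written in NVIDIA's CUDA C since ATI's Stream interface isn't shown in the press material; the structure is analogous either way. The speedup comes from the massively parallel kernel launch, and the two explicit CPU-to-GPU copies are exactly the communication overhead described above.)

[CODE]
// Hypothetical sketch of the GPGPU pattern described above, in NVIDIA CUDA C
// for illustration only (ATI's Stream exposes its own interface, but the
// shape of the code is analogous). The gain comes from running the
// per-element kernel on hundreds of shader processors at once; the two
// cudaMemcpy calls are the CPU<->GPU communication that can eat the gains
// if the kernel does too little work per transfer.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread independently handles one element of the "stream".
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;              // ~1M elements (made-up workload)
    float *host = new float[n];
    for (int i = 0; i < n; ++i)
        host[i] = 1.0f;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice); // CPU -> GPU

    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n); // thousands of threads at once

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost); // GPU -> CPU
    std::printf("first element: %f\n", host[0]);

    cudaFree(dev);
    delete[] host;
    return 0;
}
[/CODE]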


Pictured from left to right: Chas Boyd of Microsoft, Jeff Yates of Havok, Vijay Pande of Stanford University, Michael Mullaney of PeakStream, and Dave Orton of ATI.

Stanford University, known for creating Folding@home, reports that their program is fully compatible with ATI's Stream, and that the performance benefits are very real. PeakStream makes middleware, and its products fully support Stream; because it is middleware, programmers can build their applications on PeakStream's APIs, which will hence be fully compatible with stream computing. Microsoft says that some components of Vista, like Aero, already use some elements of stream technology. And Havok was there to confirm physics on a GPU.

 
:cool: I know I'm anxiously awaiting the release of the F@H GPU client, and have been for many months now.
 
This sounds a lot like the "merged AMD CPU & ATI GPU" thread we had a week ago. And again, I'm saying 'damn, that sounds a hell of a lot like the Cell processor...'
 
I don't like this!

If the GPU is being used to run all of the programs in the background, its performance in games will be far lower. So you would need an extra GPU to work as a CPU and an extra GPU to work as a GPU.

And what does this mean for CPUs if GPUs are so much better? Why do people continue to make CPUs if GPUs can do their work 40 times better? And why do CPUs suck so much? Can Intel/AMD learn from NVIDIA/ATI and enhance their CPUs to be as good as a GPU?
 

Maybe this is what AMD was thinking when they set out to acquire ATI: turn the useful bits of a GPU into a coprocessor within their own CPU?
 

I doubt it, but that's a good idea.

This issue is troubling, though. I mean, if GPUs are better than CPUs, are GPUs the next generation of CPUs? And if so, why are people still making CPUs?
It seems like people are just now finding out that the GPU, which has existed for so long, is actually more powerful than the CPU.
 
that...is awesomely cool. now if only games programmers could take heed of this and re-familiarise themselves with optimised code *sigh*
 
If the GPU is being used to run all of the programs in the background, its performance in games will be far lower. So you would need an extra GPU to work as a CPU and an extra GPU to work as a GPU.

I think you misunderstand...Stream tech isn't a hardware-level thing, it's a type of programming. Your performance in games is quite safe, I assure you :).
 
Are there any beta versions of F@H that can utilize the GPUs yet? I haven't seen anything about it yet, but I live in hope :p

Very interesting stuff, though. This would also make way for some more benchmarks, so that GPUs can be compared fairly, i.e. those without SM3.0 support could be tested against the newer cards with SM3.0.
 
I think you misunderstand...Stream tech isn't a hardware-level thing, it's a type of programming. Your performance in games is quite safe, I assure you :).

That's exactly right, too, & it ISN'T NEW - I have way WAY old manuals on UNIX System V STREAMS programming sitting here on my bookshelf, in fact...

2 ways that I know of to open files are in BLOCK reads/writes, or STREAMS...

Streams are, iirc, what allow us to HAVE "streaming media", like .mpg files (vs. .AVI type, for instance/example):

Streamable data lets you have only PART of a file & still be able to play it, AND over the "wire" (internet) too.

Each portion of the TOTAL/COMPLETE file is like a tiny file unto itself (containing its own header & layout)... not demanding that you have the file from its initial byte up to its "trailer record" (the signal of EOF (end-of-file)).
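(To make the BLOCK-vs-STREAM distinction concrete, here is a minimal, hypothetical sketch in the same C++ dialect as the kernel example above; the file name and the playChunk() stub are made up. The point is simply that a streaming reader consumes each chunk as it arrives rather than waiting for the whole file:)

[CODE]
// Minimal sketch (hypothetical file name & stub decoder) of the difference
// described above: a streaming reader consumes each self-contained chunk as
// it arrives, instead of demanding the whole file up to its EOF trailer.
#include <cstdio>

static void playChunk(const char *buf, size_t len)
{
    // stand-in for decoding/playing one self-describing piece of the media
    (void)buf; (void)len;
}

int main()
{
    FILE *f = std::fopen("movie.mpg", "rb"); // could just as well be a socket
    if (!f)
        return 1;

    char chunk[4096];
    size_t got;
    while ((got = std::fread(chunk, 1, sizeof chunk, f)) > 0)
        playChunk(chunk, got); // no need to have the rest of the file yet

    std::fclose(f);
    return 0;
}
[/CODE]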

APK
 
I mean, if GPUs are better than CPUs, are GPUs the next generation of CPUs? And if so, why are people still making CPUs? It seems like people are just now finding out that the GPU, which has existed for so long, is actually more powerful than the CPU.

Well, CPUs are a "general purpose" mechanism, with a FULLY documented & supported API that programmers can thus utilize (because it's documented)...

GPUs are more like an ASIC (application-specific integrated circuit, specific to 3D video work; & since they are dedicated to that type of work, or rather WERE SOLELY DEDICATED TO IT in the past, they will be FASTER @ some things than CPUs are) afaik...

(& apparently, prior to this ATI development, they didn't have a generally available API for 'general purpose calculation usage' from the programmatic end, if that excerpt leads us to any conclusions here.)

* Now, it appears they do, via ATI supplying the API & quite possibly documented examples thereof...

APK

P.S.=> You know who probably knows a heck of a lot about this? W1zzard! He may have to correct me on some of the points I brought up earlier, but that's ok - it's NOT my "forte/area of specialization" in this field, things @ this end of it @ least, & I, like anyone else, can always stand to gain some knowledge... apk
 
I mean, if GPUs are better than CPUs, are GPUs the next generation of CPUs? And if so, why are people still making CPUs?

GPUs are better than CPUs at this kind of work; however, they are more expensive to make. Plus, all the chip makers would have to admit that their CPUs aren't good enough and start over with a new design. Nobody wants to do that, and Intel doesn't even want to risk tweaking its own manufacturing process to see what works and what doesn't. Buying ATI allows AMD to blend the technology, so to speak, and avoid totally starting over.

Again, program code has to be written to take advantage of it.
 
So this program only works for the X1800 and X1900 series, correct? My X800 isn't warming up at all, so I assume it's not doing anything.
 
No, stream tech has yet to be implemented in F@H...I think.
 
Okay, so the item in the download list is just letting everybody know of the press release. Looking at the link, it sends me to version 5.02, so I guess it's just a placeholder.
 
GPUs are better than CPUs at this kind of work; however, they are more expensive to make.

ATI's R600 to have over 500 million transistors

Yet another example: S939 AMD CPUs have only 114 million, and the Core 2 Duo has 291 million.
 
I think you misunderstand...Stream tech isn't a hardware-level thing, it's a type of programming. Your performance in games is quite safe, I assure you :).

If you are using the GPU for programs, and not just rendering, then the rendering in games will not be as fast/good, because a certain % of the GPU would be going towards the programs and only a smaller % would go towards rendering in games.

It doesn't matter if it's software or not - if it's using the GPU for things other than rendering, then the rendering performance will be lower.
 
I don't think this will be an in-game process, but probably something like extra processing power during 2D apps.
 