jamsbong
New Member
- Joined: Mar 17, 2010
- Messages: 83 (0.02/day)
System Name | 2500Kjamsbong |
---|---|
Processor | Core i5 2500K @ 4.6 GHz |
Motherboard | Asrock Extreme 4 Z68 |
Cooling | Zalman Reserator (CPU and GPU) |
Memory | DDR3 8GB |
Video Card(s) | EVGA Nvidia 560Ti 1GB |
Storage | 60GB Kingston SSD |
Display(s) | 24" Dell IPS |
Case | CoolerMaster 690 Advanced II |
Audio Device(s) | on-board |
Power Supply | Zalman ZM-600HP modular 600 W |
Software | Windows 7 |
Hi Everyone,
I thought I'd share these interesting findings about GPU-accelerated video encoding. They may not be that interesting if you've already done some tweaking in the past.
I used the trial version of ArcSoft's MediaConverter 7 for the tests. To enable GPU encoding, you have to turn on hardware encoding in the ATI driver menu. I used ATI Tray Tools to watch GPU usage and Task Manager for CPU usage.
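For anyone who'd rather not babysit Task Manager, here's a rough sketch of logging CPU load from a script instead. This isn't part of the test above and it assumes the third-party psutil package; GPU utilisation still has to come from a vendor tool like ATI Tray Tools.

```python
# Minimal sketch (not part of the original test): log overall CPU load
# once a second while an encode is running. Assumes the third-party
# psutil package is installed; GPU utilisation still needs a vendor
# tool such as ATI Tray Tools.
import time
import psutil

def log_cpu_usage(duration_s=60, interval_s=1.0):
    """Print average CPU utilisation once per interval."""
    end = time.time() + duration_s
    while time.time() < end:
        # cpu_percent() blocks for interval_s and returns the average load
        print(f"CPU: {psutil.cpu_percent(interval=interval_s):5.1f}%")

if __name__ == "__main__":
    log_cpu_usage(duration_s=30)
```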
I found that most of the time the CPU is doing all the hard work. My quad-core is flexing its muscles even when GPU encoding is enabled. The most the GPU needed was about 20% (at an underclocked 240 MHz).
The software only uses the GPU when converting to the .mp4 format; I did not manage to find any other format that powers up the GPU. Another giveaway is the quality: you can tell when the GPU converted the file because the quality can be quite bad.
I don't have an Nvidia card yet, so I can't really tell whether CUDA enables GPU encoding for more file formats.
As for CPU and GPU utilisation, higher-resolution conversions seem to maximise the hardware workload, which to me is more efficient. It's a bit like playing a game at high resolution to make the GPU work hard. At low DVD resolution, CPU usage was about 50% and GPU usage was nil; at HD resolution, the CPU hit 90% and the GPU up to 20%.
As for quality: with CPU-only conversion, the quality at low bitrate is SO much better than with the GPU, and it is what one would expect. A better way to put it is that it's an efficient compromise in quality. With GPU+CPU, once you reduce the bitrate, the picture suddenly becomes pixelated and very ugly. At the recommended high bitrate, the GPU and CPU-only conversions had similar quality. This means GPU encoding has a huge drop-off in quality when the bitrate is restricted.
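If you want to reproduce a similar low-bitrate comparison yourself, something like the sketch below would do it. I'm assuming ffmpeg here (libx264 for the CPU encode, h264_nvenc for the GPU encode), which is not the ArcSoft tool used above, so take the encoder names as assumptions for your own hardware.

```python
# Hypothetical reproduction of the low-bitrate quality test with ffmpeg
# (not the ArcSoft workflow described above). Encode the same clip at the
# same constrained bitrate with a CPU encoder and a GPU encoder, then
# compare the two outputs by eye. Assumes ffmpeg is on the PATH and the
# build includes h264_nvenc (use h264_amf on AMD/ATI hardware).
import subprocess

SOURCE = "clip.mp4"   # hypothetical test clip
BITRATE = "1M"        # deliberately low, to expose the quality drop-off

# CPU-only encode (software x264)
subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                "-c:v", "libx264", "-b:v", BITRATE, "cpu_out.mp4"], check=True)

# GPU-accelerated encode
subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                "-c:v", "h264_nvenc", "-b:v", BITRATE, "gpu_out.mp4"], check=True)
```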
As for acceleration, there seems to be only a modest speed improvement. I was testing a 47-second video; I will try a 1-hour video to see whether GPU encoding gives a significant speed-up. At the moment, the speeds are almost identical.
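A simple way to time such a comparison on a longer clip, again assuming ffmpeg rather than ArcSoft, would be something like this:

```python
# Sketch for timing CPU-only vs GPU-assisted encodes of a longer clip.
# Assumes ffmpeg with libx264 and h264_nvenc, as in the previous sketch.
import subprocess
import time

def timed_encode(args):
    """Run an encode command and return the elapsed wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(args, check=True)
    return time.perf_counter() - start

cpu_s = timed_encode(["ffmpeg", "-y", "-i", "long_clip.mp4",
                      "-c:v", "libx264", "-b:v", "4M", "cpu_long.mp4"])
gpu_s = timed_encode(["ffmpeg", "-y", "-i", "long_clip.mp4",
                      "-c:v", "h264_nvenc", "-b:v", "4M", "gpu_long.mp4"])
print(f"CPU-only: {cpu_s:.1f} s, GPU-assisted: {gpu_s:.1f} s")
```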
I hope this has been insightful.
Cheers
James