HD video decoding
Assistance in decoding HD video from Blu-ray discs is also very important for modern integrated chipsets: our tests a year ago showed that with chipset support alone, only a very powerful processor could cope with this task. Besides, purely software decoders famous for their low CPU requirements, such as CoreAVC, tend to simplify the picture (according to many active users of HD content). That's natural: there is no such thing as a free lunch. And an expensive processor is a poor match for an inexpensive multimedia computer, especially a compact media center.
Fortunately, manufacturers are expanding the features of their graphics processors beyond traditional 3D rendering to full support for all video decoding stages. Judging by the information we have, this problem has already been solved: even cheap cards (for example, Radeon HD 2400 Pro), which are a waste of money for a gaming PC, can radically offload a CPU in video decoding tasks. It's all the more interesting to see how chipsets perform here, since they too are now advertised as able to play Blu-rays, that is, to decode high-bitrate HD video in all popular formats: VC1, H.264, and MPEG2 HD.
But first we should say a few words about settings. There is an opinion that enabling hardware-assisted video decoding is not a trivial task: it's not enough to install the drivers and a player, you may need to experiment with programs and settings that are not always accessible to common users. In actual fact, most problems are caused not by original discs and their images, but by so-called rips (images re-encoded to make them smaller). However, no one promised to support those. Hardware assistance is not even necessary when HD video is converted into MPEG4 XviD/DivX, that is, with heavy losses (such recordings are easy to decode even for slow processors). But high-quality rips preserve the H.264/VC1 format, so hardware assistance is welcome there. If the drivers are written correctly and a given rip is not totally broken, it can also be decoded in hardware, because the decoding algorithms are the same.
Judging by posts in related forums, AMD programmers provide better compatibility with such rips, at least when we speak about decoding by discrete graphics cards (it's a positive side effect rather than a dedicated optimization). But that's another story: you cannot just take a few rips and draw conclusions about why some of them are supported while others are not. A large selection of files would be required, and the results would be interesting only to users of the HD tracker that offers such files.
Our task now is different: to find out how efficient hardware-assisted video decoding is in the given chipsets, using recordings in various formats as examples. All these files have passed the hardware-assisted decoding test via standard DXVA with the Radeon HD 3870 card. That's why we use "remuxes" (Blu-ray/HD-DVD images on a hard drive without additional compression, that is, with the original bitrate). It must be noted that HD pirates prefer this format for discs with an initially low bitrate, below 20 Mbps (otherwise the resulting file size would be too big). Thus, such files will show us only the relative performance differences between the chipsets; absolute CPU load during playback of higher-bitrate discs will be higher.
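That 20 Mbps threshold is easy to check with back-of-the-envelope arithmetic. Here is a quick sketch (our own illustration, not taken from the original tests):

```python
def remux_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    """Approximate remux size: stream bitrate (Mbit/s) times running time."""
    total_bits = bitrate_mbps * 1e6 * duration_min * 60
    return total_bits / 8 / 1e9  # bits -> bytes -> decimal gigabytes

# A typical 2-hour movie remuxed at 20 Mbps already occupies about 18 GB:
print(remux_size_gb(20, 120))  # 18.0
```

At 28-30 Mbps the same movie would approach 27 GB, which explains why such discs are rarely remuxed in full.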
In practice, if a recording is valid and the GPU can decode the given format, there are no problems with enabling hardware-assisted video decoding. Moreover, we had a funny situation with AMD 780G, VC1 video, and the CyberLink decoder: we failed to disable hardware-assisted decoding (for the sake of comparison). Deselecting the Use DXVA check box and other changes did not increase the CPU load, and the player reported that hardware acceleration was enabled. The reverse is also true: if this support exists only on paper, no combination of options will revive the hardware decoder.
However, we can formulate some instructions on how to achieve optimal results. First of all, you should update the BIOS and use the latest video driver, because the second source of complaints about hardware decoding problems is that HD decoding features of graphics cards (and chipsets) are "polished on the fly". For the same reason, you should choose the most popular operating system for a media center on a new chipset or graphics card (more exactly, popular from the point of view of driver developers). For the chipsets reviewed today, it's the 32-bit version of Windows Vista. The second most popular system is Windows XP 32-bit (corresponding updates find their way into its drivers a tad later, but we tested video decoding on AMD 780G in XP and obtained the same results as in Vista). In case of 64-bit systems, you may have to wait for the drivers to become optimized to the same level. Here are step-by-step directions:
- In BIOS, set UMA size to 256 MB. We found this tip in a manual for one motherboard on AMD 780G, but to all appearances it's valid for all chipsets.
- Install PowerDVD 7.3. That may be enough for owners of Blu-ray drives.
- Install Matroska Splitter - it's an unpacker of MKV/TS containers.
- Install Media Player Classic (Home Cinema).
- Start the player, deselect the MPEG PS/TS/PVA check box in the internal filters group.
- In the same menu, add H.264 and VC1 decoders from CyberLink as external filters (choose from the list).
In our tests we registered the average CPU load generated by a player (CPU time of the player process during playback of a test fragment) and fluctuations of the total CPU load during video playback (no additional background processes were started, of course). Cool'n'Quiet was disabled.
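The per-process metric described above reduces to the CPU seconds consumed by the player divided by the wall-clock length of the fragment. A minimal sketch of that calculation (our own illustration with made-up figures, not the utility actually used in the tests):

```python
def avg_cpu_load(cpu_s_start: float, cpu_s_end: float,
                 wall_s_start: float, wall_s_end: float) -> float:
    """Average CPU load (%) of a process over a playback fragment:
    CPU seconds it consumed per wall-clock second elapsed."""
    return 100.0 * (cpu_s_end - cpu_s_start) / (wall_s_end - wall_s_start)

# e.g. a player process that burned 4.5 CPU seconds during a 30-second fragment:
print(avg_cpu_load(0.0, 4.5, 0.0, 30.0))  # 15.0
```

The same figure computed with and without the Use DXVA option makes it obvious whether hardware decoding is really active: with working DXVA the player's share barely moves.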
- MPEG2, 1080p, 24 fps, 16 Mbps
- MPEG2, 1080i, 30 fps, 13 Mbps
- H.264, 1080p, 24 fps, 17 Mbps
- VC1, 1080p, 24 fps, 17 Mbps
AMD 780G performs practically on a par with graphics cards based on the Radeon HD 3400 (which is apparently where its UVD came from). That is, our dual-core processor is almost completely offloaded, and playback is smooth (confirmed by playback statistics: no dropped frames). However, HDCP encryption may be used if a Blu-ray disc is played back via DVI or HDMI, which may significantly increase the load. Besides, we've already mentioned that video files may have higher bitrates, up to 28-30 Mbps. But it's crystal clear that we still have a big performance reserve here: updating and indexing an antivirus database had no effect on video playback.
As for the competitors, they do not look as good in this test. However, GeForce 8200 managed to play all our test files smoothly. It lost only five frames in the MPEG2 1080i file, and the loss was difficult to notice: there were no visible drops in frame rate. G35 also lost three frames in this video file, although the CPU load was relatively low, so we have no doubt that this chipset helps the CPU decode this format. But we failed to enable hardware acceleration of H.264/VC1 decoding on G35, and playback was jerky.
We noticed no static image differences (on screenshots), except for brightness, contrast, and saturation (which can be easily adjusted in ATI Avivo/NV PureVideo options). We also have no gripes with the deinterlacing quality.
The verdict is clear. Further tests of the AMD chipset will be interesting only if we increase their complexity: test dual-threaded decoding, install a weaker CPU or reduce its clock rate, and try to decode not only remuxes but also hard rips and original Blu-rays (to be continued). On the contrary, we do not want to test the competing chipsets any further, because NVIDIA apparently lags behind in drivers (each new version changes something in HD decoding). Perhaps it's also because NVIDIA trails AMD here, as in the case of discrete cards. The gap is small, but it's at least half a step. And Intel is more than a step behind both competitors. The company does not even promise to add a dedicated decoding unit to its chipsets in the near future, and it will be very difficult to catch up in alternative ways (raising chipset clock rates), which will certainly cost more in power consumption.