The GeForce 7050 fares better than the GeForce 6150 in Unreal Tournament 2004 in all modes.
At first glance, the conclusion from our gaming results seems obvious: NVIDIA has indeed raised the performance bar for integrated graphics on the AMD platform. Yet, judging by the DOOM 3 and FarCry results, it is still far from the level of even entry-level discrete cards such as the GeForce 7300 LE.
But that is a shallow conclusion. The point is that $50 for a discrete graphics card is a noticeable addition to the price of a motherboard based on these chipsets ($80-100). Is it really worth it, if it only "helps" in DOOM 3 (among our test games) and probably in a few other fast-paced games a generation or two old? In my experience, if you want to play modern, demanding action games, you need a graphics card costing at least $100-$150 (and you will still have to make serious compromises in the settings). Integrated graphics, on the other hand, is enough for games that do not demand much from the GPU, both old and new. Fortunately, the solutions from both AMD and NVIDIA render 3D scenes correctly and offer full DirectX 9.0 support.
As for a direct comparison of the integrated chipsets, the GeForce 7050 should be considered the most "playable" solution when it comes to fast-paced action games that require 30-40 fps; resolution and graphics settings can even be lowered to raise the framerate in such games. The AMD 690G offers similar or even better support for slower-paced games at high resolutions (adventure games, RTS, etc.) that also use DirectX 9.0 features.
The video decoding tests had a surprise in store for us: the GeForce 7050 and the GeForce 6150 SE (6100) demonstrated practically identical results, to within measurement error, differences in driver versions, and so on. Perhaps this is the effect of their identical clock speeds. So we decided to run additional tests with Cool'n'Quiet enabled, in order to see how decoding behaves when less CPU headroom is available. We also included the above-mentioned MSI GeForce 7300 LE in these tests. One more series of tests was run with hardware acceleration disabled (we simply removed the video driver).
Decoding standard-definition MPEG2 is not a difficult task even for a CPU without hardware support from the chipset, at least as long as you are not running another demanding task in the background while watching. If you are, hardware support in either chipset copes with it just as well as a discrete graphics card, and CPU load drops to a few percent. When we enabled Cool'n'Quiet in this test, the clock speed dropped to its minimum (1 GHz, 5x multiplier) and the CPU load grew in strict proportion.
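That proportionality is what you would expect: software decoding consumes a roughly constant number of CPU cycles per second of video, so the utilisation fraction scales inversely with clock speed. A minimal back-of-the-envelope sketch, assuming for illustration a nominal clock of 2 GHz (the actual nominal clock of the test CPU is not stated in this section):

```latex
% Sketch: CPU load vs. clock speed during software decoding.
% Assumption: the decode workload W (CPU cycles per second of video) is constant,
% so load = W / f. With an illustrative nominal clock f_nom = 2 GHz and the
% Cool'n'Quiet minimum f_min = 1 GHz, the load simply doubles:
\[
  \frac{\mathrm{load}_{\min}}{\mathrm{load}_{\mathrm{nom}}}
  = \frac{W/f_{\min}}{W/f_{\mathrm{nom}}}
  = \frac{f_{\mathrm{nom}}}{f_{\min}}
  = \frac{2\ \mathrm{GHz}}{1\ \mathrm{GHz}} = 2.
\]
```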
We used a CyberLink decoder in this test with a high-definition (1080i) MPEG2 video. It is our standard test fragment and we have had no problems with it before, but this time the GeForce 7050 played it with some jerkiness despite the minimal CPU load. Switching to VLC (an omnivorous player known for much better optimization) made playback somewhat smoother. We decided to attribute the problem to immature NVIDIA drivers and exclude this test from our analysis.
Instead, we added a video fragment in MPEG4 (DivX) format with a low bitrate (0.72 MB/s). Its much stronger compression puts a heavier load on the decoder, although the difference turned out to be small as far as software decoding is concerned, and the hardware decoders are noticeably less well optimized for this format. Note the result of the AMD 690G: it noticeably outperformed even the discrete graphics card.
WMV recordings, at least those encoded with the Microsoft codec, are a showcase for hardware-assisted decoding. We do not know whether this is a peculiarity of the format or whether the player is simply not well optimized, but it is the only format that proved too much for our processor: playback would stall from time to time in both cases. The following graph illustrates this phenomenon for a 720p video fragment (the 50% average load is far from critical in itself).