
Counter Counter Strike...
NVIDIA's Answer to ATI's Counter-Offensive -
NVIDIA GeForce 6800 GS 256MB,
NVIDIA GeForce 6600 DDR2 256MB








TABLE OF CONTENTS

  1. Introduction
  2. Video cards' features
  3. Testbed configurations, benchmarks
  4. Test results
  5. Conclusions



Not long ago we were enjoying summer, watching NVIDIA launch new cards, attending a splendid party in Paris... Then the 7800 GT appeared... Now it's the season of Christmas sales, and we are expecting snow and frost. Time flies, especially in autumn: wound up like a spring, summer passes in an instant and takes autumn with it, the energy runs out after New Year's Eve, and the time before spring drags on very slowly...

The bright red-and-white whirl of autumn announcements from ATI Technologies is over. The leaves have fallen, a month has passed, and only some representatives of the X1xxx series have found their way to the shelves. Strange as it may seem, the low-end solutions are nowhere to be found, even though it's high time for all the products except the X1600 XT to be on sale in volume. They must be! The Canadian specialists obviously see it differently. Alas, the October 5 launch turned out to be yet another paper announcement.

Words are faster than deeds; it's a well-known fact... Santa Claus will come soon and prompt people to spend large sums of money. But what will the red-and-white camp have to offer? Good dreams and New Year's wishes.

Strange as it may seem, though, NVIDIA assumes that ATI will have something to sell, and so it's getting ready for a new counter-offensive. A counter-counter-offensive? It's actually a fresh attack.

Californian specialists are evidently concerned that, after long years of severe competition, the Canadian company has suddenly dropped out of the race, weakly tossing out the X800 GT and X800 GTO without any hope of hurting the competitor... Most importantly, ATI got rid of the R480, as the 100 million in losses might have been followed by further losses on disposing of the leftovers. For now there is still some demand, and the company can still bite with its old-new solutions... To say nothing of CrossFire, which hasn't actually reached the mass market. While ATI employees told stories about their SLI rival with enthusiasm and anticipation in late May, they now speak of "low market demand", etc. Alas, it's hard to implement what was so easily declared...

Even after the failure of CrossFire (yes, we all understand that it's a technology for the future and that its potential is not realized in the current solutions; but even a schoolboy understands that ITS SUCCESS depends on HOW IT'S IMPLEMENTED and HOW FAST IT APPEARS on the market; a long time has elapsed, and there are still no motherboards based on the RD480 (rare samples don't count), the CrossFire Edition video cards are lost somewhere - perhaps ATI engineers decided to keep them in Toronto - and we have heard nothing about similar products based on the X1800; this is hardly a success, and the MAXX spirit, which forgives no mistakes, seems to be still avenging the past), to make yet another paper announcement, to show the cards, and once again fail to deliver them to mass users is a shame. Especially considering NVIDIA's bright example this summer, when it delivered hundreds of thousands of freshly announced cards.

I share NVIDIA's anxiety. When only two companies are left on the battlefield (Matrox has abandoned this market, S3 makes only reference cards that no one wants to manufacture on a large scale, and XGI plays the f... that is, it's constantly updating its roadmaps), such a passive position from the Canadian company is scary.

Even if it's not passivity but rather a pile of mistakes accumulated over the past year, that still does not lift their responsibility. Only foolish and small-minded employees of NVIDIA or ATI (the lower the position, the less the brains are used) may gloat and wish death to their competitor; they don't understand how deep they themselves would sink afterwards! If Intel learned that AMD was dying, Intel would do everything possible to help it, because Intel itself would end up in a bad situation otherwise. Microsoft has experienced the "beauty" of such a situation to the full.

That's why I hope the Canadian company will be OK, that its products will come out, and that their prices will not be too high (though it's useless to explain the situation to our retailers: they will try to squeeze a 100% margin out of a poor new card anyway and then get stuck with it, unable to sell it even at the initial price, because the X1800 XL will not sell for $650 - I have already seen this price in a Moscow shop).

Today we are going to review NVIDIA's answer to the ATI RADEON X1600 XT and X1300 PRO. Considering that NVIDIA introduced full DirectX 9.0c support with SM3.0 a year and a half ago, there is no need to explain how easy it is for the company to launch a product that will kill ATI's new cards: an overclocked 6600 GT, or even a 6800, will make the X1600 XT feel bad, and an intermediate product between the 6600 and the 6600 GT (there is a huge frequency gap between them) will make the X1300 PRO stagger.

That's exactly what NVIDIA has done. It has an excellent chip, the NV42: manufactured with 0.11-micron technology, cheap (relative to the cut-down NV40 or NV45), and supporting a 256-bit bus. In fact, NVIDIA's 0.11-micron products are very good; their huge overclocking potential speaks for itself (unlike ATI's R430). While high frequencies were undesirable, so as not to interfere with NV45 sales, all NV42-based GeForce 6800 cards operated at a 350 MHz core clock. We found out that the X1600 XT sometimes outperformed such cards thanks to its huge 590 MHz core clock.

It's elementary to find a sufficient number of NV42 chips that can run at 400 or even 450 MHz. NVIDIA's experts struck a compromise between price and performance and set the core clock to 425 MHz. They called the resulting processor GeForce 6800 GS. Why GS? Probably out of weariness in the search for a new suffix: GT and GTS were already taken, while GS was still vacant. As is well known, the GeForce 6800 was equipped with DDR memory running at 700 MHz, and a 425 MHz core would be gripped in the vice of such a narrow memory bandwidth. That is why a Solomonic decision was made: to equip the 6800 GS with 2.0 ns GDDR3 memory - there are plenty of such modules on the market, and their price has dropped a lot. Besides, the card is based on the 7800 GT PCB, which is cheaper than the 6800 GT one (the 6800 PCB is evidently not compatible).

Such a product gets a recommended price of $249, identical to the price of the X1600 XT 256MB. Considering that the GeForce 6800 can already be found for less than $200, even at $190, the 6800 GS price may well shift depending on the X1600 XT. Of course, the Canadian company is in a better position in one respect: its product has a 128-bit bus and thus a much cheaper PCB. But it's also equipped with still-expensive memory (running at 690 (1380) MHz!), so ATI cannot drop the price very much either. It'll be interesting to watch this standoff.

As for the X1300 PRO, with its mere 4 pipelines and huge 600 MHz core clock, it is easily countered by a 6600 with 8 pipelines and increased core and memory frequencies. In this case NVIDIA has chosen GDDR2 memory, which is very cheap, even cheaper than DDR. The GeForce 6600 DDR2 comes at $120.

So, let's examine the new cards from NVIDIA and see what they can really do. Will they make a strong counter-offensive against the products of the Canadian company?

Video Cards



NVIDIA GeForce 6800 GS 256MB
Interface: PCI-Express x16

Frequencies, chip/memory physical (memory effective): 425/500 (1000) MHz; nominal: 425/500 (1000) MHz

Memory bus width: 256bit

Number of vertex pipelines: 5

Number of pixel pipelines: 12

Dimensions: 200x100x15mm (the last figure is the maximum thickness of a video card).

PCB color: green.

Output connectors: D-Sub, DVI, S-Video.

VIVO: not available

TV-out: integrated into GPU.




NVIDIA GeForce 6600 DDR2 256MB
Interface: PCI-Express x16

Frequencies, chip/memory physical (memory effective): 350/400 (800) MHz; nominal: 350/400 (800) MHz

Memory bus width: 128bit

Number of vertex pipelines: 3

Number of pixel pipelines: 8

Dimensions: 190x100x15mm (the last figure is the maximum thickness of a video card).

PCB color: dark blue.

Output connectors: D-Sub, DVI, S-Video.

VIVO: not available

TV-out: integrated into GPU.






NVIDIA GeForce 6800 GS 256MB
The video card carries 256 MB of GDDR3 SDRAM in eight chips on the front side of the PCB.

Samsung memory chips. 2.0ns memory access time, which corresponds to 500 (1000) MHz.




NVIDIA GeForce 6600 DDR2 256MB
The video card carries 256 MB of GDDR2 SDRAM in eight chips on the front side of the PCB.

Infineon memory chips. 2.5 ns memory access time, which corresponds to 400 (800) MHz.
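For reference, the arithmetic behind those access-time figures is simple: a chip rated at t nanoseconds supports a clock of roughly 1/t, and DDR-type memory doubles the effective rate. A minimal sketch of this rule of thumb:

```python
# Rated memory clock from access time: f [MHz] = 1000 / t [ns].
# DDR/GDDR transfers data twice per clock, hence the doubled "effective" figure.
def rated_clock_mhz(access_time_ns: float) -> float:
    return 1000.0 / access_time_ns

for card, t in [("6800 GS, 2.0 ns GDDR3", 2.0), ("6600 DDR2, 2.5 ns", 2.5)]:
    real = rated_clock_mhz(t)
    print(f"{card}: {real:.0f} MHz real, {2 * real:.0f} MHz effective")
# 6800 GS, 2.0 ns GDDR3: 500 MHz real, 1000 MHz effective
# 6600 DDR2, 2.5 ns: 400 MHz real, 800 MHz effective
```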






Comparison with the reference design, front view
NVIDIA GeForce 6800 GS 256MB
Reference card NVIDIA GeForce 7800 GT
NVIDIA GeForce 6600 DDR2 256MB
Reference card NVIDIA GeForce 6600


Comparison with the reference design, back view
NVIDIA GeForce 6800 GS 256MB
Reference card NVIDIA GeForce 7800 GT
NVIDIA GeForce 6600 DDR2 256MB
Reference card NVIDIA GeForce 6600


As I have already written, the 6800 GS is obviously based on the 7800 GT PCB. All NV4x and G70 chips are pin-compatible, so the NV42 can easily be installed on such a PCB. It should be mentioned that this card has an SLI connector, and the PCB provides a place for the VIVO codec: as you may remember, all 7800 GT cards are equipped with one. Here, however, the Philips chip is not installed, probably to keep the status quo with the other 6800 cards, which lack VIVO.

As for the 6600 DDR2, its very name hints that it's a unique card with a new PCB. On the whole the design resembles that of the usual 6600, but memory chips of a different type and form factor (packaging) leave their stamp.



NVIDIA GeForce 6800 GS 256MB

The GPU is equipped with a small copper heatsink attached to the PCB with four screws to avoid warping; the fan section of the cooler is attached at its side, and the entire construction is covered by a single casing. The cooler is surrounded by a separate memory heatsink.

This solution differs from the 6800 Ultra cooler in height: the card takes up only one slot.







NVIDIA GeForce 6600 DDR2 256MB

Everything is simple here: a commonplace rectangular aluminum heatsink with a fan in the center. However, the 6600 does not need anything better, even if it runs at 350 MHz instead of 300 MHz.






The cooling systems cover the GPUs, as usual. You can hardly expect anything unique from the 6600 DDR2: it's a plain GeForce 6600. Note that NVIDIA has started marking chips right on their dies (they used to be marked on their substrates).




The same applies to the GeForce 6800 GS. Interestingly, judging by its marking, the GPU was manufactured in week 33 of 2005, that is in mid-August (quite recently), and it's already marked as GS, not just GeForce 6800. Again, chips used to be marked on their substrates, while these are evidently marked on their dies. Most likely this is not a previously manufactured NV42 for the 6800, merely binned and re-marked for the new card; it was produced specially for the GS.




This is our baseline material on the new products, so we shall not review boxes and bundles. But we'd like to note that our 6600 DDR2 sample is an XFX modification (hence the blue color and the logo on the cooler), that is, a production-line video card.

Installation and Drivers

Testbed configurations:

  • Athlon 64 (939Socket) based computer
    • CPU: AMD Athlon 64 4000+ (2400 MHz) (L2=1024K)
    • Motherboard: ASUS A8N SLI Deluxe based on NVIDIA nForce4 SLI
    • RAM: 1 GB DDR SDRAM 400MHz (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2; DirectX 9.0c
  • Monitors: ViewSonic P810 (21") and Mitsubishi Diamond Pro 2070sb (21").
  • ATI drivers 6.575 (CATALYST 5.10a); NVIDIA drivers 81.87.

VSync is disabled.

This screenshot refers to the NVIDIA GeForce 6800 GS. Please note that the cooler monitoring readings are wrong (the RivaTuner author will have to fix that). In fact, I set the cooler speed to the minimum, 25%, and it's not audible at all!




As we can see, the temperatures stay within reasonable limits (in a closed PC case without additional cooling).

Test results: performance comparison

We used the following test applications:

  • Half-Life 2 (Valve/Sierra) — DirectX 9.0, demos ixbt01, ixbt02, ixbt03. The tests were carried out at maximum quality with the -dxlevel 90 option; presets for video card types were removed from dxsupport.cfg.

  • FarCry 1.3 (Crytek/UbiSoft), DirectX 9.0, multitexturing, 3 demos from Research, Pier, Regulator levels (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). A sample batch file (d3auto.rar) starts the game automatically with increased speed and reduced jerking (precaching). (DO NOT BE AFRAID of the black screen after the first menu - that's how it should be. It will last 5-10 seconds, and then the demo will start.)

  • 3DMark05 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear.

  • F.E.A.R. v.1.01 (Monolith/Sierra) — DirectX 9.0, multitexturing, test settings — maximum, Soft shadows OFF(!!!).

  • Splinter Cell: Chaos Theory v.1.04 (Ubisoft) — DirectX 9.0, multitexturing, maximum test settings, shaders 3.0 (for NVIDIA cards)/shaders 2.0 (for ATI cards); HDR OFF!

  • The Chronicles Of Riddick: Escape From Butcher Bay v.1.04 (Starbreeze/Vivendi) — OpenGL, multitexturing, test settings — maximum texture quality, Shader 2.0, demo 44 and demo ducche.

    I wish to thank Rinat Dosayev (AKA 4uckall) and Alexei Ostrovski (AKA Ducche), who have created a demo for this game. I also want to thank Alexei Berillo AKA Somebody Else for his help.

  • BattleField 2 (DICE/EA) — DirectX 9.0, multitexturing, test settings — maximum, shaders 2.0, tested in Benchemall

  • Call Of Duty 2 DEMO (Infinity Ward/Activision) — DirectX 9.0, multitexturing, test settings — maximum, shaders 2.0, tested in Benchemall with a demo and a startup script; the readme contains the necessary instructions



Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

FarCry, Research



Test results: FarCry Research




Considering that the GeForce 6600 DDR2 is priced even lower than the X1300 PRO (NVIDIA recommends positioning the new card against the plain X1300), the 6600 DDR2 is evidently a success!

The 6800 GS demonstrated an excellent result, nearly on a par with the 6800 GT. Of course, it outperformed not only the X1600 XT but also the X800 GT, and it has nearly caught up with the X800 XL, which is evidently more expensive.

Alas, both ATI products (X1300 PRO and X1600 XT) come off worse. NVIDIA's cards are faster even in this game, where ATI cards traditionally have the advantage.

Game tests that heavily load vertex shaders, pixel shaders 2.0, active multitexturing.

F.E.A.R. v.1.01



Test results: F.E.A.R. v.1.01




Considering the absolute performance in this game, only the modes without AA+AF are worth examining as far as the 6600 DDR2 comparison is concerned. The new card is successful there as well.

The 6800 GS demonstrated an even more brilliant result, heavily outperforming its competitors. Only the 6800 GT is faster, but don't forget that it's considerably more expensive.

Splinter Cell Chaos Theory



Test results: SCCT




Here, too, the 6600 DDR2 looks very worthwhile against its competitor: it's slightly outperformed only in the AA+AF mode at low resolutions. But again, considering the framerates these cards demonstrate here, AA+AF will hardly be usable anyway. Of course, a gamer can reduce the quality level (disable some effects) and enable AA+AF instead.

As for the 6800 GS, it fares well; its advantage over the competitors is obvious. Moreover, the card caught up with the X800 XL almost effortlessly. But keep in mind that all X8xx cards run this game with Shader Model 2.0, while NVIDIA cards use 3.0, so the scores should be mentally adjusted; the only strictly correct comparison is between the X1xxx series and its NVIDIA competitors.

BattleField 2



Test results: BF2




Here is the first game where the 6600 DDR2 is defeated in the heavy AA+AF mode. Considering the FPS values in this test, heavy modes are justified here, so the X1300 PRO scores a point.

The 6800 GS still fares well and demonstrates excellent results; it was outscored only by the X800 XL in heavy modes. It should be said that BF2 is an unstable test: you have to rerun it 3-4 times to make sure the results are representative.

Call Of Duty 2 DEMO



Test results: COD2




It's a very heavy test for modern video cards, though I don't understand the reason for such low FPS (I don't see outstanding graphics here). Both new cards from NVIDIA are up to the mark here as well: the first achieved parity with its ATI competitor, while the second easily outperformed all of its rivals.

Half-Life 2: ixbt01 demo



Test results: Half-Life 2, ixbt01




This game is rather simple for state-of-the-art accelerators, so even the 6600 DDR2 copes with it in the AA+AF modes. In these modes we can see that the new cards from NVIDIA, including the 6800 GS, are again victorious.



Game tests that heavily load pixel pipelines with texturing, active operations of the stencil buffer and shader units

DOOM III High mode



Test results: DOOM III






Chronicles of Riddick, demo 44



Test results: Chronicles of Riddick, demo 44




I guess there is no point in commenting on the results: it's well known that NVIDIA's video cards are the strongest in these tests, and the results confirm it.



Synthetic tests that heavily load shader units

3DMark05: MARKS



Test results: 3DMark05 MARKS




It's practically the only purely shader-limited test, comparing the corresponding units of the accelerators, and as we can see, the results differ from what we saw in the games. The X1300 PRO is faster than the 6600 DDR2, even though a comparison of theoretical fill rates (4 x 600 = 2400 megapixels per second versus 8 x 350 = 2800 megapixels per second) says the 6600 DDR2 should win. But we know well that computational power is the strong suit of ATI chips: if games used very complex shaders and little texturing, ATI cards would always be victorious. Remember TR:AoD.

In the case of the 6800 GS and the X1600 XT (their pixel fill rates are 425 x 12 = 5100 and 590 x 12 = 7080 megapixels per second respectively), the X1600 XT should win on paper, but it has just 4 texture units. Its huge advantage in peak pixel fill rate comes to naught because of the texel fill rate: even such a shader-heavy application as 3DMark still requires high texturing capacity, and the 3:1 ratio of shader to texturing resources does not pay off, no matter what ATI representatives tell us.
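To make that arithmetic explicit, here is a minimal sketch. The clocks and pixel-pipeline counts are the ones quoted above; the TMU counts for the 6600 and X1300 PRO (one per pipeline) are my assumption, not a figure from this article:

```python
# Peak fill rates: pixel rate = core clock x pixel pipelines,
# texel rate = core clock x texture units (TMUs).
cards = {
    # name: (core clock MHz, pixel pipelines, TMUs)
    "GeForce 6600 DDR2": (350, 8, 8),   # TMU count assumed, 1 per pipeline
    "RADEON X1300 PRO":  (600, 4, 4),   # TMU count assumed, 1 per pipeline
    "GeForce 6800 GS":   (425, 12, 12),
    "RADEON X1600 XT":   (590, 12, 4),  # only 4 texture units, as noted above
}
for name, (clock, pipes, tmus) in cards.items():
    print(f"{name}: {clock * pipes} Mpix/s peak, {clock * tmus} Mtex/s peak")
# The X1600 XT wins the paper pixel rate (7080 vs 5100 Mpix/s) but delivers
# only 2360 Mtex/s against the 6800 GS's 5100 - the bottleneck described above.
```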



Conclusions

The NVIDIA GeForce 6800 GS 256MB is a very good solution with a great number of advantages. First, the 256-bit bus, while the RADEON X1600 XT has only a 128-bit bus, so the memory bandwidth of the former is evidently higher (256 bit x 1000 MHz / 8 = 31.2 GB/s versus 128 bit x 1380 MHz / 8 = 21.5 GB/s). Second, it uses the streamlined NV42 chip, which has no supply problems (unlike the RV530, still available in insufficient numbers). Third, it uses popular 2.0 ns GDDR3 memory instead of super-expensive and rare 1.26 ns or 1.4 ns memory. Fourth, NVIDIA actually did nothing new: it just overclocked the NV42 to 425 MHz (nearly all NV42 chips can be overclocked to this frequency - we have tested it) and installed the GPU on the already-designed 7800 GT PCB. That's it; the prime cost is not too high. Besides, it's obviously a temporary solution: new products based on 0.09-micron cores (G72/G74) will appear in January, and their prime costs will be even lower. That's why ATI will have to drop the X1600 XT price and curb its appetite for per-card margins. Moreover, the fact that the 6800 GS is already available on the market plays against ATI, while we still don't know when we'll see the X1600 XT, and at what price.
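The bandwidth figures in parentheses are easy to verify; a minimal sketch, using the same 1024-based conversion that yields the numbers above:

```python
# Peak memory bandwidth: bus width in bytes x effective data rate in MHz,
# converted to GB/s with a 1024 divisor (matching the figures quoted above).
def bandwidth_gb_s(bus_width_bits: int, effective_mhz: int) -> float:
    return bus_width_bits / 8 * effective_mhz / 1024

print(f"GeForce 6800 GS: {bandwidth_gb_s(256, 1000):.1f} GB/s")  # 31.2
print(f"RADEON X1600 XT: {bandwidth_gb_s(128, 1380):.1f} GB/s")  # 21.6 (rounded to 21.5 above)
```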

As for overclocking, our sample shows excellent potential: 530/1180 MHz! In order not to overcrowd the graphs with results at these frequencies, I will just note that the GeForce 6800 GS at 530/1180 MHz is a mere couple of per cent slower than the 6800 Ultra, while its price is $150 lower! I repeat that RivaTuner can control the cooler on such a card, so it can be made practically noiseless.
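For scale, a trivial check of how big that overclock is relative to the stock clocks:

```python
# Overclocking headroom of our GeForce 6800 GS sample over stock clocks.
stock = {"core": 425, "memory (effective)": 1000}
overclocked = {"core": 530, "memory (effective)": 1180}
for domain in stock:
    gain = (overclocked[domain] / stock[domain] - 1) * 100
    print(f"{domain}: {stock[domain]} -> {overclocked[domain]} MHz (+{gain:.1f}%)")
# core: 425 -> 530 MHz (+24.7%)
# memory (effective): 1000 -> 1180 MHz (+18.0%)
```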

The NVIDIA GeForce 6600 DDR2 256MB likewise offers nothing new: it's just an overclocked NV43. Obviously, some 6600 GT chips could be used for these cards without any extra testing. Besides, many chips will not manage 500 MHz (the 6600 GT level) but will easily cope with 350 MHz, so these cards are very cheap to manufacture. As for the expedience of DDR2 in a $115-120 card: if ATI can install 2.5 ns DDR memory on such cards, why can't NVIDIA? Besides, DDR2 is already cheaper than DDR1. Considering that NVIDIA recommends comparing the 6600 DDR2 with the X1300 (not the PRO), it must have room for price cuts; such cards may easily drop below $100.

As NVIDIA has easily put together rivals to ATI's products, actually better than the recently launched Canadian cards, their retail advantage will be obvious (of course, ATI employees will dispute this opinion, but as I have written before, I have made many forecasts concerning ATI, and no matter what its employees said against them, in the long run they had to admit the forecasts were true). Considering that Christmas sales have already started, the results may be quite cheerless for ATI. The latest market research shows that the Canadians are rapidly losing desktop market share. The company still fares well in the mobile and chipset markets, but there are bad tendencies even there.

Did they really need to make so many mistakes, fall into a pit (a deep one!), realize what they had done, and only then brainstorm a way out and climb up again? I remember ATI falling into a similar pit in 1998-1999... Have they not learned the lessons of the past?

We can congratulate NVIDIA on its current situation. At the cost of small defeats in 2004, the company built an excellent basis for the future, and now it reaps the fruits. While the competitor struggles to master 0.09-micron production, lacking the capacity to manufacture enough new-generation SM3.0 accelerators, NVIDIA easily gives check with queens it already had in store and quietly polishes its own 0.09-micron chips to be announced a tad later. Why hurry, if it still hasn't run out of previously manufactured solutions that can easily compete with ATI's new products?

I again express my concern and incomprehension of ATI's policy (the company couldn't fail to understand the full strength of its competitor) and of the reasons for so many planning mistakes. I just cannot understand what 100-200 employees in Toronto are doing: their job was to predict the consequences of the company's moves. Did they all hope that re-marking old SM2.0 solutions would save them? It did help to some degree, but it couldn't have worked for good.

The Canadian company has obviously lost this round of SM3.0-level accelerators. The X1800 XT will get its opponent in a week (a killer solution), and the X1800 XL already has one. As today's review shows, the answer to the X1600 XT and X1300 PRO is very strong, potentially even a knockout.

We'll see what happens next year, when the titans clash again in another ring - DirectX 10. The balance of forces and potentials may change cardinally there... The Canadians may be nurturing that future victory while ignoring current demands. But the market dictates its own conditions... ATI is already suffering losses from its mistakes. Heavy losses. Can it repair the situation BEFORE the next generation of graphics products? So far only the marketing experts have an answer, in the form of promises of growth and improvement. But the realities are gloomier.

The Canadians are catastrophically late... in all respects. They are late with CrossFire; they are late with the X1xxx series, giving the competitor time to get ready. They are even late with chipsets (the RD480 is still unavailable in production-line motherboards, while the second-generation nForce4 SLI X16 has already launched). Something is wrong in Toronto. Something is broken. It's awfully unpleasant to watch. It's painful. I'm a partisan of equal competition; it simply MUST exist!

You can find more detailed comparisons of various video cards in our 3Digest.










Theoretical materials and reviews of video cards concerning the functional properties of the ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43) GPUs



We express our thanks to
NVIDIA Russia
for the provided video cards.


Andrey Vorobiev (anvakams@ixbt.com)

2005


