
Galaxy GeForce 7300 GT 128MB PCI-E x1




We've almost forgotten about cheap graphics cards. Well, not really forgotten, since they are represented in our monthly 3Digests. But these products change very little: performance stays very low, and new drivers bring almost nothing. You also have to sacrifice quality in games, to say nothing of AA and AF, which you can simply forget about. 3D works only in simple games. Of course, aesthetes may enable all options and enjoy a slide show. But that's not the point today.

Suppose you have a computer with integrated graphics (this especially concerns barebones) and you want to upgrade your graphics subsystem to a more-or-less modern 3D solution, but you don't have a PCI-E x16 slot. What can you do? Right, you should look for a PCI-E x1 graphics card. One such product is reviewed today: a GeForce 7300 GT made and factory-overclocked by Galaxy.

Graphics card

Galaxy GeForce 7300 GT 128MB PCI-E x1
  • GPU: GeForce 7300 GT (G73)
  • Interface: PCI-Express x1
  • GPU frequencies (ROPs/Shaders): 500/500 MHz (nominal - 350/350 MHz)
  • Memory frequencies (physical (effective)): 500 (1000) MHz (nominal - 333 (666) MHz)
  • Memory bus width: 128-bit
  • Vertex processors: 8
  • Pixel processors: 8
  • Unified processors: -
  • Texture processors: 8
  • ROPs: 4
  • Dimensions: 170x100x15 mm (the last figure is the maximum thickness of the graphics card).
  • PCB color: blue
  • RAMDACs/TMDS: integrated into GPU
  • Output connectors: DVI (Dual-Link), VGA, TV-Out.
  • VIVO: not available
  • TV-out: integrated into GPU.
  • Multi-GPU operation: SLI (Software).


Galaxy GeForce 7300 GT 128MB PCI-E x1
The graphics card has 128 MB of GDDR3 SDRAM in four chips on the front side of the PCB.

Samsung memory chips (GDDR3) with 1.4 ns access time, which corresponds to 700 (1400) MHz.
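
A quick sanity check of that rating, as a worked equation (the 700 MHz figure is simply the rounded result):

$$ f = \frac{1}{t_{\text{access}}} = \frac{1}{1.4\ \text{ns}} \approx 714\ \text{MHz} \quad (\approx 1428\ \text{MHz effective DDR}) $$

So the chips are rated well above the 500 (1000) MHz the card actually runs them at, leaving headroom even after Galaxy's factory overclock.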






Comparison with the reference design, front view
Galaxy GeForce 7300 GT 128MB PCI-E x1
Reference card NVIDIA GeForce 7300 GT


Comparison with the reference design, back view
Galaxy GeForce 7300 GT 128MB PCI-E x1
Reference card NVIDIA GeForce 7300 GT

The photos show clearly that the card has a unique design, similar to the 7600 GT in the layout of its memory chips. The card is rather small. It has a TV-Out with a proprietary jack; you will need special adapters to output video to a TV set via S-Video or RCA. You can read about the TV-Out in more detail here.

Analog monitors with D-Sub (VGA) are connected either via the VGA connector or via special DVI-to-D-Sub adapters (not included in the bundle). Maximum resolutions and frequencies:

  • 240 Hz maximum refresh rate
  • 2048x1536 (32 bpp) @ 85 Hz max - analog interface
  • 2560x1600 @ 60 Hz max - digital interface
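
These limits follow from the pixel clocks involved. A rough estimate for the analog output, assuming the GeForce 7 series' 400 MHz RAMDAC and typical GTF blanking overhead (about 45%):

$$ 2048 \times 1536 \times 85\ \text{Hz} \approx 267\ \text{Mpx/s}, \qquad 267 \times 1.45 \approx 388\ \text{MHz} < 400\ \text{MHz} $$

Likewise, 2560x1600 @ 60 Hz requires a pixel clock of roughly 270 MHz, well beyond the 165 MHz single-link TMDS limit, which is why the dual-link DVI connector matters for 30-inch monitors.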

Speaking of MPEG2 (DVD-Video) playback, we analyzed this issue back in 2002, and little has changed since then: CPU load during video playback does not exceed 25% on any modern graphics card. As for HDTV and other trendy video features, we are going to sort them out as soon as possible.

Now for the cooling system. There is not much to write about: as the photo shows, it's a usual round cooler with a fan in the middle. The fan has a low rotational speed, so there is no noise. However, we cannot guarantee that such a cooler will always remain silent (there have been cases when such devices became noisy within six to twelve months).

We monitored temperatures using RivaTuner (written by A.Nikolaychuk AKA Unwinder) and obtained the following results:

Galaxy GeForce 7300 GT 128MB PCI-E x1



As we can see, the temperatures are not very high, despite the increased frequencies (500 MHz versus 350 MHz).

Bundle

Galaxy GeForce 7300 GT 128MB PCI-E x1

The bundle is minimal: User Manual, CD with drivers, TV extension cord. There are no adapters in the bundle.

Box

Galaxy GeForce 7300 GT 128MB PCI-E x1

A small cardboard box, almost an OEM bundle (the so-called "Light Retail").

Installation and drivers

Testbed configuration:

  • Intel Core 2 Duo (Socket 775) based computer
    • CPU: Intel Core 2 Extreme X6800 (2930 MHz) (L2=4096K)
    • Motherboard: EVGA nForce 680i SLI on NVIDIA nForce 680i
    • RAM: 2 GB DDR2 SDRAM Corsair 1142MHz (CAS (tCL)=5; RAS to CAS delay (tRCD)=5; Row Precharge (tRP)=5; tRAS=15)
    • HDD: WD Caviar SE WD1600JD 160GB SATA
    • PSU: Tagan 1100-U95 (1100W).
  • Operating system: Windows XP SP2, DirectX 9.0c
  • Monitor: Dell 3007WFP (30").
  • Drivers: ATI CATALYST 7.4; NVIDIA Drivers 97.93.

VSync is disabled.

Test results: performance comparison

We used the following test applications:

  • Splinter Cell Chaos Theory v.1.04 (Ubisoft) - DirectX 9.0, shaders 3.0 (with/without HDR), maximum settings.
  • Call Of Juarez (Techland/Ubisoft) - DirectX 9.0, shaders 3.0 (HDR), demo. Tests were run with maximum quality. The batch file is included.
  • FarCry 1.4 (beta) (Crytek/UbiSoft), DirectX 9.0, shaders 2.0b/3.0 (with/without HDR), 3 demos from the Research level (-DEVMODE startup option), Very High test settings. We used HDRRendering=1 for HDR tests.
  • PREY 1.01 (3D Realms Entertainment / Human Head Studios / 2K Games) - OpenGL, shaders 2.x, demo003 (40MB!). Tests were run with maximum quality. The batch file is included (a sketch of such a batch file follows this list).
  • 3DMark05 1.20 (FutureMark) - DirectX 9.0, multitexturing, trilinear filtering.
  • Serious Sam II 1.068 (Croteam/2K Games) - DirectX 9.0, shaders 3.0 (with/without HDR), batch file to start the test. It's the standard demo0002 that comes with the game. Test settings: maximum. We express our thanks to our reader, Vozniuk Valery AKA Px, for his batch file to run this game.
  • F.E.A.R. v.1.08 (Multiplayer) (Monolith/Sierra) - DirectX 9.0, shaders 2.0, maximum test settings, Soft shadows disabled.
  • Company Of Heroes (Relic Entertainment/THQ) - DirectX 9.0, shaders 2.0, startup batch file. When you start the game, you should go to options, choose the graphics section, and press the test button. Tests were run with maximum quality.
  • 3DMark06 1.02 (FutureMark) - DirectX 9.0c, multitexturing, trilinear filtering.
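
For readers who want to reproduce such runs, here is a minimal sketch of what a timedemo batch file for Prey might look like. It is an illustration, not the exact batch file from our bundle: the install path and video settings are assumptions, while demo003 and the id Tech 4 console syntax (+set, timedemo) are the only givens.

    @echo off
    rem Hypothetical timedemo batch file for Prey (id Tech 4 engine).
    rem demo003 is the demo used in this article; path and resolution are assumed.
    cd /d "C:\Games\Prey"
    prey.exe +set r_mode -1 +set r_customWidth 1024 +set r_customHeight 768 +timedemo demo003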

Graphics card performance

Those of you who are well into 3D graphics will understand the charts at a glance. Newbies and those who have just begun choosing a graphics card can read our comments.

First of all, you should browse our brief references dedicated to modern graphics card series and the underlying GPUs. Note the clock rates, shader support, and pipeline architecture.

ATI RADEON X1300-1600-1800-1900 Reference

NVIDIA GeForce 7300-7600-7800-7900 Reference

Secondly, you can browse our 3D graphics section for the basics of 3D and some new product references. There are only two companies that release graphics processing units: ATI (recently merged with AMD and now bearing its name) and NVIDIA. Therefore, all information is generally separated into two respective parts. You can also read our monthly 3Digests, which sum up all graphics card performance charts.

Thirdly, have a look at the test results. We won't analyze each test, because it makes more sense to draw the bottom line at the end of the article. As this is a budget product designed to work in 3D only at minimal resolutions, we ran our tests at 1024x768 and 1280x1024. There is no point in using higher resolutions.

FarCry, Research (No HDR)

Test results: FarCry Research (No HDR)






FarCry, Research (HDR)

Test results: FarCry Research (HDR)






F.E.A.R.

Test results: F.E.A.R.






Splinter Cell Chaos Theory (No HDR)

Test results: SCCT (No HDR)






Splinter Cell Chaos Theory (HDR)

Test results: SCCT (HDR)






Call Of Juarez

Test results: CoJ






Company Of Heroes

Test results: CoH






Serious Sam II (No HDR)

Test results: SS2 (No HDR)






Serious Sam II (HDR)

Test results: SS2 (HDR)






Prey

Test results: Prey






3DMark05: MARKS

Test results: 3DMark05 MARKS






3DMark06: SHADER 2.0 MARKS

Test results: 3DMark06 SM2.0 MARKS






3DMark06: SHADER 3.0 MARKS

Test results: 3DMark06 SM3.0 MARKS






Conclusions

At first sight, the Galaxy GeForce 7300 GT 128MB PCI-E x1 demonstrated depressing results. 128 MB of memory is not enough even for budget cards these days. Besides, a smaller video memory capacity increases the amount of data pumped through the bus (in order to use system memory), and this bus is very narrow. Hence the low performance in games that use a lot of textures, where the size of local video memory is very important. Only where shader computing prevails does the Galaxy card draw level or even pull ahead, thanks to its increased frequencies.
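
A back-of-the-envelope comparison shows how narrow that bus really is (PCI Express 1.x figures):

$$ B_{\text{local}} = \frac{128\ \text{bit}}{8} \times 1000\ \text{MT/s} = 16\ \text{GB/s}, \qquad B_{\text{PCIe x1}} \approx 0.25\ \text{GB/s} $$

Every texture that spills out of the 128 MB of local memory has to be fetched over a link roughly 64 times slower than the card's own GDDR3, which is exactly what the texture-heavy tests show.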

But the situation is not that bad, considering the objective of this product. It is designed from the outset for integrated systems with terribly weak 3D, or for machines without a PCI-E x16 slot (this happens). And in such cases it's simply indispensable: it's cheap, it offers some 3D functionality, and it will let you play some simple modern games. So while there's no point in advertising it widely, there is a usage niche where it's essential.

We haven't yet mentioned multi-monitor configurations. If you have only one PCI-E x16 slot and one graphics card in it, a second graphics card with the x1 interface will allow you to connect more monitors to the system.

And one more thing we never tire of repeating from article to article. If you decide to choose a graphics card by yourself, you have to realize that you are replacing one of the fundamental PC components, which might require additional tuning to improve performance or enable certain quality features. A graphics card is not a finished product but a component part. So to get the most from a new card, you will have to acquire some basic knowledge of 3D graphics and graphics in general. If you are not ready for this, don't perform upgrades by yourself; it would be better to buy a ready-made computer with preinstalled software (along with the vendor's technical support) or a gaming console that doesn't require any adjustments.

More comparative charts of this and other graphics cards are provided in our 3Digest.



Galaxy GeForce 7300 GT 128MB PCI-E x1 gets the Original Design award (June 2007):




PSU for the testbed was kindly provided by TAGAN




The Dell 3007WFP monitor for the testbeds was kindly provided by NVIDIA






 
Andrey Vorobiev (anvakams@ixbt.com)
June 27, 2007

