iXBT Labs - Computer Hardware in Detail






Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E

August 3, 2006


  1. Introduction
  2. Video card's features
  3. Testbed configuration, benchmarks
  4. Test results
  5. Conclusions

Hi-End video cards are traditionally much more interesting to review: they are the engine of this industry, either promoting new technologies or pushing the performance envelope with ever heavier AA and AF modes at high resolutions. But from the point of view of a regular consumer they have a major drawback: a very high price. That's why many users read such reviews only out of curiosity, or to dream a little about buying such a device in the distant future.

Why do reviewers pay so little attention to budget models (priced below $100)? Only because they are dull, being mere halves, quarters, or eighths of Hi-End models? That's not the main reason. The fact is, new cards below $100 are usually on a par with former Middle-End cards that have since gotten cheaper, so that segment is actually overcrowded as well. Judge for yourself: there were GeForce 6600 and 6600 GT cards; now we have the 7300 GS, 7300 GT, and 7600 GS (which are steadily drifting down to the sacramental $100 mark). These are essentially modifications of THE SAME CARD, coming off different production lines with different frequencies, and they bring practically nothing new in 3D. Yes, they offer improvements, such as Dual Link DVI, but that's not the point.

Both manufacturers and we understand that as process technology matures, opportunities appear to raise core clocks and to equip budget cards with unprecedentedly fast local memory as it gets cheaper. Hence all those new numbers in card names. It pays off for manufacturers. But do users benefit as well?

They are already confused by the multitude of solutions, each claimed to be the best. Here is a question right off the bat: which card is better, the GeForce 7300 GT or the GeForce 6600 GT, the latter being even more expensive? It's impossible to answer without analyzing their architectures, frequencies, and test results.

So, while there is a lull in launches of new expensive solutions, it's the right time to examine budget cards, which can be very attractive in terms of price/performance.

So, GeForce 7300 GT. We have already reviewed the GeForce 7300 GS: its core contains only four pixel shader units, operates at 550 MHz, and has a 64-bit memory bus. One might logically assume that the 7300 GT is just a sped-up modification of the 7300 GS, with raised frequencies and a 128-bit bus. Not at all; the gap is even larger. The 7300 GT is based not on the G72, the small core of the GeForce 7300 GS, but on the G73, the chip used in the GeForce 7600 series. Its GPU is cut down from 12 to 8 pixel units, and its vertex units are reduced by one: 8 and 4, instead of the 12 and 5 in the 7600 GT/GS.

I don't know why this card is not called 7600 LE or XT (in NVIDIA's old fashion). But the fact remains: the difference between the 7300 GS and the GT is huge. Well, not quite twofold, but close to it: although the reference 7300 GT operates at only 350/666 MHz while the core clock of the 7300 GS is 550 MHz, the GT has twice the pixel units and twice the memory bus width.

Ironically, the first 7300 GT in our lab is a Galaxy card whose frequencies differ greatly from the reference. It must be noted that the GeForce 7300 GT comes in TWO modifications: with DDR2 and with GDDR3 memory. I repeat: you must always CHECK what memory is USED in such a card before you buy it. If it's DDR2, you'll get low frequencies (350/666 MHz). If it's GDDR3, memory frequencies may reach 700 (1400!) MHz and core frequencies up to 500 MHz. Fancy that: the cards have the same name, but their frequencies differ like heaven and earth, like the 6600 and the 6600 GT.
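To see what those frequency differences mean in practice, here is a rough back-of-the-envelope sketch of peak memory bandwidth for the two 7300 GT variants. This is our own illustration, not a vendor figure, and it assumes the DDR2 variant uses the same 128-bit bus as this card:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# DDR-type memory transfers data twice per clock, so the effective
# rate is double the physical clock (hence "700 (1400) MHz").
def bandwidth_gb_s(bus_bits: int, physical_mhz: float) -> float:
    effective_mhz = physical_mhz * 2      # two transfers per clock
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# DDR2 variant of the 7300 GT: 333 (666) MHz, 128-bit bus assumed
print(f"DDR2:  {bandwidth_gb_s(128, 333):.1f} GB/s")   # ~10.7 GB/s
# GDDR3 variant (this Galaxy card): 700 (1400) MHz, 128-bit bus
print(f"GDDR3: {bandwidth_gb_s(128, 700):.1f} GB/s")   # ~22.4 GB/s
```

Roughly twice the bandwidth under the same product name, which is exactly why checking the memory type before buying matters so much.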

Again we see this ugly situation, when customers are taken for fools and different cards are sold under the same name. That's too bad. We'll analyze other 7300 GT cards as well to get a general idea of such "floating frequencies".

Of course, this product and its frequencies may be attributed to Galaxy's own initiative. But I think NVIDIA should not hush up such things. I can understand a difference of 20-30 MHz: it's fashionable to manufacture overclocked cards (Golden-something or Extreme-to-the-skies products). But it's too much when the difference between the reference card (350 MHz) and a production-line product (500 MHz) reaches 150 MHz. Let manufacturers use different names and not confuse buyers. Yes, a label on the box may state that the card is equipped with GDDR3, but that does not answer the question of what the frequencies are and how far they are raised above the nominal values. Besides, plenty of such products are sold as OEM, where you cannot find out a card's frequencies or memory type at all.

So, let's proceed to the examination of the GeForce 7300 GT from Galaxy, equipped with GDDR3 memory and running at frequencies much higher than nominal.

Video card

Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E
GPU: GeForce 7300 GT (G73)

Interface: PCI-Express x16

GPU frequencies: 500 MHz (nominal — 350 MHz)

Memory frequencies (physical (effective)): 700 (1400) MHz (nominal — 333 (666) MHz)

Memory bus width: 128bit

Number of Shader Vertex Processors: 4

Number of Shader Pixel Processors: 8

Number of texture processors: 8

Number of ROPs: 4

Dimensions: 185x100x32 mm (the last figure is the maximum thickness of the video card).

PCB color: dark blue.

Output connectors: 1x VGA (d-Sub), 1 x DVI (Dual-Link), TV-Out.

VIVO: not available

TV-out: integrated into GPU.

Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E
The video card carries 256 MB of GDDR3 SDRAM in four chips on the front side of the PCB.

Samsung (GDDR3) memory chips. 1.2ns memory access time, which corresponds to 800 (1600) MHz.
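The relation between a chip's access time and its rated clock is simple: frequency is the reciprocal of access time. A minimal sketch of this rule of thumb (our illustration; the 800 MHz figure is the vendor's conservative binning):

```python
# Rated physical clock estimated from memory access time: f = 1 / t.
# 1.2 ns allows ~833 MHz, which vendors bin down to 800 (1600) MHz.
def rated_mhz(access_ns: float) -> float:
    return 1.0 / (access_ns * 1e-9) / 1e6

print(f"{rated_mhz(1.2):.0f} MHz physical")  # ~833 MHz
```

So the 1.2ns chips on this card leave noticeable headroom above their 700 (1400) MHz operating frequency.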

Comparison with the reference design, front view
Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E
Reference card NVIDIA GeForce 7600 GT

Comparison with the reference design, back view
Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E
Reference card NVIDIA GeForce 7600 GT

The photos show clearly that the Galaxy card has its own design, which only distantly resembles the reference design of the 7600 GT. The card is equipped with very fast 1.2ns memory, which actually runs below its rated frequency. Considering that the GPU is already overclocked and cooled by a large Zalman cooler (a nod to overclockers), the card is clearly aimed at overclockers.

The card has a TV-Out with a unique jack. You will need the special bundled adapters to output video to a TV set via S-Video or RCA. You can read about the TV-Out in more detail here.

Analog monitors with a d-Sub (VGA) interface are connected via special DVI-to-d-Sub adapters. Maximum resolutions and frequencies:

  • 240 Hz Max Refresh Rate
  • 2048 x 1536 x 32bit x85Hz Max - analog interface
  • 2560 x 1600 @ 60Hz Max - digital interface

As for the cooler, the Galaxy product is equipped with a modern and efficient solution from ZALMAN. You can read about this cooler at the link above.

The device efficiently cools the overclocked core while remaining noiseless:

The memory chips are also equipped with heatsinks. Running a little ahead, I can tell you that we managed to overclock the card to 560 MHz (core) and 840 (1680) MHz (memory).


Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E
User manual, CD with drivers, TV cord, composite output adapter.


Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E

A large box with a window in the center to show the card. Everything inside is secured in plastic.

Installation and Drivers

Testbed configuration:

  • Athlon 64 (939Socket) based computer
    • CPU: AMD Athlon 64 4000+ (2400MHz) (L2=1024K)
    • Motherboard: ASUS A8N32 SLI Deluxe on NVIDIA nForce4 SLI X16
    • RAM: 2 GB DDR SDRAM 400MHz (CAS (tCL)=2.5; RAS to CAS delay (tRCD)=3; Row Precharge (tRP)=3; tRAS=6)
    • HDD: WD Caviar SE WD1600JD 160GB SATA

  • Operating system: Windows XP SP2 DirectX 9.0c
  • Monitor: Mitsubishi Diamond Pro 2070sb (21").
  • ATI CATALYST 6.6; NVIDIA Drivers 91.31.

VSync is disabled.

Test results: performance comparison

We used the following test applications:

  • Splinter Cell Chaos Theory v.1.04 (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, shaders 3.0 (for NVIDIA cards)/shaders 2.0 (for ATI cards); HDR OFF!

  • Half-Life2 (Valve/Sierra) — DirectX 9.0, demo ixbt01. The tests were carried out with maximum quality, option -dxlevel 90, presets for video card types removed from dxsupport.cfg.

  • FarCry 1.33 (Crytek/UbiSoft), DirectX 9.0, multitexturing, demo from the Research level (-DEVMODE startup option), Very High test settings.

  • DOOM III (id Software/Activision) — OpenGL, multitexturing, test settings — High Quality (ANIS8x), demo ixbt1 (33MB!). We have a sample batch file to start the game automatically with increased speed and reduced jerking (precaching) d3auto.rar. (DO NOT BE AFRAID of the black screen after the first menu, that's how it should be! It will last 5-10 seconds and then the demo should start)

  • 3DMark05 1.20 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear filtering.

  • 3DMark06 1.02 (FutureMark) — DirectX 9.0, multitexturing, test settings — trilinear filtering.

  • The Chronicles Of Riddick: Escape From Butcher Bay 1.10 (Starbreeze/Vivendi) — OpenGL, multitexturing, maximum texture quality, Shader 2.0, demo 44.

    I wish to thank Rinat Dosayev (AKA 4uckall) and Alexei Ostrovski (AKA Ducche), who have created a demo for this game. I also want to thank Alexei Berillo AKA Somebody Else for his help.

  • F.E.A.R. v.1.02 (Multiplayer) (Monolith/Sierra) — DirectX 9.0, multitexturing, maximum test settings, Soft shadows disabled.

  • Call Of Duty 2 DEMO (Ubisoft) — DirectX 9.0, multitexturing, test settings — maximum, shaders 2.0, tested in Benchemall, demo and a startup script, readme contains necessary instructions

Summary performance diagrams

Game tests that heavily load vertex shaders, mixed pixel shaders 1.1 and 2.0, active multitexturing.

FarCry, Research

Test results: FarCry Research

Game tests that heavily load vertex shaders, pixel shaders 2.0, active multitexturing.


Test results: F.E.A.R

Splinter Cell Chaos Theory

Test results: SCCT

Call Of Duty 2 DEMO

Test results: COD2

Half-Life2: ixbt01 demo

Test results: Half-Life2, ixbt01

Game tests that heavily load pixel pipelines with texturing, active operations of the stencil buffer and shader units

DOOM III High mode

Test results: DOOM III

Chronicles of Riddick, demo 44

Test results: Chronicles of Riddick, demo 44

Synthetic tests that heavily load shader units

3DMark05: MARKS

Test results: 3DMark05 MARKS

3DMark06: Shader 2.0 MARKS

Test results: 3DMark06 Shader 2.0 MARKS

3DMark06: Shader 3.0 MARKS

Test results: 3DMark06 Shader 3.0 MARKS

You can find our comments in the conclusions.


Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E. There is no need to comment on each test, everything is crystal clear. The accelerator copes with its task brilliantly and outperforms all its competitors. Moreover, it's sometimes even faster than the 7600 GS (500 MHz with 8/4 pipelines turned out faster than 400 MHz with 12/5). If the card sells below $100, it will evidently be an excellent accelerator for its class! Of course, we have some concern that the price of such cards will be much higher than for regular 7300 GT cards.
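A quick theoretical estimate suggests why the overclocked 7300 GT can beat the 7600 GS despite fewer pipelines. This is our own sketch; the 7600 GS figures used here are its reference specs (400 MHz core, 12 pixel pipelines, 400 (800) MHz memory on a 128-bit bus):

```python
# Theoretical texture fillrate: core clock x texture units, in Mtexels/s.
def fillrate_mtex(core_mhz: int, tex_units: int) -> int:
    return core_mhz * tex_units

# Peak memory bandwidth: bus width in bytes x effective clock, in GB/s.
def bandwidth_gb(bus_bits: int, effective_mhz: int) -> float:
    return bus_bits / 8 * effective_mhz / 1000

galaxy_7300gt_fill = fillrate_mtex(500, 8)    # 4000 Mtex/s
ref_7600gs_fill    = fillrate_mtex(400, 12)   # 4800 Mtex/s

galaxy_7300gt_bw = bandwidth_gb(128, 1400)    # 22.4 GB/s
ref_7600gs_bw    = bandwidth_gb(128, 800)     # 12.8 GB/s
```

The Galaxy card gives up about 17% of fillrate but enjoys roughly 75% more memory bandwidth, so in bandwidth-limited scenes (high resolutions, AA) it can come out ahead.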

As for the card itself, it worked flawlessly: we have no gripes about its operation, and it demonstrated excellent 2D quality. I'll remind you that we managed to overclock it to 560 MHz (core) and 840 (1680) MHz (memory).

I repeat: if the price of this product is just a tad higher than that of a regular 7300 GT card, it will certainly be a phenomenal success. The card has put up an excellent performance.

You can find more detailed comparisons of various video cards in our 3Digest.

Galaxy GeForce 7300 GT 256MB GDDR3 PCI-E gets the Original Design award (July).

Theoretical materials and reviews of video cards, which concern functional properties of the GPU ATI RADEON X800 (R420)/X850 (R480)/X700 (RV410) and NVIDIA GeForce 6800 (NV40/45)/6600 (NV43)

PSU for the testbed was kindly provided by HIPER

                               Memory for the testbeds is kindly provided by

Andrey Vorobiev (anvakams@ixbt.com)

August 2, 2006
