Gigabyte GV-3D1 Dual
nVIDIA 6600GT PCI-E Graphics Card Review (7)
It has been some time since we last saw a dual-GPU graphics card. Gigabyte has made a breakthrough with this innovative card, which arrives just as NVIDIA is heavily promoting its SLI technology, which links up two graphics cards of the same build to boost graphics performance.
During this short period of time, we have tried hard to fish out more information on how it is done, but Gigabyte seems quite secretive about this latest weapon of theirs. From what I could gather, and if my guess is correct, they probably linked the two GPUs through SLI within the card itself. This reduces communication latency compared to linking two cards through the bridge connector and the mainboard's PCI-E slots, and should eliminate any possible lag. We compared our test results to the official results for 2 x 6600GT tested under the same configuration: the results are similar when the 66.93 driver is used. With the newer driver, the dual-GPU card performs a few percent faster than 2 x 6600GT in SLI mode.
As Gigabyte hasn't officially announced how it intends to sell this card, there are rumours that it will be bundled with their high-end K8NXP-SLI mainboard. They also claim that the price will be lower than buying two separate 6600GT cards. Will a pair of such cards (4 GPUs) work on the K8NXP-SLI? That remains to be answered. It might sound silly, since there are no golden fingers to pair up two 3D1 cards, but anything is possible. What about a "3D2"?
I know you are waiting for information on whether this card will work with other boards. During installation, the GA-K8NXP-SLI is configured to run in NORMAL mode rather than SLI mode, since we are only using one card (with 2 GPUs). In the Windows XP properties page, two instances of the 6600GT are detected. The graphics driver detects that a 2-GPU SLI card is installed and prompts you to enable the SLI link. Once that is enabled, you reboot the system and it is done.
After completing the tests, I tried the card on two other PCI-E boards, the MSI ATi K8 and the EPoX 5EPA+, but it couldn't POST at all. On the EPoX 5EPA+, the debug LEDs show "FF" and the boot goes no further. I suspect the graphics card's BIOS has some form of detection algorithm that checks which board it is used on. That is again only my guess, but I find it highly plausible, as NVIDIA would come knocking on Gigabyte's door if the card worked on any PCI-E board, since that might hurt the sales of SLI boards.
In the overclocking department, the card can be tuned using V-Tuner2, which comes bundled with it. We only managed to stay stable in 3DMark03 at a Core/Memory of 510/600. When overclocked, the card scored 14792 in 3DMark03 and 104.5 fps in Doom 3. I noticed that no matter how far you push the memory (it is rated at 1.6 ns), performance doesn't differ much. As for the core clock, screen tearing appears beyond 530 and eventually leads to a hang. The room for overclocking is quite limited, as the card is already near its peak.
Together with the GA-K8NXP-SLI mainboard, which offers such a good combination of performance and features, this is a formidable pair that will definitely challenge the top players in the market.
Overall, Gigabyte has delivered a card that performs well and leads the industry with an innovative design. I am sure other first-tier makers already have plans to do something similar in the near future. Let's hope to see more innovative products in 2005!
I hereby declare that both the GA-K8NXP-SLI and the GV-3D1
win our Product Excellence Award.
(C) Copyright 1998-2009 OCWorkbench.com