
Smooth Gaming with Triple Buffering
CrazyRhino 17 Mar 2006

Have you ever been in a situation where you enable v-sync to get rid of annoying image tearing, only to find that it kills your frame rate? V-sync caps your maximum frame rate at your screen's refresh rate, which is 60 times per second on common LCD monitors. Playing a game at 60fps is not bad at all and should provide a smooth experience. However, you will sometimes find the frame rate strangely capped at 30fps as soon as it drops below 60fps. At this point you probably start screaming "OMG! Tearing sucks, and v-sync sucks no less!!!" It is not entirely v-sync's fault, though; the real culprit is the combination of double buffering and v-sync.

So why is double buffering the problem? There are two buffers in the graphics hardware: the image you see on the monitor sits in the front buffer, while the next rendered frame goes into the back buffer. With v-sync enabled, the graphics hardware must wait for the next vertical blank period (which occurs every 1/60th of a second on a 60Hz monitor) before it can swap the front and back buffers, so that the swap stays synchronized with the monitor's refresh. This works fine as long as the graphics card can render frames faster than 60fps. Now imagine what happens when the card cannot keep up with 60 frames per second. If the next frame is not ready in the back buffer when the vertical blank arrives, the swap is missed and has to wait for the blank after that. The end result is that instead of swapping buffers 60 times per second, the hardware swaps only 30 times per second, and that is why the frame rate gets capped at 30fps.
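To make the timing concrete, here is a tiny C++ sketch. The render times in it are made-up numbers for illustration, not measurements; it simply works out how many whole refresh periods a frame ends up occupying under v-sync with double buffering.

#include <cmath>
#include <cstdio>

int main() {
    const double refresh_hz = 60.0;
    const double blank_interval_ms = 1000.0 / refresh_hz;   // ~16.7 ms between vertical blanks

    // Hypothetical render times in milliseconds (not measured values).
    const double render_times_ms[] = {10.0, 16.0, 18.0, 25.0, 34.0};

    for (double render_ms : render_times_ms) {
        // With double buffering + v-sync, the swap can only happen at a vertical
        // blank, so each frame occupies a whole number of refresh periods.
        double blanks_occupied = std::ceil(render_ms / blank_interval_ms);
        double displayed_fps   = refresh_hz / blanks_occupied;
        std::printf("render time %5.1f ms -> displayed at %2.0f fps\n",
                    render_ms, displayed_fps);
    }
    return 0;
}

Run it and you will see that a frame taking even slightly longer than 16.7 ms (18 ms, 25 ms) gets displayed at 30fps, which is exactly the cap described above.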

This is where triple buffering comes into play. With triple buffering enabled, there are three buffers, so the graphics hardware can start rendering into the third buffer without having to wait for the front buffer to be swapped out. In short, it helps maintain the frame rate when v-sync is enabled. Both ATi and nVidia provide an option to enable triple buffering in their drivers. Unfortunately, that is only half a solution: the triple buffering option in their drivers only affects OpenGL games. Considering that OpenGL games are vastly outnumbered by D3D games, it is not even half a solution.

How to enable Triple Buffering for D3D games?

Here comes the solution - DirectX Tweaker. You can use this tool to force triple buffering through the DirectX API. It has a nice GUI and is simple to set up. Just download the tool and unzip it. You will need the .NET Framework installed before you can start the application. Upon running DXTweaker, you'll see the screen below:

Type a name for your project, select the path to the game executable, and remember to check the "Active" checkbox. Once you have done that, click on "Modules to load" and you'll see the next screen:

Find the line "Present Changer" and check it, then enter "2" in the "Count" box. It's as simple as that; you can now launch your game by clicking the "Start" button shown in the previous screenshot.
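For reference, the "2" you type into the Count box corresponds to the number of back buffers a Direct3D 9 game would normally request for itself. The sketch below (which needs the DirectX 9 SDK headers to compile) shows roughly what that looks like in code; it is only an illustration, and every value other than BackBufferCount and PresentationInterval is a placeholder of my own, not something DXTweaker actually sets.

#include <d3d9.h>

D3DPRESENT_PARAMETERS MakeTripleBufferedParams(HWND window)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = FALSE;                      // fullscreen, for example
    pp.BackBufferWidth      = 1280;                       // placeholder resolution
    pp.BackBufferHeight     = 1024;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;            // 32-bit colour
    pp.BackBufferCount      = 2;                          // 2 back buffers + front buffer = triple buffering
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow        = window;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;    // v-sync on
    return pp;
}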

To show how much triple buffering affects your frame rate, I have included F.E.A.R. benchmark results below. They speak for themselves.

Triple Buffering OFF

Triple Buffering ON

Test setup
Resolution: 1280x1024
Graphics settings: All max, SoftShadow disabled.
AA/AF: 0xAA / 8xAF

Notice that while the maximum frame rate does not change (it is still capped by v-sync), the percentage of the benchmark running above 40fps has doubled. This shows that triple buffering helps smooth out gameplay and greatly reduces stuttering.

Memory Usage and Triple Buffering

Triple buffering seems like the perfect cure for low frame rates; however, it does not come free. Enabling triple buffering requires 50% more frame buffer space, and under certain conditions it can actually hurt your gaming experience. A simple example illustrates the potential problem. Say we are running a game at 1600x1200. Each pixel needs 32 bits to store its colour, so one buffer takes 1600x1200x32 = 61,440,000 bits, which works out to 7.32MB. Double buffering therefore needs 14.64MB of video memory, and triple buffering needs 21.96MB. Heck, it's only 21.96MB, what's the big deal? Modern video cards have 256MB of RAM onboard. You are right that, so far, memory usage does not look like an issue. However, once we enable 4x FSAA, the numbers inflate very quickly. With 4x FSAA, four samples are stored for every pixel, so you multiply 21.96MB by 4 and end up with a whopping 87.84MB! That is more than a third of your total video RAM. If a game also needs 200MB to store textures, light maps, bump maps, normal maps and so on, you are going to have a very bad time, with lots of pauses caused by hard drive access. This is just something to watch out for: if you find a game hitting the hard drive too often, it can be a sign that triple buffering is eating up too much of your video memory.
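If you want to redo the arithmetic for your own setup, the small C++ snippet below reproduces the numbers above (give or take rounding). The resolution and FSAA factor are just the values used in this example; change them to match your game.

#include <cstdio>

int main() {
    const double width  = 1600;
    const double height = 1200;
    const double bytes_per_pixel = 4;              // 32-bit colour
    const double mb = 1024.0 * 1024.0;
    const double fsaa_samples = 4;                 // 4x FSAA stores 4 samples per pixel

    double one_buffer_mb = width * height * bytes_per_pixel / mb;   // ~7.32 MB

    std::printf("single buffer            : %6.2f MB\n", one_buffer_mb);
    std::printf("double buffering         : %6.2f MB\n", 2 * one_buffer_mb);
    std::printf("triple buffering         : %6.2f MB\n", 3 * one_buffer_mb);
    std::printf("triple buffering, 4xFSAA : %6.2f MB\n", 3 * one_buffer_mb * fsaa_samples);
    return 0;
}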

Conclusion

If you cannot tolerate tearing and often find that enabling v-sync only brings low frame rates and stuttering, triple buffering may be the right solution for you.
