Geek 101: What Is V-Sync?

Curious about whether or not you need vertical sync? We can help.

If you’ve played a PC game in the past decade, you’ve probably found a mysterious "V-Sync" option while fooling around with your graphics card settings. Enabling V-Sync can make fast-paced action games like Portal 2 look smoother, but run slower. If you're lucky, you'll have the option to switch between multiple forms of vertical synchronization, like double or triple buffering, but what's the best choice for your needs? Is vertical synchronization even necessary if you own an LCD display?

To answer these questions and more we did a bit of research, did away with the jargon and created a brief guide to what V-Sync means, how it works and how you can use it to get the most out of your machine.

What Is V-Sync?

It's short for vertical synchronization, an optional setting on your graphics card that throttles the frames being drawn to match the number of times your monitor refreshes itself every second. If you have a 60Hz monitor (i.e. one that refreshes 60 times a second), V-Sync caps the framerate of the game you're playing or app you're using at 60 frames per second. This GPU feature became necessary back when everyone played games on big CRT monitors, which refreshed themselves by sweeping an electron beam back and forth across the interior of the screen at regular intervals to redraw the entire image.
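In spirit, that throttling works like a frame limiter: after drawing each frame, the system waits for the next refresh boundary before starting the next one. Here's a minimal Python sketch of the idea; the refresh rate, the `render_frame` stub, and the sleep-based timing are all illustrative assumptions, not how a real GPU driver actually implements V-Sync.

```python
import time

REFRESH_HZ = 60                  # assumed monitor refresh rate
FRAME_TIME = 1.0 / REFRESH_HZ    # ~16.7 ms budget per frame

def render_frame():
    """Stand-in for the GPU drawing one frame (does nothing here)."""
    pass

def vsync_loop(num_frames=10):
    """Render frames, then sleep until the next refresh boundary so we
    never outpace the monitor -- a crude model of V-Sync throttling."""
    start = time.perf_counter()
    for i in range(num_frames):
        render_frame()
        # Wait for the next vertical refresh before drawing again.
        next_refresh = start + (i + 1) * FRAME_TIME
        delay = next_refresh - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
    elapsed = time.perf_counter() - start
    return num_frames / elapsed   # effective frames per second

fps = vsync_loop()
```

However fast `render_frame` runs, the loop can never deliver more frames per second than the refresh rate allows, which is exactly the cap V-Sync imposes.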

In an ideal world the frames per second generated by your graphics card would sync up perfectly with the refresh rate of your monitor, ensuring that every time the GPU writes a frame into video memory the monitor is ready and waiting to pluck that image data out of memory and draw that frame on-screen. The problem comes when your GPU starts spitting frames into video memory faster than your monitor can retrieve them, causing graphical distortion as the images start to overwrite one another.

Why Should You Use It?

You should enable V-Sync if you notice a lot of graphical distortion caused by movement during action sequences when playing games or watching movies on your PC. When your graphics card renders individual display frames faster than your monitor can refresh itself, the extra frames end up partially overwriting previous frames to create odd graphical glitches like fractured lines or objects that look as though they've been sliced in half. These distortions are colloquially known as "screen tearing", and enabling V-Sync eliminates them by keeping your graphics card from sending frames to the monitor before the monitor is ready to display them, ensuring smooth performance.

The phrase "vertical synchronization" is an antiquated reference to CRT monitors, which were designed to refresh themselves vertically at regular intervals; modern LCD monitors don't redraw the screen with a scanning electron beam, but instead carry a response time rating (5 milliseconds, for example) that denotes how long it takes a single pixel to change from black to white. Of course, your LCD monitor still needs to fetch new frames from your graphics card at regular intervals, and is thus still vulnerable to distortion when the GPU renders frames faster or slower than the display can show them. When we speak of LCD refresh rates, we're really talking about how often the display polls the graphics card for a fresh image.

For example, let's say your spare 24-inch LCD monitor has a 60Hz refresh rate, but your GeForce GTX 560 Ti graphics card consistently spits out 90 frames per second when you're playing Team Fortress 2. That means that every second your graphics card produces 90 new images while your monitor only updates itself 60 times, creating a serious sync problem.
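The mismatch in that example is easy to quantify with a little arithmetic (the figures below come straight from the scenario above):

```python
gpu_fps = 90       # frames the graphics card produces per second
refresh_hz = 60    # times the monitor updates itself per second

# On average, 1.5 new frames arrive during every single refresh interval,
# so a newer frame routinely overwrites the buffer while the monitor is
# mid-draw -- the visible seam is the tear line.
frames_per_refresh = gpu_fps / refresh_hz

# 30 of the 90 frames each second can never be shown in full.
excess_frames_per_second = gpu_fps - refresh_hz
```

Any time `frames_per_refresh` exceeds 1.0, the display has more image data than it can show cleanly, and tearing becomes likely.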

Why Shouldn't You Use It?

Depending on which form of V-Sync you use, enabling it can have a deleterious effect on your PC's performance. There are two popular V-Sync algorithms: straight frame buffering and ping-pong buffering (also known as page flipping).

The simplest and most common way to solve GPU/monitor sync issues is to create a double (and sometimes triple) frame buffer in system memory where extra frames are stored and fed to the monitor as needed. This buffer ensures a much smoother and more appealing image, but it can cause problems in games that demand quick responses to onscreen events, because the GPU has already rendered and buffered two or three frames beyond what you see onscreen at any given moment. That means there is a minuscule delay (measured in milliseconds) between when you perform an action and when it actually appears onscreen. Most users will never notice such slight input lag, but hardcore competitive gamers may want to disable frame buffer V-Sync and put up with a few funky graphical effects in exchange for maximum responsiveness.
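The input lag from buffering is simple to estimate: each frame queued ahead of the one on screen adds roughly one refresh interval of delay. The sketch below uses that simplified model (it ignores driver and display overhead, which vary in practice):

```python
REFRESH_HZ = 60
FRAME_TIME_MS = 1000 / REFRESH_HZ   # ~16.7 ms per refresh at 60Hz

def buffered_latency_ms(buffered_frames):
    """Estimated input lag added by frames already rendered and queued
    ahead of the one currently on screen (simplified model: one refresh
    interval of delay per buffered frame)."""
    return buffered_frames * FRAME_TIME_MS

double_buffer_lag = buffered_latency_ms(2)   # two frames queued: ~33 ms
triple_buffer_lag = buffered_latency_ms(3)   # three frames queued: ~50 ms
```

A few tens of milliseconds is below most people's perception threshold, which is why casual players rarely notice buffered V-Sync while twitch-reflex competitors sometimes do.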

Ping-pong buffering doesn't suffer the same input lag. Rather than a straight frame buffer that simply backs up excess frames and feeds them to the monitor one at a time, this method of vertical synchronization renders frames into two areas of video memory and flips between them every time your monitor requests a new frame from your graphics card. This kind of "page flipping" eliminates the delay of copying a frame from system memory into video memory, which means less input lag and thus less impact on your actions per minute while playing a game like Starcraft II.
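The trick behind page flipping is that the "flip" exchanges references to two buffers rather than copying any pixel data. Here's a toy Python model of the idea; the `PageFlipper` class, its tiny buffers, and the `frame_id` tagging are hypothetical illustrations, not a real graphics API.

```python
class PageFlipper:
    """Toy model of ping-pong buffering: the monitor scans out the front
    buffer while the GPU draws into the back buffer; at each vertical
    refresh the two swap roles by exchanging references -- no copy."""

    def __init__(self):
        self.front = bytearray(4)   # buffer currently shown on screen
        self.back = bytearray(4)    # buffer currently being drawn into

    def draw(self, frame_id):
        # Pretend-render: tag the back buffer with the frame's number.
        self.back[0] = frame_id

    def flip(self):
        # Swap references at the refresh boundary; pixel data stays put.
        self.front, self.back = self.back, self.front

flipper = PageFlipper()
flipper.draw(1)    # GPU renders frame 1 into the back buffer
flipper.flip()     # monitor refresh: front and back trade places
```

After the flip, the monitor scans out the frame just drawn while the GPU immediately reuses the old front buffer for the next frame, which is why no time is lost shuffling frames between memory pools.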

Of course you might want to keep your GPU settings at minimum to maximize your competitive edge in multiplayer games, but that's a story for another day. For now, you should probably keep V-Sync enabled unless you notice a significant performance boost from turning it off. Screen tearing is no joke, and software improvements like page flipping ensure the negative aspects of V-Sync (input lag, poor framerate) are almost nonexistent.

To comment on this article and other PCWorld content, visit our Facebook page or our Twitter feed.