One of the most basic graphics options of modern games is the V-Sync toggle. Almost every PC game released over the past decade has it. Most people know that v-sync locks the in-game frame rate to your monitor’s refresh rate, but how exactly does it work? And more importantly, should you turn it on or keep it off in fast-paced shooters like Fortnite and PUBG? Let’s take a look:
What does V-Sync do?
Your GPU renders frames as fast as it can, barring any bottlenecks. A well-fed high-end GPU can easily spit out hundreds of frames per second at 1080p in mainstream PC titles. Whenever your GPU draws a frame, it renders it into what’s called a framebuffer: a portion of VRAM that stores the final image to be shown on the display. When Vertical Sync is turned off, your GPU renders to the framebuffer as fast as it can, while your monitor draws frames onscreen as fast as it can.
There’s a big capability gap here. Most monitors on the market are 60 Hz panels, meaning they refresh the image on-screen 60 times per second. Workloads in games are almost never constant, so your GPU will almost always send more (or less) than 60 frames per second to the framebuffer. With a single framebuffer and no synchronization, you’ll encounter what’s called “screen tear.”
This looks like a horizontal line that moves across the display. It arises when the next frame is written to the buffer before the monitor finishes its refresh, so parts of two frames are displayed simultaneously. If you dip below 60 FPS, you’ll also have to deal with stuttering: there will be a momentary pause while the GPU finishes drawing the next frame. With just a single framebuffer, then, you will encounter screen tearing, stuttering, or both. This is where V-Sync comes into the picture.
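A toy model makes the tear line concrete. This is an illustrative sketch, not how a real display controller works: the “monitor” scans rows top to bottom over one ~16.7 ms refresh, and the GPU overwrites the single buffer 10 ms in.

```python
# Toy model of tearing with a single framebuffer (illustrative, not real
# driver or display-controller code). The monitor scans out rows top to
# bottom; if the GPU overwrites the buffer mid-scanout, every row below
# that point comes from the newer frame.

ROWS = 1080
refresh_interval_ms = 1000 / 60     # one full scanout takes ~16.7 ms
frame_ready_at_ms = 10.0            # GPU finishes the next frame 10 ms in

# The row being scanned out at the moment of the overwrite is the tear line:
tear_row = round(ROWS * frame_ready_at_ms / refresh_interval_ms)

scanout = ["old_frame" if row < tear_row else "new_frame" for row in range(ROWS)]
print(f"Tear line at row {tear_row}: frames mix above and below it")
```

Move `frame_ready_at_ms` around and the tear line moves with it, which is why the tear appears to crawl up or down the screen when frame rate and refresh rate drift past each other.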
Why Does Screen Tearing Take Place?
Say your monitor has a refresh rate of 60 Hz and you’re playing a game at 100 FPS. What does this entail? Your monitor can update itself (and draw the next frame) 60 times per second, but your GPU is sending it 100 frames per second, roughly 66% more than the monitor can display.

Since the monitor can’t display all 100 frames, refreshes and frames drift out of step: a new frame arrives every 10 ms, while a full refresh takes about 16.7 ms. The panel therefore scans out roughly the first 60% of the screen from frame x and the remaining 40% from frame x+1, so you see parts of two different frames at once. This continues as long as the frame rate is stuck at 100 FPS.
Before we move on to V-Sync, you should know what Double Buffering is. It is a technique used to reduce screen tearing by leveraging two buffers instead of one: a back buffer and a front buffer. Both reside in VRAM; the front buffer holds the frame currently being scanned out to the monitor, while the GPU draws the next frame into the back buffer.
As soon as the monitor is done with a frame, the back buffer and the front buffer are swapped. They are simply renamed; nothing is copied as such. The former back buffer becomes the front buffer and vice versa, and the monitor then refreshes and displays this new frame.
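In code terms, the swap is just an exchange of references; a minimal sketch:

```python
# Illustrative sketch: a double-buffering "swap" exchanges references only.
# Nothing resembling an ~8 MB 1080p frame copy takes place.
front = bytearray(1920 * 1080 * 4)   # stand-in for the front buffer
back = bytearray(1920 * 1080 * 4)    # stand-in for the back buffer
front_id, back_id = id(front), id(back)

front, back = back, front            # the "swap": an O(1) renaming

# The same memory simply plays a new role after the swap:
assert id(front) == back_id and id(back) == front_id
print("Swap exchanged references; no pixels were copied")
```

This is why the swap itself is essentially free; the cost of V-Sync comes from *when* the swap is allowed to happen, not from the swap.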
With this kind of double buffering, the swap (renaming) can happen at any time, even while the front buffer is being scanned out to the monitor panel. When that happens, part of the screen consists of pixels from the earlier frame while the rest comes from the newer frame. This is what essentially causes tearing.
The most common approach to combat tearing is to wait to swap buffers until the monitor is done displaying the present frame and is ready to receive the next. Synchronizing buffer swaps with the vertical refresh is called V-Sync. When the frame rate is higher than the refresh rate, this works well, although it does add some input lag.
This becomes a major problem if your frame rate drops below the refresh rate (here 60 Hz). Say you’re getting only 45 FPS instead of 60, 25% below the refresh rate. Now, every time the monitor refreshes, only about 75% of the next frame will have been rendered (1). As a result, the same frame is displayed again.
After that, the GPU completes the frame in the back buffer, but the front buffer still holds a frame (2), and the swap can’t happen until the monitor refreshes. So the GPU sits idle, waits for the refresh, and then the buffers are swapped (3). The second frame is finally displayed (4), and the whole process repeats.
As you can see, one new frame is displayed every two refresh cycles, effectively halving the rendering rate and locking the resultant FPS to 30. This is the second drawback of double buffering (besides the input lag): if the frame rate dips anywhere below 60, you get 30 FPS with V-Sync turned on (and below 30, you drop to 20, and so on).
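The lock shows up in a minimal event simulation; the function below is an illustrative sketch of double-buffered V-Sync (the name and the model are mine, not actual driver logic):

```python
# Sketch of double-buffered v-sync: the GPU can only start a new frame after
# a swap frees the back buffer, and swaps happen only on a refresh tick.
def vsync_double_buffer_fps(render_ms, refresh_hz=60, seconds=1.0):
    refresh_ms = 1000 / refresh_hz
    frame_done_at = render_ms        # when the frame being drawn will finish
    displayed = 0
    for i in range(1, int(seconds * refresh_hz) + 1):
        tick = i * refresh_ms        # a vertical refresh
        if frame_done_at <= tick:    # frame ready: swap and display it
            displayed += 1
            frame_done_at = tick + render_ms  # GPU starts the next frame now
        # else: the old frame is shown again this cycle
    return displayed

print(vsync_double_buffer_fps(render_ms=22.2))  # ~45 FPS GPU -> 30 displayed
print(vsync_double_buffer_fps(render_ms=10.0))  # 100 FPS GPU -> capped at 60
```

A 22.2 ms frame time (about 45 FPS) collapses to 30 displayed frames, and a 40 ms frame time would collapse to 20: the displayed rate snaps to the nearest integer divisor of the refresh rate.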
Triple Buffering: Third Time’s a Charm
This is where Triple Buffering comes in. As the name suggests, it adds a third buffer. This extra buffer lets the GPU keep rendering while the front buffer re-displays the same frame, so the pipeline isn’t stalled. With double buffering (and V-Sync on), the GPU has to wait for the refresh-synced swap before it can start rendering the next frame.
In triple buffering, swaps to the front buffer are still locked to the refresh rate to avoid tearing, but the GPU alternates between the two back buffers, so it always has a free one to draw the next frame into. Once every refresh cycle, the front buffer takes the most recently completed frame from the back buffers.
Let’s again say your GPU is rendering 45 FPS on a 60 Hz monitor: a refresh takes about 16.7 ms, while a frame takes about 22.2 ms to render. Frame x starts in the front buffer while the GPU draws frame x+1 in one of the back buffers. At the first refresh, x+1 is only about 75% done, so the monitor re-displays frame x; the GPU finishes x+1 shortly after and immediately starts on x+2 in the other back buffer instead of sitting idle.

At the second refresh, x+1 is complete and gets swapped to the front buffer and displayed, while x+2 is about half done. By the third refresh, x+2 is finished and displayed in turn, and the GPU has already moved on to x+3.

The pattern works out to three new frames every four refresh cycles, retaining 45 FPS instead of locking the display to 30.
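The walkthrough can be simulated with a small sketch (an illustrative model, not actual driver logic); because the GPU is never blocked, the display simply picks up whatever frame finished most recently on each refresh:

```python
# Sketch of triple-buffered v-sync: the GPU renders continuously (a back
# buffer is always free), and on each refresh the display swaps in the most
# recently *completed* frame, re-showing the old one if nothing new is ready.
def triple_buffer_fps(render_ms, refresh_hz=60, seconds=1.0):
    refresh_ms = 1000 / refresh_hz
    shown = 0                # index of the frame currently on screen
    new_frames = 0
    for i in range(1, int(seconds * refresh_hz) + 1):
        tick = i * refresh_ms
        newest_complete = int(tick // render_ms)  # frames finished so far
        if newest_complete > shown:   # a newer frame is ready: swap it in
            shown = newest_complete
            new_frames += 1
        # else: the same frame is re-displayed this cycle
    return new_frames

print(triple_buffer_fps(render_ms=22.2))  # ~45 FPS GPU -> 45 displayed
print(triple_buffer_fps(render_ms=10.0))  # 100 FPS GPU -> capped at 60
```

Compare the 22.2 ms case with double buffering: instead of halving to 30 FPS, the display keeps pace with the GPU at roughly 45 new frames per second.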
Triple buffering uses more VRAM than double buffering due to the additional buffer, but that’s hardly a problem for modern graphics cards. One caveat: the frame rates reported when triple buffering are often misleading. Most OSD tools like Afterburner and Precision X1 count how many times the front buffer is replaced each second, not the number of frames rendered by the GPU per second. To get the exact figure, use NVIDIA FrameView, which interfaces directly with the API to calculate the frame rate.
What is Adaptive Sync?
This is an option you’ll see in the Nvidia control panel. It’s not globally supported on AMD parts, although games that implement it will let you use it. Adaptive sync essentially keeps V-Sync on while your frame rate is at or above the refresh rate, and disables it when the frame rate dips below. This means you’ll get some screen tearing when the frame rate drops, but you won’t incur the stutter and halved-FPS penalties of leaving double-buffered V-Sync on.
What is Variable Refresh Rate?
Variable refresh rate monitors address the key limitation of V-Sync: having to fit a variable frame rate to a fixed monitor refresh rate. Variable refresh monitors feature display controllers that can vary the refresh rate based on the rate at which the GPU renders frames to the framebuffer.
So if your GPU’s framerate varies between, say 50, 51, and 57 frames per second (depending on the workload), your monitor will vary its refresh rate to ensure that frames are always displayed on time. You don’t have to deal with screen tear or the performance and latency implications of V-sync.
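As a sketch (the range limits below are an assumption for illustration; real panels advertise their own minimum and maximum), the panel simply stretches or shortens each refresh to match the frame time, clamped to what it can physically sustain:

```python
# Sketch of variable refresh rate: the panel holds off its refresh until the
# next frame is ready, so each frame is shown exactly once, tear-free.
# The 48-144 Hz range here is a hypothetical example.
def vrr_refresh_interval(frame_time_ms, min_hz=48, max_hz=144):
    shortest = 1000 / max_hz   # fastest the panel can refresh (~6.9 ms)
    longest = 1000 / min_hz    # slowest before it must refresh anyway (~20.8 ms)
    return min(max(frame_time_ms, shortest), longest)

for fps in (50, 51, 57):       # the varying frame rates from the example above
    ft = 1000 / fps
    print(f"{fps} FPS -> frame time {ft:.1f} ms -> refresh every "
          f"{vrr_refresh_interval(ft):.1f} ms")
```

As long as the frame time stays inside the panel’s supported window, refresh and render are always in lockstep, which is exactly the property fixed-refresh V-Sync can’t deliver.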
What are FreeSync and G-Sync?
To be honest, they both do the same thing, regardless of what AMD or Nvidia might tell you. FreeSync and G-Sync are just the company-specific terms the two GPU manufacturers use for variable refresh rate support on their graphics cards. Different monitors support FreeSync, G-Sync, or both. Traditionally, only AMD graphics cards could use FreeSync, while only Nvidia graphics cards could use G-Sync. The primary difference between the two lies in the implementation: classic G-Sync requires a proprietary hardware module inside the monitor, while FreeSync builds on VESA’s royalty-free Adaptive-Sync standard. Either way, the monitor’s refresh rate varies over a supported range to match the frame rate without tearing.
Read more here:
AMD FreeSync vs NVIDIA G-Sync Comparison: Which One is Better?
Nvidia’s newer G-Sync Compatible program is basically FreeSync certification, and it lets FreeSync monitor owners use variable refresh with Nvidia cards without paying for the traditionally pricier module-based G-Sync monitors.
What are Enhanced Sync and Fast Sync?
These are two proprietary terms for a software feature that does essentially the same thing on Nvidia (Fast Sync) and AMD (Enhanced Sync) cards. Fast Sync/Enhanced Sync is similar to triple buffering in that there are three framebuffers. However, instead of being used sequentially, the buffer holding the oldest frame is always overwritten.
By constantly rendering frames as if V-Sync were off, then grabbing only the most recent completed frame and discarding the rest, Fast Sync prevents tearing without most of the latency that V-Sync normally adds.
With Fast Sync, input lag will be higher than with V-Sync off, but lower than with V-Sync on. It’s ideal for games like CS:GO that run at a very high frame rate. In the end, Fast Sync is primarily about input lag and doesn’t guarantee smoothness, especially if the frame rate isn’t much higher than the refresh rate: you’re dropping frames, and the time between the frames that do get displayed can vary, which at times results in micro-stutters. As such, Fast Sync works best when the frame rate is at least twice the refresh rate.
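A minimal sketch of the idea (illustrative, not the actual driver logic): the GPU runs unthrottled, and each refresh simply grabs the newest finished frame, discarding the ones rendered in between:

```python
# Sketch of Fast/Enhanced Sync: render flat out as if v-sync were off, but on
# each refresh display only the most recently completed frame.
def fast_sync_display(render_ms, refresh_hz=60, seconds=1.0):
    refresh_ms = 1000 / refresh_hz
    shown = []
    for i in range(1, int(seconds * refresh_hz) + 1):
        tick = i * refresh_ms
        newest_complete = int(tick // render_ms)  # frames finished so far
        shown.append(newest_complete)             # older frames are discarded
    return shown

frames = fast_sync_display(render_ms=5.0)  # 200 FPS on a 60 Hz panel
print(f"Displayed {len(set(frames))} distinct frames; "
      f"discarded {frames[-1] - len(set(frames))}")
```

At 200 FPS every refresh gets a genuinely new frame and most rendered frames are thrown away, which is why the technique shines when the frame rate is a large multiple of the refresh rate and gets stuttery when it isn’t.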
If you have a regular monitor and aren’t too concerned with input latency, it’s a good idea to just enable Adaptive V-Sync globally if you’re an Nvidia user. Enhanced Sync and Fast Sync sound good in theory, but unless your GPU is rendering frames at double or triple your monitor’s refresh rate, they can cause unpleasant micro-stuttering.
Avoid double-buffered V-Sync when possible, and disable in-game V-Sync as well if you’re using the global setting. AMD users don’t get Adaptive V-Sync out of the box; enable it in games that ship with the feature, or use RadeonPro’s Dynamic V-Sync, which does essentially the same thing. If you’re looking to buy a new monitor, it’s a great idea to get hold of a FreeSync panel: G-Sync is almost always more expensive and, thanks to the G-Sync Compatible program, many FreeSync monitors will work with Nvidia GPUs. As always, though, do your research first.
The post What is V-Sync? Is it Better to Turn it on or off? appeared first on Hardware Times.
Title: What is V-Sync? Is it Better to Turn it on or off?
Sourced From: www.hardwaretimes.com/what-is-v-sync-is-it-better-to-turn-it-on-or-off/
Published Date: Sun, 31 Jan 2021 12:46:13 +0000