Istrebitel

If using VSync, when exactly is the frame displayed on the screen?

Greetings.

I'm writing an application that displays an image (a text string) on screen for a limited period of time (e.g. 33 ms) and then waits for the user to react by pressing any mouse button. I'm using VSync to ensure that exactly X frames with the image are displayed. So I wonder: when exactly does the image get onto the monitor?

I understand that people have reaction times (that's the point of my application - to measure them) and that LCD monitors have pixel transition times. But if we disregard the LCD transition time (assume it's 0.0001 ms), what is the time interval between calling SwapBuffers() and the picture appearing on the monitor?

From what I understand, VSync works by delaying every SwapBuffers() until right after a frame has been sent to the monitor. So I assume the interval between the moment SwapBuffers() returns and the moment the frame gets sent to the monitor should equal the time between monitor updates, i.e. 1 / refresh_rate. Am I correct?
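
Roughly, what I have in mind to verify this is the sketch below (Win32 + OpenGL; hdc and the function name are just placeholders, and I'm assuming a window with a current GL context and vsync already enabled): time the gap between consecutive SwapBuffers() returns, which should then hover around 1 / refresh_rate (about 16.7 ms at 60 Hz).

#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

void MeasureSwapInterval(HDC hdc, int frames)
{
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

    for (int i = 0; i < frames; ++i)
    {
        glClear(GL_COLOR_BUFFER_BIT);   // the stimulus would be drawn here
        SwapBuffers(hdc);               // with vsync on, the swap is tied to the vertical blank
        glFinish();                     // crude sync point; exact behaviour around the swap is driver-dependent

        QueryPerformanceCounter(&now);
        double ms = 1000.0 * (now.QuadPart - prev.QuadPart) / (double)freq.QuadPart;
        printf("frame %d: %.3f ms since previous swap\n", i, ms);
        prev = now;
    }
}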


Comments

the Fiddler

The only way to answer this is to record your monitor with a high-speed camera. That will measure the exact delay between triggering an update and seeing the result on the monitor.

Sources of latency include:

  • Buffering in the GPU drivers ("prerender frames" on Nvidia, triple-buffering on ATI, no idea about Intel)
  • Buffering in the system compositor
  • Input latency in the monitor electronics (depends on the monitor, but it can go as high as 3 or 4 frames!)
  • Pixel transition latency
  • Electron beam latency (on a CRT)

Also note that there is another indeterminate amount of latency between the user clicking a mouse button and the OS delivering the event to the application.

If you are planning to do any sort of scientific measurements (in the broad sense of "scientific"), I would strongly advise you to measure the whole chain (mouse click -> result on screen) with a high-fps camera.

If you need to minimize latency, a CRT or TFT monitor capable of 120 Hz would also help.

I don't know this for certain, but there may be WGL or GL extensions to control the buffering done by the driver. That might be worth a search.
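
For example, if the driver exposes WGL_EXT_swap_control (an assumption - check the WGL extension string before relying on it), the swap interval can be forced like this:

#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Try to force "one swap per vertical blank" through WGL_EXT_swap_control.
// Returns false if the extension is missing (requires a current GL context).
bool EnableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;
    return wglSwapIntervalEXT(1) != FALSE;  // 1 = wait for one vertical blank per swap
}

Note that this only controls the swap interval, not how many frames the driver queues ahead; a crude but common workaround for the queueing is to call glFinish() immediately after SwapBuffers().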

Istrebitel

Thanks, I know about those sources of latency. They are mostly things I cannot influence (end users of my app will have whatever monitors they already have installed, with their delays and latencies, and so on). I also considered a high-fps camera, but obviously they (my employers) wouldn't buy one just for testing.

What I wanted to ask is more about when the frame is output relative to specific points in the code. My goal is to eliminate as much of the latency as possible, either by moving the point in the code from which I start counting the time, or by subtracting from the measured time the amount that will always pass between the point where I start counting and the frame appearing on the screen.
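
Something like the sketch below is what I mean by moving the timing point (a rough illustration only; DrawStimulus() and hdc stand in for my actual drawing code and device context): start the reaction clock right after SwapBuffers() plus glFinish() return, so whatever the driver has queued ahead is excluded and only scan-out and the monitor's own latency remain.

#include <windows.h>
#include <GL/gl.h>

void DrawStimulus();  // placeholder for the actual drawing code

// Show the stimulus and return the timestamp from which reaction time is counted.
LARGE_INTEGER ShowStimulusAndStartTimer(HDC hdc)
{
    DrawStimulus();
    SwapBuffers(hdc);   // with vsync, the swap is tied to a vertical blank
    glFinish();         // crude sync: wait for the GPU before reading the clock

    LARGE_INTEGER t0;
    QueryPerformanceCounter(&t0);   // start counting here; what remains is
                                    // scan-out plus the monitor's own latency
    return t0;
}

Would that be a reasonable approach, or is glFinish() after the swap not guaranteed to wait for the actual presentation?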