This section will be shorter than the others because this topic has already been covered in depth by others. Here are some links that go deeply into it:
The gist of it
As we have seen in the game engine section, a video signal consists of sending pixels one at a time, from top-left to bottom-right. It is the duty of the display to present those pixels to the user.
A screen can be abstracted like this: it receives a pixel, the pixel is translated to electrical characteristics for the target display technology (for example red: 0.2V), then it transmits this data to the physical pixel, which may also take time to reach a stable state.
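The two-stage abstraction above (pixel value to electrical drive level, then a settling time for the physical pixel) can be sketched in a few lines. The function names, the 0.7V full scale, and the response times are illustrative assumptions, not specifications of any real panel:

```python
# Sketch of the display abstraction: a pixel value is converted to an
# electrical drive level, then the physical pixel needs its response time
# to reach a stable state. All numbers are illustrative.

def drive_voltage(red_0_to_1, full_scale_volts=0.7):
    """Translate a normalized red value to a drive voltage (e.g. ~0.2V)."""
    return red_0_to_1 * full_scale_volts

def settled_after(response_time_ms, elapsed_ms):
    """The physical pixel is only stable once its response time has passed."""
    return elapsed_ms >= response_time_ms

voltage = drive_voltage(0.29)          # ~0.2V, matching the example above
lcd_stable = settled_after(5.0, 2.0)   # a 5 ms LCD pixel is not stable yet
oled_stable = settled_after(0.1, 2.0)  # a ~100 µs OLED pixel already is
```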
If we look at some example technologies:
CRT: the internal state is computed using analog components. The pixel transmission is done via an amplifier for each cathode ray. Because everything is analog, no lag is introduced.
LCD: the internal state is computed by a digital device that determines the best state change. Additional logic may be applied at this step to allow overdrive: using a higher value than expected so the pixel reaches its target faster, at the risk of overshooting and producing visual artifacts. The pixel transmission is done by the LCD crystals themselves, which can take several milliseconds to reach the desired state.
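The overdrive idea can be sketched as driving the pixel past its target in proportion to the distance it has to travel. The gain value and the clamping range below are assumptions for illustration; real panels use per-transition lookup tables tuned by the manufacturer:

```python
def overdrive(current, target, gain=1.25):
    """Drive the pixel past its target so the slow LCD crystals move faster.
    Too much gain overshoots and causes visible artifacts (inverse ghosting)."""
    driven = current + gain * (target - current)
    return max(0, min(255, driven))  # clamp to the panel's valid range

# Rising from 64 to 192: drive harder than 192 so the crystals get there sooner.
overdrive(64, 192)   # → 224.0, i.e. overshooting the target of 192
```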
OLED: the internal state is computed in the same fashion as the LCD, but the pixel transmission is faster because the response time is ~100µs.
Variable Refresh Rate
Variable Refresh Rate technologies (e.g. FreeSync) are an alternative that can transmit the frame faster, allowing the controller to send the video at an irregular pace. The main advantage is that the display is better synced to the game engine, allowing better latencies.
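The latency benefit can be shown with a small timing sketch: on a fixed-refresh display a finished frame waits for the next scheduled refresh, while with VRR the display refreshes as soon as the frame arrives. This is an idealized model (it ignores the panel's minimum/maximum VRR range); the numbers are illustrative:

```python
# Why VRR lowers latency: with a fixed refresh, a finished frame sits in the
# buffer until the next scheduled refresh; with VRR the refresh starts as
# soon as the frame is ready (idealized model).

def fixed_refresh_wait(frame_ready_ms, refresh_period_ms=1000 / 60):
    """Time the frame waits for the next fixed refresh tick."""
    return -frame_ready_ms % refresh_period_ms

def vrr_wait(frame_ready_ms):
    """With VRR the display refreshes immediately (idealized: zero wait)."""
    return 0.0

# A frame finishing 3 ms into a 16.7 ms interval waits ~13.7 ms on a fixed
# 60 Hz display, and ~0 ms with VRR.
```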
For detailed results, consult the guides by blurbusters.
Smart TVs have been notorious for introducing major lag issues. The main concern is that their display lag chain is a bit different: a Smart TV uses something akin to a capture device that reads the frame buffer over HDMI, then it may apply filters and compose the frame within the Smart TV operating system and its UI. Depending on how well the TV is built, this costly process may introduce several frames of lag.
That is why most TVs nowadays support a "gaming mode" that bypasses most of this processing to get a faster result.
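The "several frames of lag" claim is simple arithmetic: at 60 Hz, every ~16.7 ms spent in the TV's processing pipeline is one frame of lag. The stage names and timings below are made-up illustrative values, not measurements of any real TV:

```python
# Rough arithmetic for the Smart TV pipeline: each processing stage adds
# time, and at 60 Hz every ~16.7 ms of processing is one frame of lag.
# Stage timings are illustrative, not real measurements.

FRAME_MS = 1000 / 60
stages_ms = {"capture": 10, "filters": 25, "ui_compose": 15}

total_ms = sum(stages_ms.values())   # 50 ms of processing
frames_of_lag = total_ms / FRAME_MS  # ~3 frames at 60 Hz
```

A "gaming mode" that skips the filtering and UI composition stages would, in this model, cut the lag from ~3 frames down to well under one.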
To measure display input lag you must have a video signal that you trust to be extremely stable. Then you need either a high speed camera or a dedicated device.
There are three devices to my knowledge that are able to compute input lag.
The main advantage is that you will get a fixed value showing you precisely what the display lag is. They are great for showcasing screens that have major lag problems (+1 or 2 frames).
- They only support a specific set of resolutions at 60Hz, so you won't be able to measure the impact of 240Hz monitors.
- Similarly, it is impossible to take measurements in a VRR context.
- The Time Sleuth and Leo Bodnar lag testers use a metric that measures vsync to pixel displayed, while the OSSC measures pixel sent to pixel displayed. This introduces confusion because vsync-to-pixel includes the data transmission lag (which is to be expected, even on CRTs).
- They all measure only the transition from black to white, and the detection threshold may differ across devices.
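The gap between the two metrics can be quantified: "vsync to pixel displayed" includes the time the signal takes to scan out down to the sensor's position on the screen, while "pixel sent to pixel displayed" does not. A sketch with assumed 60 Hz numbers (vertical blanking ignored, and the 2 ms panel lag is hypothetical):

```python
# Why the two metrics disagree: vsync-to-pixel includes the scanout time
# down to the sensor's position; pixel-sent-to-pixel does not.
# 60 Hz, 1080 lines, vertical blanking ignored for simplicity.

FRAME_MS = 1000 / 60

def scanout_delay_ms(sensor_line, total_lines=1080):
    """Transmission delay for a sensor placed at a given scanline."""
    return FRAME_MS * sensor_line / total_lines

panel_lag_ms = 2.0  # hypothetical pixel-sent-to-pixel lag of the display
vsync_metric = scanout_delay_ms(540) + panel_lag_ms  # sensor at mid-screen
```

With the sensor at mid-screen, the vsync-based reading is ~8.3 ms higher than the pixel-to-pixel one; even a zero-lag CRT would still show that scanout term.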
High speed camera
Blurbusters made a great article on how to properly measure using a high speed camera. This avoids all the issues listed above for dedicated devices, but it requires expensive hardware and more time to analyse the results.