I am a huge believer in latency as an important part of UI feel. I am also a believer in optimizing against what you can measure. In this regard I see a hole in the open-source ecosystem: we have no structured way to measure GUI lag.

Lag

When I say lag I mean the time between you pressing a physical button on your keyboard or mouse and a change becoming visible on screen. I worry that this measurement in particular is suffering on modern systems built with modern programming techniques, and without a cheap and easy way to measure the effect we can't even discuss it.

For that reason, I was thrilled when NVIDIA announced the Reviewer Toolkit for Graphics Performance, because it included a tool (the LDAT) for measuring total system lag. From the pictures and videos it looked to be a pretty simple circuit inside a 3D-printed enclosure. It wouldn't be hard for NVIDIA to open the design up to the community for the benefit of everybody. Instead, NVIDIA productized it in the form of a proprietary monitor for GAMERS.

On the positive side, that means I get to make it myself!

FrameTime

This is where I introduce my holiday project: FrameTime.

FrameTime is exactly the tool I think I need to measure end-to-end latency on modern graphical systems. It emulates a keyboard to send input, then uses a photodiode to watch the monitor for a change in brightness. The time between those two events is defined as the lag.
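
To give a feel for what the device has to do, here is a minimal sketch of that measurement loop. It assumes an ATmega32U4-style Arduino board with native USB (so it can pose as a keyboard), a photodiode on pin A0, and a hand-picked brightness threshold; those details are my own assumptions rather than the actual FrameTime firmware, and this sketch reports microseconds instead of raw cycle counts.

// Minimal measurement-loop sketch; assumes an ATmega32U4 board (e.g. a
// Pro Micro), a photodiode circuit on A0 and a fixed brightness threshold.
// Pin, threshold and key choice are hypothetical.
#include <Keyboard.h>

const int PHOTODIODE_PIN = A0;  // analog input connected to the photodiode
const int THRESHOLD      = 40;  // ADC counts the brightness must change by

void setup() {
  Serial.begin(115200);
  Keyboard.begin();
}

void loop() {
  int baseline = analogRead(PHOTODIODE_PIN);  // brightness before the keypress

  unsigned long pressed = micros();
  Keyboard.write(' ');                        // emulate a physical keypress

  // Busy-wait until the monitor visibly changes brightness.
  while (true) {
    int now = analogRead(PHOTODIODE_PIN);
    if (abs(now - baseline) >= THRESHOLD) break;
  }
  unsigned long seen = micros();

  Serial.println(seen - pressed);             // lag in microseconds
  delay(500);                                 // let the screen settle before the next sample
}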

FrameTime is free open-source software.

Terminal comparison

So what kind of data can we get out of this device? I've taken measurements of a few terminal emulators. All tests were run with the terminal emulator configured to a black background with a white cursor. The terminals were all placed at roughly the same spot on the monitor, with the FrameTime device positioned so that the cursor would be measured when the button was pressed. The font was Monaco at size 100. The device ran the test 10,000 times for each terminal emulator, and I then trimmed the outer one percent from each end to eliminate outliers.

title      signal  lag_min     lag_delta   rise_mean  rise_stddev
xterm      18.876  136379.000  130050.000  65343.759  4494.641
alacritty  20.532  370491.000  133613.000  57126.701  6083.740
termite    18.242  277158.000  143515.000  67793.107  5793.508
urxvt      18.657  413547.000  131844.000  66218.334  5249.306

The signal column is the change in brightness. The other columns are in CPU cycles; to convert a value to seconds, divide it by 16 MHz. The lag is the value defined earlier, and the rise is the time it takes the monitor to stabilize at the new brightness value.

I think it's interesting to note that lag_delta is around 133k cycles for all terminals. If we convert that value to seconds, 133000 / 16 MHz ≈ 8.3 ms, which happens to be the time between frames on my 120 Hz monitor. I think that speaks to the accuracy.
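
If you want to redo that post-processing yourself, a rough sketch of it could look like the following. The input format (one raw cycle count per line on stdin) is my own assumption; only the trim of the outer one percent and the 16 MHz clock come from the description above.

// Rough post-processing sketch: read raw lag samples (16 MHz cycle counts,
// one per line), trim one percent from each end, report mean and stddev.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
  std::vector<double> samples;
  double cycles;
  while (std::scanf("%lf", &cycles) == 1) samples.push_back(cycles);
  if (samples.empty()) return 1;

  std::sort(samples.begin(), samples.end());
  size_t trim = samples.size() / 100;  // drop the outer 1% from each end
  std::vector<double> kept(samples.begin() + trim, samples.end() - trim);

  double mean = 0;
  for (double c : kept) mean += c;
  mean /= kept.size();

  double var = 0;
  for (double c : kept) var += (c - mean) * (c - mean);
  double stddev = std::sqrt(var / kept.size());

  const double clock_hz = 16e6;  // the device counts cycles of a 16 MHz clock
  std::printf("mean: %.1f cycles (%.2f ms), stddev: %.1f cycles\n",
              mean, mean / clock_hz * 1e3, stddev);
  return 0;
}

The same cycles-to-milliseconds conversion is what turns the 133,000-cycle lag_delta into the 8.3 ms frame time mentioned above.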