I had some time this weekend to look back at my frametime project from last year. Drawing inspiration from a 2018 LWN article called A look at terminal emulators, I figured it could be fun to dive into the relative performance of terminal emulators in 2022.

The LWN article used an application called Typometer to assess the relative latency of the different terminal emulators. Typometer works by sending some input to the application and monitoring the display buffer until the change is reflected. Assuming the other parts of the chain are unaffected by how this change comes about, it’s a really simple, pure-software way of getting a latency number.
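
To make that concrete, here is a rough Python sketch of that style of measurement (this is not Typometer’s actual code): fake a key press, then poll a small patch of the screen until the pixels change. It assumes an X11 session with xdotool and ImageMagick’s import available, and the patch geometry is a placeholder.

#!/usr/bin/env python3
# Sketch of a Typometer-style software latency probe (not Typometer itself).
# Assumes X11 with xdotool and ImageMagick's `import` installed, and that the
# terminal under test already has keyboard focus.
import subprocess
import time

def grab_patch(geometry):
    # Capture a small patch of the root window as raw RGB bytes.
    return subprocess.run(
        ["import", "-window", "root", "-crop", geometry, "rgb:-"],
        capture_output=True, check=True,
    ).stdout

def measure_once(geometry, timeout=1.0):
    # Send an 'a' keystroke and time how long until the patch changes.
    before = grab_patch(geometry)
    start = time.monotonic()
    subprocess.run(["xdotool", "key", "a"], check=True)
    while time.monotonic() - start < timeout:
        if grab_patch(geometry) != before:
            return time.monotonic() - start
    return None

if __name__ == "__main__":
    # Hypothetical 40x60 pixel patch where the new glyph should appear.
    latency = measure_once("40x60+200+300")
    print(f"{latency * 1000:.1f} ms" if latency is not None else "no change seen")
    subprocess.run(["xdotool", "key", "BackSpace"], check=True)

The obvious weakness of this naive version is that the fake key press and every screen grab travel through the same software stack being measured, and spawning a screenshot tool per poll limits the resolution badly; a real tool like Typometer reads the display buffer far more efficiently.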

Part of the fun of doing this for me though is the hardware part, so I’ll be using my frametime device. Much like Typometer it works by sending an input event and watching for a change. The major difference is that frametime emulates a hardware USB keyboard and watches the brightness of the monitor instead of the framebuffer, the idea being that the time from USB event to detectable brightness change is more representative of the latency a user actually experiences.

The versions of the software under test are as follows:

Linux delusionalStation 5.16.11-arch1-1 #1 SMP PREEMPT Thu, 24 Feb 2022 02:18:20 +0000 x86_64 GNU/Linux
OpenGL version string: 4.6 (Compatibility Profile) Mesa 21.3.7
version number:    11.0
X.Org version: 1.21.1.3
XTerm(370)
urxvt 9.26
alacritty 0.10.1 (2844606d)
lxterminal 0.4.0

The test setup itself is fairly simple. The frametime device is attached to the monitor. I have a script that places the terminal emulator window just right, such that the appearance of a letter puts the block cursor underneath the frametime photodiode. All terminal emulators are configured with the same huge font, Monaco at 100pt, and launched running cat instead of the shell. The background is pure black and the text is pure white, giving the largest possible light difference. For each terminal 2000 samples are taken, each consisting of simulating a press of the ‘a’ key, observing the light level, and then sending a backspace. The raw results are graphed below.
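
For a sense of what that capture loop looks like, here is a hypothetical host-side sketch, assuming the device shows up as a USB serial port. The one-byte trigger command, the line-based response format, and the device path are all invented for illustration; the real frametime interface may well differ.

#!/usr/bin/env python3
# Hypothetical host-side capture loop for a frametime-style device.
# The protocol (one-byte trigger, then one line of comma-separated light
# readings per key press) and the device path are invented for illustration.
import serial  # pyserial

PORT = "/dev/ttyACM0"  # placeholder device node
SAMPLES = 2000

def capture_one(ser):
    # Ask the device to press 'a', record light levels, then send backspace.
    ser.write(b"t")
    line = ser.readline().decode()
    return [int(v) for v in line.strip().split(",")]

def main():
    traces = []
    with serial.Serial(PORT, 115200, timeout=5) as ser:
        for _ in range(SAMPLES):
            traces.append(capture_one(ser))
    # Hand the raw traces off to the analysis and plotting step.
    print(f"captured {len(traces)} traces")

if __name__ == "__main__":
    main()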

Figure 1: Raw output values

That’s a little bit of a messy graph, but let me explain. The y-axis is arbitrary, and the x-axis is in cycles since sample start. Since the microcontroller runs at 16MHz, 16 million cycles make up one second. The lightly colored bands show the range within which all the samples lie, while the more opaque lines show the individual samples with the fastest and slowest latency. We can see that lxterminal varies widely, whereas the others are rather consistent. We can also see that xterm seems by far the fastest, while urxvt and alacritty are more similar.
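
For the curious, summarizing the raw traces into that band-plus-extremes view is straightforward. The sketch below assumes each trace is a row of light readings indexed by cycles since the key event, and it defines latency as the first crossing of a threshold halfway between the dark and bright plateaus; that definition is my assumption, not necessarily the exact one frametime uses.

import numpy as np

# One row per sample, columns are light readings indexed by cycles since the
# key event (assumed layout; the file name is a placeholder).
traces = np.loadtxt("xterm_traces.csv", delimiter=",")

# The lightly colored band: per-cycle minimum and maximum across all samples.
band_lo = traces.min(axis=0)
band_hi = traces.max(axis=0)

# Assumed latency definition: first cycle where the light level crosses a
# threshold halfway between the dark and bright plateaus.
dark = traces[:, 0].mean()
bright = traces[:, -1].mean()
threshold = (dark + bright) / 2
latency = np.argmax(traces > threshold, axis=1)  # in cycles, one per sample

fastest = traces[latency.argmin()]  # the opaque best-case line
slowest = traces[latency.argmax()]  # the opaque worst-case line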

If we break the data down into numbers it becomes a little clearer.

               title     signal    lag_min  lag_delta  rise_mean rise_stddev
           alacritty     20.344 247552.000 134550.000  48169.349    2663.773
               urxvt     18.745 317563.000 130962.000  56380.486   17153.828
               xterm     18.510  43958.000 131859.000  56646.447   15582.215
          lxterminal     19.245 154289.000 366873.000  49682.588   16226.389

Again, all the numbers are in cycles, so divide by 16MHz to get the value in seconds. From the data we can see that xterm is still significantly faster than any other terminal emulator, with a best case latency of just north of 43000 cycles (2ms). Both alacritty and urxvt are significantly slower, with best case latencies of 15ms and 19ms respectively¹. Interestingly, lxterminal sticks out with a best case latency of 9ms, almost matching the worst case latency of xterm, but it then fails at consistency, leaving its worst case latency worse than any of the others.
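
To save readers the mental arithmetic, here is the same table converted from cycles to milliseconds at the 16MHz clock, assuming lag_delta is the spread between the fastest and slowest sample, so the worst case is lag_min plus lag_delta.

# Convert the table's cycle counts to milliseconds at the 16 MHz clock.
CLOCK_HZ = 16_000_000

table = {
    "alacritty":  (247552, 134550),
    "urxvt":      (317563, 130962),
    "xterm":      ( 43958, 131859),
    "lxterminal": (154289, 366873),
}

for name, (lag_min, lag_delta) in table.items():
    best = lag_min / CLOCK_HZ * 1000
    worst = (lag_min + lag_delta) / CLOCK_HZ * 1000
    print(f"{name:>10}: best {best:5.1f} ms, worst {worst:5.1f} ms")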

From the rise times we can tell that alacritty does something different from the others, with a significantly tighter distribution of light rise times. The reason becomes clear if we plot the slowest rise time for each terminal.

Figure 2: Slowest rise time

We see a normal rise that is suddenly paused, only continuing after roughly 130 kcycles (8.1ms). My monitor runs at 120Hz, which gives a frametime of about 8ms, or roughly 133 kcycles at 16MHz, matching the pause. What we are seeing is the result of display tearing: half of the cursor is drawn in front of the photodiode, giving us half the light intensity, before the other half is drawn the next frame. Alacritty seems to exhibit vsync behaviour even without a compositor, which makes its performance a lot more impressive, since vsync usually requires double buffering, which incurs a one frame (8ms) penalty.


  1. If you’re wondering why the deltas seem to be multiples of around 130000 cycles: 130000 cycles is roughly 8ms, which is the frametime of my 120Hz monitor.