The lagometer is a small graph you can display in the bottom-right corner of your screen. Turn it on with the variable CG_LAGOMETER.
You can bind a key to show/hide this graph with this script:
set lag_on "set CG_LAGOMETER 1; set tog_lag vstr lag_off"
set lag_off "set CG_LAGOMETER 0; set tog_lag vstr lag_on"
set tog_lag "vstr lag_on"
bind "M" "vstr tog_lag"
There are two graphs in the lagometer, one above the other.
The upper graph advances one pixel for every rendered frame on the client side. Blue lines below the baseline mean that the frame is interpolating between two valid snapshots. Yellow lines above the baseline mean the frame is extrapolating beyond the latest valid time. The length of the line is proportional to the time.
Basically, blue is good: it's based on two snapshots received from the server, so the frame will not be revoked. Yellow means that the client is predicting what will happen, but hasn't received a snapshot to corroborate it yet. If the next snapshot comes in and doesn't match the prediction, the prediction is revoked and the client is updated with the snapshot data. This causes the skipping you sometimes experience, where you end up somewhere very different from where you thought you were (and often with less health than you thought, because you were shot on the server, but the client couldn't predict that happening to you).
The upper graph indicates the consistency of your connection. Ideally, you should always have blue bars of only a pixel or two in height. If you are commonly getting big triangles of yellow on the graph, your connection is inconsistent.
The lower graph slides one pixel for every snapshot received from the server. By default, snapshots arrive 20 times a second, so if you are running above 20 fps the top graph will move faster, and vice versa. A red bar means the snapshot was dropped by the network (indicating packet loss). Green and yellow bars are properly received snapshots, with the height of the bar proportional to the ping. A yellow bar indicates that the previous snapshot was intentionally suppressed to stay under the rate limit.
In a heavy firefight, it is normal for modem players to see yellow bars in the bottom graph; these should return to green when the action quiets down. If several red bars are visible, you may want to look for a server that drops fewer packets.
Fine-tuning the network
Straight from the Carmack:
There are a few tuning variables for people trying to optimize their connection:
The most important one is rate, which is what the connection speed option in the menu sets. We are fairly conservative with the values we set for the given modem speeds: 2500 for 28.8, 3000 for 33, and 3500 for 56k. You may actually be connecting faster than that, and modem compression may be buying you something, so you might get a better play experience by increasing the values slightly. If you connect at 50000 bps, try a rate of 5000, etc. I err on the conservative side, because too low of a rate will only make the movement of other things in the world choppy, while too high of a rate can cause huge amounts of lag. Note that the optimal rate will be somewhat lower than a rate for QW or Q2, because I now include the UDP packet header length in the bandwidth estimate.
You can ask for a different number of snapshots by changing the snaps variable, but there isn't a lot of benefit to that. Dedicated servers run at 40hz, so stick to divisors of that: 40, 20 (default), 10. A snaps of 40 will usually just cause you to hit your rate limit a lot faster. It may be useful for tuning rate, if nothing else.
You can adjust the local timing point with cg_timenudge <value>, which effectively adds local lag to try to make sure you interpolate instead of extrapolate. If you really want to play on a server that is dropping a ton of packets, a timenudge of 100 or so might make the game smoother.