
2021

EARRRL – the Estimated Average Recent Request Rate Limiter - the Mathy Bits

In the companion post I introduced a problem with naive, window-based rate limiters – they're too forgiving! The user's request count is stored in a key in Redis with a TTL of, say, 15 minutes, and once the key expires, the abusive user can come back and immediately offend again. Effectively, the abusive user is using your infrastructure to rate limit their own requests.
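To make the problem concrete, here is a minimal sketch of that naive fixed-window approach. An in-memory dict stands in for Redis, and the limit value is illustrative; the key point is the reset at window expiry, which hands an abuser a fresh budget no matter how badly they behaved.

```python
import time

WINDOW_SECONDS = 15 * 60  # mirrors the 15-minute TTL above
LIMIT = 100               # illustrative per-window request cap

counters = {}  # user_id -> (window_expiry_time, request_count)

def allow_request(user_id, now=None):
    """Return True if the request is allowed under the fixed window."""
    now = time.time() if now is None else now
    expiry, count = counters.get(user_id, (0.0, 0))
    if now >= expiry:
        # The "key" has expired: the count resets and the abusive
        # user can immediately offend again.
        expiry, count = now + WINDOW_SECONDS, 0
    count += 1
    counters[user_id] = (expiry, count)
    return count <= LIMIT
```

In the real thing the dict lookup and increment would be an `INCR` plus `EXPIRE` against Redis, but the forgiveness problem is identical.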

In this post we'll investigate an alternative approach to windowed rate limiting which keeps a running estimate of each user's request rate and rejects requests for users whose rate is above the prescribed threshold. The focus of this post is on the math behind the approach. For a practical implementation, usage, and motivation for why the math might be worth looking at, please take a look at the companion post.

EARRRL – the Estimated Average Recent Request Rate Limiter

You've got a problem: a small subset of abusive users are body slamming your API with extremely high request rates. You've added windowed rate limiting, and while this reduces the load on your infrastructure, the behavior persists. These naughty users make no attempt to rate-limit their own requests. They fire off as many requests as they can, almost immediately hit HTTP 429 Too Many Requests, and even then don't let up. As soon as a new rate limit window opens, the pattern starts all over again.

In order to curtail this behavior, it would be nice to penalize bad users according to their recent average request rate. That is, if a user responsibly limits their own requests, then they never get a 429. However, if a user habitually exceeds the rate limit, then we stop them from making any more requests – forever ... no new windows and no second chances ... that is, until they mend their ways and start monitoring their own rate more responsibly. Once their average request rate falls below the prescribed threshold, their past sins are forgiven, and they may begin anew as a responsible user of our API.
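One common way to maintain a "recent average rate" with no windows at all is an exponentially decaying counter: each request bumps an estimate that decays continuously between requests. The toy sketch below shows that generic idea – it is not the EARRRL estimator this post derives, and the class name and half-life parameter are my own illustrative choices.

```python
import math

class DecayingRateEstimator:
    """Toy exponentially decaying estimate of a request rate (req/sec).

    A generic decaying-counter sketch, not the EARRRL formulation;
    the 300-second half-life is an arbitrary illustrative default.
    """

    def __init__(self, half_life_seconds=300.0):
        self.decay = math.log(2) / half_life_seconds  # decay constant
        self.rate = 0.0
        self.last_time = None

    def observe(self, now):
        """Record one request at time `now` (seconds)."""
        if self.last_time is not None:
            # Decay the old estimate for the time elapsed since the
            # previous request.
            self.rate *= math.exp(-self.decay * (now - self.last_time))
        # Each request contributes `decay`, so a steady stream of r
        # requests/sec converges toward an estimate of r.
        self.rate += self.decay
        self.last_time = now

    def current_rate(self, now):
        """The decayed rate estimate as of time `now`."""
        if self.last_time is None:
            return 0.0
        return self.rate * math.exp(-self.decay * (now - self.last_time))
```

A limiter built on this would simply reject requests while `current_rate(now)` exceeds the prescribed threshold – and because the estimate keeps decaying, a user who backs off eventually drops below the threshold and is forgiven, exactly the behavior described above.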