Rate Limiting in Nim


Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. Nim supports rate limiting with asynchronous procedures, timers, and queues.

```nim
import std/[asyncdispatch, times, strformat, deques]

proc main() {.async.} =
  # First we'll look at basic rate limiting. Suppose
  # we want to limit our handling of incoming requests.
  # We'll queue those requests up in a deque.
  var requests = initDeque[int]()
  for i in 1..5:
    requests.addLast(i)

  # Sleeping before serving each request limits us to
  # 1 request every 200 milliseconds; sleepAsync is the
  # regulator in our rate limiting scheme.
  while requests.len > 0:
    await sleepAsync(200)
    let req = requests.popFirst()
    echo fmt"request {req} {now()}"

  # We may want to allow short bursts of requests in
  # our rate limiting scheme while preserving the
  # overall rate limit. We can accomplish this with a
  # token bucket. This one will allow bursts of up to
  # 3 requests.
  const maxTokens = 3
  var tokens = maxTokens

  # Every 200 milliseconds we'll try to add a new
  # token, up to the limit of 3.
  proc refill() {.async.} =
    while true:
      await sleepAsync(200)
      if tokens < maxTokens:
        inc tokens

  asyncCheck refill()

  # Now simulate 5 more incoming requests. The first
  # 3 of these will benefit from the burst capability
  # of the token bucket.
  var burstyRequests = initDeque[int]()
  for i in 1..5:
    burstyRequests.addLast(i)

  while burstyRequests.len > 0:
    # Wait until a token is available, then consume it.
    while tokens == 0:
      await sleepAsync(10)
    dec tokens
    let req = burstyRequests.popFirst()
    echo fmt"request {req} {now()}"

waitFor main()
```

Running our program we see the first batch of requests handled once every ~200 milliseconds as desired.

```
$ nim c -r rate_limiting.nim
request 1 2023-05-25T12:34:56+00:00
request 2 2023-05-25T12:34:56+00:00
request 3 2023-05-25T12:34:56+00:00
request 4 2023-05-25T12:34:56+00:00
request 5 2023-05-25T12:34:57+00:00
```

For the second batch of requests we serve the first 3 immediately because of the burstable rate limiting, then serve the remaining 2 with ~200ms delays each.

```
request 1 2023-05-25T12:34:57+00:00
request 2 2023-05-25T12:34:57+00:00
request 3 2023-05-25T12:34:57+00:00
request 4 2023-05-25T12:34:57+00:00
request 5 2023-05-25T12:34:57+00:00
```

This example demonstrates how to implement basic and bursty rate limiting in Nim using asynchronous programming concepts. The asyncdispatch module provides the asynchronous building blocks, and the token-bucket pattern preserves the overall rate limit while permitting short bursts.
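As a further sketch, the token-bucket idea can be packaged into a small reusable type so limiters don't have to be hand-rolled at each call site. The `TokenBucket` name and its `acquire` API below are illustrative assumptions for this article, not a standard library interface:

```nim
import std/asyncdispatch

# Hypothetical reusable token bucket (names are illustrative).
type TokenBucket = ref object
  tokens: int       # tokens currently available
  capacity: int     # maximum burst size
  intervalMs: int   # refill period in milliseconds

proc newTokenBucket(capacity, intervalMs: int): TokenBucket =
  result = TokenBucket(tokens: capacity, capacity: capacity,
                       intervalMs: intervalMs)
  # Background task: add one token per interval, never
  # exceeding the bucket's capacity.
  proc refill(b: TokenBucket) {.async.} =
    while true:
      await sleepAsync(b.intervalMs)
      if b.tokens < b.capacity:
        inc b.tokens
  asyncCheck refill(result)

proc acquire(b: TokenBucket) {.async.} =
  # Poll until a token is available, then consume it.
  while b.tokens == 0:
    await sleepAsync(5)
  dec b.tokens

proc demo() {.async.} =
  let bucket = newTokenBucket(capacity = 3, intervalMs = 200)
  for i in 1..5:
    await bucket.acquire()
    echo "served request ", i

waitFor demo()
```

With a capacity of 3 and a 200 ms interval, the first three `acquire` calls return immediately and the remaining two each wait roughly one refill period, matching the bursty behavior shown above.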