Rate Limiting in Groovy

Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. Groovy elegantly supports rate limiting with blocking queues and timers from the Java standard library.

import java.util.concurrent.*

// First we'll look at basic rate limiting. Suppose
// we want to limit our handling of incoming requests.
// We'll serve these requests off a queue.
def requests = new LinkedBlockingQueue<Integer>(5)
1.upto(5) { requests.offer(it) }

// This timer will generate a value every 200 milliseconds.
// This is the regulator in our rate limiting scheme.
def timer = new Timer()
def limiter = new LinkedBlockingQueue<Date>()
timer.scheduleAtFixedRate(new TimerTask() {
    void run() {
        limiter.offer(new Date())
    }
}, 0, 200)

// By blocking on a take from the limiter queue
// before serving each request, we limit ourselves to
// 1 request every 200 milliseconds.
requests.each { req ->
    limiter.take()
    println "request $req ${new Date()}"
}

// We may want to allow short bursts of requests in
// our rate limiting scheme while preserving the
// overall rate limit. We can accomplish this by
// buffering our limiter queue. This burstyLimiter
// queue will allow bursts of up to 3 events.
def burstyLimiter = new LinkedBlockingQueue<Date>(3)

// Fill up the queue to represent allowed bursting.
3.times { burstyLimiter.offer(new Date()) }

// Every 200 milliseconds we'll try to add a new
// value to burstyLimiter, up to its limit of 3.
// When the queue is already full, offer() returns
// false and the tick is simply dropped.
timer.scheduleAtFixedRate(new TimerTask() {
    void run() {
        burstyLimiter.offer(new Date())
    }
}, 0, 200)

// Now simulate 5 more incoming requests. The first
// 3 of these will benefit from the burst capability
// of burstyLimiter.
def burstyRequests = new LinkedBlockingQueue<Integer>(5)
1.upto(5) { burstyRequests.offer(it) }

burstyRequests.each { req ->
    burstyLimiter.take()
    println "request $req ${new Date()}"
}

// Cancel the timer; its background thread is non-daemon,
// so the script would not exit while it keeps running.
timer.cancel()

Running our program we see the first batch of requests handled once every ~200 milliseconds as desired.

request 1 Wed Jul 19 10:30:00 EDT 2023
request 2 Wed Jul 19 10:30:00 EDT 2023
request 3 Wed Jul 19 10:30:00 EDT 2023
request 4 Wed Jul 19 10:30:00 EDT 2023
request 5 Wed Jul 19 10:30:01 EDT 2023
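The timestamps above come from Date's default toString, which only shows whole seconds, so the ~200 ms gaps between requests are not visible. As a small variation (not part of the program above), the serving loop could format its timestamps with millisecond precision using plain String.format:

requests.each { req ->
    limiter.take()
    // %tT prints HH:MM:SS and %tL prints the millisecond within the second,
    // which makes the ~200 ms gap between requests visible in the output.
    def stamp = String.format('%1$tT.%1$tL', new Date())
    println "request $req $stamp"
}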

For the second batch of requests we serve the first 3 immediately because of the burstable rate limiting, then serve the remaining 2 with ~200ms delays each.

request 1 Wed Jul 19 10:30:01 EDT 2023
request 2 Wed Jul 19 10:30:01 EDT 2023
request 3 Wed Jul 19 10:30:01 EDT 2023
request 4 Wed Jul 19 10:30:01 EDT 2023
request 5 Wed Jul 19 10:30:01 EDT 2023

In this Groovy version, java.util.concurrent.LinkedBlockingQueue stands in for channels, and java.util.Timer scheduled at a fixed rate plays the role of a ticker. The overall structure and logic of the rate limiting remain the same; only the implementation details are adapted to Groovy and the Java concurrency utilities.
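If you'd rather avoid java.util.Timer, a ScheduledExecutorService provides the same fixed-rate scheduling. The sketch below is an alternative take on the basic limiter, not the version used above; it drives the same blocking-queue idea with an executor instead of a Timer.

import java.util.concurrent.*

// Sketch: a ScheduledExecutorService as the ticker instead of Timer.
def scheduler = Executors.newSingleThreadScheduledExecutor()
def limiter = new LinkedBlockingQueue<Date>()

// Offer a tick every 200 milliseconds, just like the Timer-based version.
scheduler.scheduleAtFixedRate({ limiter.offer(new Date()) } as Runnable,
        0, 200, TimeUnit.MILLISECONDS)

def requests = new LinkedBlockingQueue<Integer>(5)
1.upto(5) { requests.offer(it) }

requests.each { req ->
    limiter.take()
    println "request $req ${new Date()}"
}

// Shut the scheduler down so its worker thread doesn't keep the script alive.
scheduler.shutdown()

One reason to prefer the executor: if a TimerTask throws an exception, the Timer's single thread dies and all of its scheduled tasks are cancelled, whereas a ScheduledExecutorService only cancels the failing task.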