Rate Limiting in Java
Here’s the translation of the rate limiting example to Java:
Our first example demonstrates basic rate limiting. We'll use a ScheduledExecutorService to simulate a ticker and control the rate at which we process requests.
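To see the ticker analogy in isolation (a minimal sketch, assuming the same imports as the full program below), scheduleAtFixedRate fires a task at a fixed period, much like reading from a ticker:

ScheduledExecutorService ticker = Executors.newSingleThreadScheduledExecutor();
// Fires roughly every 200 milliseconds, playing the role of a ticker
ticker.scheduleAtFixedRate(
        () -> System.out.println("tick " + Instant.now()),
        200, 200, TimeUnit.MILLISECONDS);

With that building block in mind, here is the full program: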
import java.util.concurrent.*;
import java.time.Instant;

public class RateLimiting {
    public static void main(String[] args) throws InterruptedException {

        // Create a channel-like queue to hold our requests
        BlockingQueue<Integer> requests = new ArrayBlockingQueue<>(5);
        for (int i = 1; i <= 5; i++) {
            requests.offer(i);
        }

        // This scheduler will act as our rate limiter
        ScheduledExecutorService limiter = Executors.newScheduledThreadPool(1);

        // Process requests at a rate of one every 200 milliseconds
        for (int i = 0; i < 5; i++) {
            limiter.schedule(() -> {
                try {
                    // take() blocks until a request is available
                    int req = requests.take();
                    System.out.println("request " + req + " " + Instant.now());
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }, i * 200, TimeUnit.MILLISECONDS);
        }

        // Allow time for all scheduled requests to be processed
        Thread.sleep(1100);

        // Now let's implement a bursty rate limiter that allows
        // short bursts of up to 3 requests
        BlockingQueue<Instant> burstyLimiter = new ArrayBlockingQueue<>(3);

        // Fill the bursty limiter to its initial capacity
        for (int i = 0; i < 3; i++) {
            burstyLimiter.offer(Instant.now());
        }

        // Add a new token every 200 milliseconds; offer() silently drops
        // the token when the queue is already full, capping the burst at 3
        ScheduledExecutorService refiller = Executors.newScheduledThreadPool(1);
        refiller.scheduleAtFixedRate(() -> {
            burstyLimiter.offer(Instant.now());
        }, 200, 200, TimeUnit.MILLISECONDS);

        // Process bursty requests
        BlockingQueue<Integer> burstyRequests = new ArrayBlockingQueue<>(5);
        for (int i = 1; i <= 5; i++) {
            burstyRequests.offer(i);
        }
        for (int i = 0; i < 5; i++) {
            burstyLimiter.take(); // blocks if no token is available
            int req = burstyRequests.take();
            System.out.println("bursty request " + req + " " + Instant.now());
        }

        // Shut down our executor services
        limiter.shutdown();
        refiller.shutdown();
    }
}
This Java implementation uses BlockingQueue to simulate channels and ScheduledExecutorService to simulate tickers. The basic rate limiting is achieved by scheduling tasks at fixed intervals. For the bursty rate limiting, we use a BlockingQueue with a fixed capacity to allow bursts of requests while maintaining the overall rate limit.
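The same token-bucket idea can also be packaged as a small reusable class. The TokenBucket below is a hypothetical sketch (the class name and constructor parameters are assumptions for illustration, not part of the example above) that uses a Semaphore instead of a queue of timestamps:

import java.util.concurrent.*;

public class TokenBucket {
    private final Semaphore tokens;
    private final int capacity;
    private final ScheduledExecutorService refiller =
            Executors.newSingleThreadScheduledExecutor();

    public TokenBucket(int capacity, long refillIntervalMillis) {
        this.capacity = capacity;
        // Start full so callers can burst up to `capacity` immediately
        this.tokens = new Semaphore(capacity);
        refiller.scheduleAtFixedRate(() -> {
            // Only one refill thread runs, so this check-then-release
            // never pushes the permit count above `capacity`
            if (tokens.availablePermits() < capacity) {
                tokens.release();
            }
        }, refillIntervalMillis, refillIntervalMillis, TimeUnit.MILLISECONDS);
    }

    // Blocks until a token is available, just like burstyLimiter.take()
    public void acquire() throws InterruptedException {
        tokens.acquire();
    }

    public void shutdown() {
        refiller.shutdown();
    }
}

A caller would construct, say, new TokenBucket(3, 200) and invoke acquire() before handling each request, giving the same burst of 3 followed by one request per 200 milliseconds.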
Running our program, we’ll see the first batch of requests handled once every ~200 milliseconds as desired:
request 1 2023-05-25T10:15:00.123Z
request 2 2023-05-25T10:15:00.323Z
request 3 2023-05-25T10:15:00.523Z
request 4 2023-05-25T10:15:00.723Z
request 5 2023-05-25T10:15:00.923Z
For the second batch of requests, we’ll see the first 3 processed immediately due to the burstable rate limiting, then the remaining 2 with ~200ms delays each:
bursty request 1 2023-05-25T10:15:01.123Z
bursty request 2 2023-05-25T10:15:01.123Z
bursty request 3 2023-05-25T10:15:01.123Z
bursty request 4 2023-05-25T10:15:01.323Z
bursty request 5 2023-05-25T10:15:01.523Z
This example demonstrates how to implement basic and bursty rate limiting in Java. While the approach is different from the original due to language differences, the core concept of controlling the rate of operations remains the same.
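One loose end: the program relies on a fixed Thread.sleep(1100) to give the scheduled tasks time to run. A minimal sketch of an alternative, assuming the same limiter executor as above, is to wait on the executor itself; by default a ScheduledThreadPoolExecutor still runs already-scheduled one-shot tasks after shutdown():

limiter.shutdown();                                   // stop accepting new tasks
if (!limiter.awaitTermination(2, TimeUnit.SECONDS)) { // wait for the delayed tasks to finish
    limiter.shutdownNow();                            // interrupt anything still running
}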