Rate Limiting in TypeScript


Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. TypeScript supports rate limiting naturally with async functions, Promises, and Promise-based timers.
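Before the full example, here is the basic building block the code below relies on: the Promise-based `setTimeout` from Node's `timers/promises` module (stable since Node 16), which resolves after a given delay instead of taking a callback. A minimal sketch:

```typescript
import { setTimeout } from 'timers/promises';

async function demo() {
    const start = Date.now();
    // Pause for roughly 200 milliseconds without blocking the event loop.
    await setTimeout(200);
    console.log(`waited ~${Date.now() - start}ms`);
}

demo();
```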

```typescript
import { setTimeout } from 'timers/promises';

async function main() {
    // First we'll look at basic rate limiting. Suppose
    // we want to limit our handling of incoming requests.
    // We'll serve these requests off an array of the same name.
    const requests: number[] = [1, 2, 3, 4, 5];

    // This limiter function returns a Promise that resolves
    // after 200 milliseconds. This is the regulator in our rate
    // limiting scheme.
    const limiter = () => setTimeout(200);

    // By awaiting the limiter before processing each request,
    // we limit ourselves to 1 request every 200 milliseconds.
    for (const req of requests) {
        await limiter();
        console.log("request", req, new Date());
    }

    // We may want to allow short bursts of requests in
    // our rate limiting scheme while preserving the
    // overall rate limit. We can accomplish this by
    // using a token bucket algorithm.
    class TokenBucket {
        private tokens: number;
        private lastRefill: number;

        constructor(private capacity: number, private refillRate: number) {
            this.tokens = capacity;
            this.lastRefill = Date.now();
        }

        async getToken(): Promise<void> {
            this.refill();
            if (this.tokens < 1) {
                // Wait just long enough for one token to accumulate.
                const waitTime = (1 - this.tokens) / this.refillRate * 1000;
                await setTimeout(waitTime);
                this.refill();
            }
            this.tokens -= 1;
        }

        private refill() {
            const now = Date.now();
            const timePassed = now - this.lastRefill;
            this.tokens = Math.min(this.capacity, this.tokens + timePassed * this.refillRate / 1000);
            this.lastRefill = now;
        }
    }

    // This burstyLimiter will allow bursts of up to 3 events.
    const burstyLimiter = new TokenBucket(3, 5); // 5 tokens per second, max 3

    // Now simulate 5 more incoming requests. The first
    // 3 of these will benefit from the burst capability
    // of burstyLimiter.
    const burstyRequests: number[] = [1, 2, 3, 4, 5];
    for (const req of burstyRequests) {
        await burstyLimiter.getToken();
        console.log("request", req, new Date());
    }
}

main().catch(console.error);
```

Running our program we see the first batch of requests handled once every ~200 milliseconds as desired.

```
request 1 2023-06-01T12:00:00.000Z
request 2 2023-06-01T12:00:00.200Z
request 3 2023-06-01T12:00:00.400Z
request 4 2023-06-01T12:00:00.600Z
request 5 2023-06-01T12:00:00.800Z
```

For the second batch of requests we serve the first 3 immediately because of the burstable rate limiting, then serve the remaining 2 with ~200ms delays each.

```
request 1 2023-06-01T12:00:01.000Z
request 2 2023-06-01T12:00:01.000Z
request 3 2023-06-01T12:00:01.000Z
request 4 2023-06-01T12:00:01.200Z
request 5 2023-06-01T12:00:01.400Z
```
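The ~200ms gaps for requests 4 and 5 fall straight out of the refill arithmetic in `getToken`: with the bucket empty and a refill rate of 5 tokens per second, the wait for one token is `(1 - 0) / 5 * 1000 = 200` ms. A quick check of that formula in isolation:

```typescript
// Wait time in ms until one full token is available, given the current
// token count and a refill rate in tokens per second.
function waitTimeMs(tokens: number, refillRate: number): number {
    return (1 - tokens) / refillRate * 1000;
}

console.log(waitTimeMs(0, 5));   // empty bucket, 5 tokens/sec → 200
console.log(waitTimeMs(0.5, 5)); // half a token already banked → 100
```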

This TypeScript implementation uses async/await and Promise-based timeouts to achieve functionality similar to the original Go example. The bursty rate limiting is implemented with a token bucket algorithm, a common approach to this kind of rate limiting in TypeScript/JavaScript.
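For readers coming from the Go original, Go's `time.Tick` channel can also be approximated with an async generator that yields at a fixed interval. This is a sketch under that framing; the generator name `tick` and its parameter are illustrative, not part of the original example:

```typescript
import { setTimeout } from 'timers/promises';

// Yields a Date once per intervalMs, loosely mimicking Go's time.Tick channel.
async function* tick(intervalMs: number): AsyncGenerator<Date> {
    while (true) {
        await setTimeout(intervalMs);
        yield new Date();
    }
}

async function run() {
    const requests = [1, 2, 3];
    const ticker = tick(200);
    // Each iteration blocks on the next tick, pacing the loop.
    for (const req of requests) {
        const { value } = await ticker.next();
        console.log("request", req, value);
    }
}

run().catch(console.error);
```

An async generator keeps the "receive from a ticker" shape of the Go code, at the cost of being single-consumer; the token bucket above remains the better fit when bursts must be allowed.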