Rate Limiting in Swift


Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. Swift supports rate limiting with Grand Central Dispatch (GCD) timers and semaphores.

import Foundation

func main() {
    // First we'll look at basic rate limiting. Suppose
    // we want to limit our handling of incoming requests.
    // We'll hold those incoming requests in a simple queue.
    var requests: [Int] = []
    for i in 1...5 {
        requests.append(i)
    }

    // This timer fires every 200 milliseconds and signals a
    // semaphore. This is the regulator in our rate limiting scheme.
    let limiter = DispatchSemaphore(value: 0)
    let limiterTimer = DispatchSource.makeTimerSource()
    limiterTimer.schedule(deadline: .now() + .milliseconds(200),
                          repeating: .milliseconds(200))
    limiterTimer.setEventHandler {
        limiter.signal()
    }
    limiterTimer.resume()

    // By waiting on the semaphore before serving each request,
    // we limit ourselves to 1 request every 200 milliseconds.
    for req in requests {
        limiter.wait()
        print("request", req, Date())
    }
    limiterTimer.cancel()

    // We may want to allow short bursts of requests in
    // our rate limiting scheme while preserving the
    // overall rate limit. We can accomplish this by
    // using a semaphore that starts with 3 permits. This
    // burstyLimiter will allow bursts of up to 3 events.
    let burstyLimiter = DispatchSemaphore(value: 3)

    // Every 200 milliseconds we'll add a new permit to
    // burstyLimiter. (Note: the semaphore's count is not
    // capped at 3, but that doesn't matter for this short run.)
    let burstyTimer = DispatchSource.makeTimerSource()
    burstyTimer.schedule(deadline: .now() + .milliseconds(200),
                         repeating: .milliseconds(200))
    burstyTimer.setEventHandler {
        burstyLimiter.signal()
    }
    burstyTimer.resume()

    // Now simulate 5 more incoming requests. The first
    // 3 of these will benefit from the burst capability
    // of burstyLimiter.
    for i in 1...5 {
        burstyLimiter.wait()
        print("bursty request", i, Date())
    }
    burstyTimer.cancel()
}

main()

Running our program we see the first batch of requests handled once every ~200 milliseconds as desired.

$ swift run
request 1 2023-06-10 12:34:56 +0000
request 2 2023-06-10 12:34:56 +0000
request 3 2023-06-10 12:34:57 +0000
request 4 2023-06-10 12:34:57 +0000
request 5 2023-06-10 12:34:57 +0000

For the second batch of requests we serve the first 3 immediately because of the burstable rate limiting, then serve the remaining 2 with ~200ms delays each.

bursty request 1 2023-06-10 12:34:58 +0000
bursty request 2 2023-06-10 12:34:58 +0000
bursty request 3 2023-06-10 12:34:58 +0000
bursty request 4 2023-06-10 12:34:58 +0000
bursty request 5 2023-06-10 12:34:59 +0000
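
One caveat: unlike the buffered channel in the original Go example, a DispatchSemaphore's count has no upper bound, so if the program sits idle for a while the timer keeps signaling and a later burst could exceed 3. Below is a minimal sketch of one way to cap it, tracking outstanding permits with a hypothetical `available` counter guarded by a serial queue; the names and structure are illustrative, not part of the program above.

import Foundation

let maxBurst = 3
let cappedLimiter = DispatchSemaphore(value: maxBurst)
let state = DispatchQueue(label: "com.example.limiterState") // guards `available`
var available = maxBurst                                     // permits currently held by the semaphore

// Refill one permit every 200 milliseconds, but only while we are
// below the cap, so idle time cannot build up an oversized burst.
let refill = DispatchSource.makeTimerSource()
refill.schedule(deadline: .now() + .milliseconds(200), repeating: .milliseconds(200))
refill.setEventHandler {
    state.sync {
        if available < maxBurst {
            available += 1
            cappedLimiter.signal()
        }
    }
}
refill.resume()

// Callers take a permit and record that it is gone.
func acquire() {
    cappedLimiter.wait()
    state.sync { available -= 1 }
}

Here acquire() would replace the bare burstyLimiter.wait() calls; everything else stays the same.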

This example demonstrates how to implement rate limiting in Swift using Grand Central Dispatch (GCD) timers and semaphores. The structure mirrors the Go version: the GCD timer plays the role of Go's ticker, and the semaphores stand in for the channels that pace the requests.
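
If you prefer Swift's structured concurrency (async/await) over GCD, the same bursty behavior can be expressed with an actor. The sketch below is illustrative only and assumes Swift 5.7+ so that top-level await is available in main.swift; the TokenBucket actor, its parameters, and the 50 ms polling interval are choices made for this sketch, not part of any standard API.

import Foundation

// A token-bucket limiter as an actor: at most `capacity` tokens can
// accumulate, and tokens are replenished continuously over time.
actor TokenBucket {
    private let capacity: Double
    private let refillInterval: TimeInterval
    private var tokens: Double
    private var lastRefill = Date()

    init(capacity: Int, refillInterval: TimeInterval) {
        self.capacity = Double(capacity)
        self.refillInterval = refillInterval
        self.tokens = Double(capacity)
    }

    // Top up the bucket based on elapsed time, never exceeding capacity.
    private func refill() {
        let now = Date()
        tokens = min(capacity, tokens + now.timeIntervalSince(lastRefill) / refillInterval)
        lastRefill = now
    }

    // Suspend until a token is available, then consume it.
    func acquire() async throws {
        while true {
            refill()
            if tokens >= 1 {
                tokens -= 1
                return
            }
            // No token yet: sleep briefly and check again.
            try await Task.sleep(nanoseconds: 50_000_000)
        }
    }
}

// Allow bursts of up to 3 requests, refilling one token every 200 ms.
let bucket = TokenBucket(capacity: 3, refillInterval: 0.2)
for i in 1...5 {
    try await bucket.acquire()
    print("bursty request", i, Date())
}

The polling loop keeps the sketch short; a production version would park waiters with continuations instead of polling.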