Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. Groovy elegantly supports rate limiting with threads and timers.
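A minimal sketch of the basic limiter in this style, assuming a `LinkedBlockingQueue` stands in for the request channel and a `java.util.Timer` acts as the ticker (the variable names and the closure-to-`TimerTask` coercion are illustrative choices, not necessarily the exact code of this example):

```groovy
import java.util.concurrent.LinkedBlockingQueue

// Queue up the incoming requests; the queue plays the role of a channel.
def requests = new LinkedBlockingQueue<Integer>()
(1..5).each { requests.put(it) }

// The limiter queue receives one "token" every 200 ms from a Timer acting as a ticker.
def limiter = new LinkedBlockingQueue<Long>()
def ticker = new Timer()
ticker.scheduleAtFixedRate({ limiter.put(System.currentTimeMillis()) } as TimerTask, 200, 200)

// Blocking on limiter.take() before serving each request enforces the ~200 ms spacing,
// much like a blocking receive on a ticker channel.
while (!requests.isEmpty()) {
    def req = requests.take()
    limiter.take()
    println "request ${req} ${new Date()}"
}
ticker.cancel()
```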
Running our program, we see the first batch of requests handled once every ~200 milliseconds, as desired.
For the second batch of requests we serve the first 3 immediately because of the burstable rate limiting, then serve the remaining 2 with ~200ms delays each.
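The burst behaviour can be sketched with a bounded queue that is pre-filled with 3 tokens; the names below (`burstyLimiter`, `burstyRequests`) are illustrative:

```groovy
import java.util.concurrent.LinkedBlockingQueue

// A bounded queue of capacity 3, pre-filled, allows bursts of up to 3 requests.
def burstyLimiter = new LinkedBlockingQueue<Long>(3)
(1..3).each { burstyLimiter.put(System.currentTimeMillis()) }

// Top the limiter up every 200 ms; offer() is a no-op while the queue is full,
// so at most 3 unused tokens can ever accumulate.
def ticker = new Timer()
ticker.scheduleAtFixedRate({ burstyLimiter.offer(System.currentTimeMillis()) } as TimerTask, 200, 200)

// The first 3 of these 5 requests consume the pre-filled tokens immediately;
// the remaining 2 each wait ~200 ms for a fresh token.
def burstyRequests = new LinkedBlockingQueue<Integer>()
(1..5).each { burstyRequests.put(it) }

while (!burstyRequests.isEmpty()) {
    def req = burstyRequests.take()
    burstyLimiter.take()
    println "request ${req} ${new Date()}"
}
ticker.cancel()
```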
In this Groovy version, we use java.util.concurrent.LinkedBlockingQueue to simulate channels and java.util.Timer to create tickers. The overall structure and logic of the rate limiting remain the same, but the implementation details are adapted to Groovy and Java's concurrency utilities.