Rate Limiting in C#

Rate limiting is an important mechanism for controlling resource utilization and maintaining quality of service. C# supports rate limiting using tasks, concurrent collections, semaphores, and timers.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class RateLimiting
{
    static async Task Main()
    {
        // First we'll look at basic rate limiting. Suppose
        // we want to limit our handling of incoming requests.
        // We'll serve these requests off a channel of the
        // same name.
        var requests = new BlockingCollection<int>();
        for (int i = 1; i <= 5; i++)
        {
            requests.Add(i);
        }
        requests.CompleteAdding();
        // This timer will tick every 200 milliseconds.
        // This is the regulator in our rate limiting scheme.
        using var limiter = new PeriodicTimer(TimeSpan.FromMilliseconds(200));

        // By waiting for the timer to tick before serving each request,
        // we limit ourselves to 1 request every 200 milliseconds.
        foreach (var req in requests.GetConsumingEnumerable())
        {
            await limiter.WaitForNextTickAsync();
            Console.WriteLine($"request {req} {DateTime.Now}");
        }
        // We may want to allow short bursts of requests in
        // our rate limiting scheme while preserving the
        // overall rate limit. We can accomplish this by
        // using a semaphore. This burstyLimiter
        // will allow bursts of up to 3 events.
        using var burstyLimiter = new SemaphoreSlim(3, 3);

        // Every 200 milliseconds we'll try to add a new
        // permit to burstyLimiter, up to its limit of 3.
        _ = Task.Run(async () =>
        {
            while (true)
            {
                await Task.Delay(200);
                try
                {
                    burstyLimiter.Release();
                }
                catch (SemaphoreFullException)
                {
                    // Already at the 3-permit cap; drop this permit.
                }
            }
        });
        // Now simulate 5 more incoming requests. The first
        // 3 of these will benefit from the burst capability
        // of burstyLimiter.
        var burstyRequests = new BlockingCollection<int>();
        for (int i = 1; i <= 5; i++)
        {
            burstyRequests.Add(i);
        }
        burstyRequests.CompleteAdding();

        foreach (var req in burstyRequests.GetConsumingEnumerable())
        {
            await burstyLimiter.WaitAsync();
            Console.WriteLine($"request {req} {DateTime.Now}");
        }
    }
}
Running our program, we see the first batch of requests handled once every ~200 milliseconds, as desired.
request 1 5/26/2023 10:30:00 AM
request 2 5/26/2023 10:30:00 AM
request 3 5/26/2023 10:30:00 AM
request 4 5/26/2023 10:30:01 AM
request 5 5/26/2023 10:30:01 AM
For the second batch of requests, we serve the first 3 immediately thanks to the burst capability of the limiter, then serve the remaining 2 with ~200ms delays each.
request 1 5/26/2023 10:30:01 AM
request 2 5/26/2023 10:30:01 AM
request 3 5/26/2023 10:30:01 AM
request 4 5/26/2023 10:30:01 AM
request 5 5/26/2023 10:30:01 AM
In this C# version, we use BlockingCollection<T> to simulate channels, PeriodicTimer for the regular 200-millisecond intervals, and SemaphoreSlim for the bursty limiter. Task.Delay drives the background refill task, and async/await handles the asynchronous waiting throughout.
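
Since BlockingCollection<T> is only standing in for channels here, the bursty limiter can also be built on System.Threading.Channels, which is a closer analogue: a bounded channel of tokens caps the burst size, and a background task refills it. The following is a minimal sketch assuming .NET 6+ (for PeriodicTimer); the class name ChannelBurstyLimiter and the 3-token/200-millisecond figures simply mirror the example above.

using System;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

class ChannelBurstyLimiter
{
    static async Task Main()
    {
        // A bounded channel of tokens caps the burst size at 3.
        var tokens = Channel.CreateBounded<DateTime>(3);

        // Pre-fill the burst allowance.
        for (int i = 0; i < 3; i++)
        {
            tokens.Writer.TryWrite(DateTime.Now);
        }

        // Refill one token every 200 milliseconds. TryWrite returns false
        // when the channel is already full, so extra tokens are simply
        // dropped and the burst cap is preserved.
        _ = Task.Run(async () =>
        {
            using var ticker = new PeriodicTimer(TimeSpan.FromMilliseconds(200));
            while (await ticker.WaitForNextTickAsync())
            {
                tokens.Writer.TryWrite(DateTime.Now);
            }
        });

        // Each request consumes one token before being served.
        for (int req = 1; req <= 5; req++)
        {
            await tokens.Reader.ReadAsync();
            Console.WriteLine($"request {req} {DateTime.Now}");
        }
    }
}

On .NET 7 and later, the built-in System.Threading.RateLimiting APIs (for example TokenBucketRateLimiter) offer the same steady and bursty behavior without a hand-rolled refill loop.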