Approaches to solving application throttling issues in Java

Some time back, I was working on a Java web application that used Tomcat as its web container. The application was used by many end users, and over a period of time we started noticing that some of the application's APIs were receiving extremely high numbers of invocations, which in turn started impacting the application's stability.

So in this blog, I will be discussing some approaches that I came across while looking for ways to solve these throttling issues.

1) Connector element in Tomcat server.xml file

Tomcat server.xml will have a Connector component that listens for connections/requests on a specific TCP port number on the server. Each incoming request requires a thread for the duration of that request. If more simultaneous requests are received than can be handled by the currently available request processing threads, additional threads will be created up to the configured maximum (the value of the maxThreads attribute). If still more simultaneous requests are received, they are stacked up inside the server socket created by the Connector, up to the configured maximum (the value of the acceptCount attribute). Any further simultaneous requests will receive "connection refused" errors until resources are available to process them.
By tuning maxThreads and acceptCount, we can control the number of requests users make to the application. However, this cannot limit invocations of a particular API; it is only useful for capping the overall number of concurrent requests to the application.
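As a sketch, a Connector tuned this way might look like the following (the attribute values here are illustrative, not recommendations):

```xml
<!-- server.xml: accept up to 200 worker threads; queue up to 100
     further connections before refusing new ones -->
<Connector port="8080" protocol="HTTP/1.1"
           maxThreads="200"
           acceptCount="100"
           connectionTimeout="20000"
           redirectPort="8443" />
```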

2) Tomcat Semaphore Valve

The Semaphore Valve limits the number of concurrent request-processing threads. You can enable it with a small change to Tomcat's server.xml, as shown in the code snippet below:
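A minimal configuration would look something like this (the concurrency value is illustrative):

```xml
<!-- server.xml, inside the Host (or Engine/Context) element:
     allow at most 10 requests through the valve at a time -->
<Valve className="org.apache.catalina.valves.SemaphoreValve"
       concurrency="10"
       fairness="true" />
```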

The above changes work, but with some major issues:

  • You cannot control specific paths.
  • When a permit is denied, the client still gets an HTTP 200 response.

To solve this, the valve is designed to be subclassed, and it provides two hooks for exactly these two use cases. The code would look something like this:
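A sketch of such a subclass is below; controlConcurrency() and permitDenied() are the two hooks SemaphoreValve exposes, while the class name, path, and status code are illustrative:

```java
import java.io.IOException;
import javax.servlet.ServletException;
import org.apache.catalina.connector.Request;
import org.apache.catalina.connector.Response;
import org.apache.catalina.valves.SemaphoreValve;

public class ApiThrottlingValve extends SemaphoreValve {

    // Hook 1: decide which requests are subject to the semaphore.
    @Override
    public boolean controlConcurrency(Request request, Response response) {
        // Only throttle a specific path (illustrative).
        return request.getRequestURI().startsWith("/api/heavy");
    }

    // Hook 2: decide what the client sees when no permit is available.
    @Override
    public void permitDenied(Request request, Response response)
            throws IOException, ServletException {
        // Return 503 instead of the default empty 200 response.
        response.sendError(503, "Too many concurrent requests");
    }
}
```

The compiled class then replaces org.apache.catalina.valves.SemaphoreValve in the Valve element of server.xml.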

3) Throttling using Apache Web Server

Apache HTTP Server's own modules don't support rate limiting based on the number of user requests; mod_ratelimit, for example, only limits bandwidth.

To overcome this, one solution is to make use of ModSecurity. I won't be discussing it in detail here, as you can find the details in this blog:

4) API Gateway

An API Gateway makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. An API Gateway not only helps solve rate-limiting issues but also brings additional benefits such as authentication and security, monitoring, dynamic routing, static response handling, etc.
The only challenge is the initial effort involved in adding an API Gateway layer to an existing application.

5) Adding a Custom RateLimit Filter

We can create a custom servlet filter and map it in web.xml to the paths on which we want to limit the number of concurrent requests. The code would look something like this:
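The core of such a filter is a non-blocking semaphore gate; a minimal standalone sketch of that logic (class and method names are illustrative, and the servlet wiring is omitted) could be:

```java
import java.util.concurrent.Semaphore;

// In a real filter, doFilter() would call tryEnter() before
// chain.doFilter() and exit() in a finally block afterwards,
// sending a 503 (or 429) when tryEnter() returns false.
class ConcurrentRequestGate {
    private final Semaphore permits;

    ConcurrentRequestGate(int maxConcurrent) {
        this.permits = new Semaphore(maxConcurrent);
    }

    // Returns true if the request may proceed; false means reject.
    boolean tryEnter() {
        return permits.tryAcquire();
    }

    // Must be called once the request completes, whether it succeeded or not.
    void exit() {
        permits.release();
    }
}
```

The filter itself is then mapped in web.xml with a filter-mapping whose url-pattern matches only the paths to be throttled.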

We can even make use of a cache such as Guava's LoadingCache, where values are loaded automatically by the cache and stay there until evicted or manually invalidated. This way we can handle API throttling per user for a given time interval.
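A sketch of that idea, assuming Guava is on the classpath (the limit and window below are illustrative):

```java
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

class UserRequestCounter {
    private static final int MAX_REQUESTS_PER_MINUTE = 100;

    // Each user's counter is created on first access and evicted one
    // minute after it was written, which resets that user's window.
    private final LoadingCache<String, AtomicInteger> counts =
            CacheBuilder.newBuilder()
                    .expireAfterWrite(1, TimeUnit.MINUTES)
                    .build(new CacheLoader<String, AtomicInteger>() {
                        @Override
                        public AtomicInteger load(String userId) {
                            return new AtomicInteger(0);
                        }
                    });

    // Called from the filter: false means throttle this user.
    boolean allow(String userId) {
        return counts.getUnchecked(userId).incrementAndGet()
                <= MAX_REQUESTS_PER_MINUTE;
    }
}
```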

6) Guava RateLimiter

RateLimiter is another class in the Guava library that can be used for throttling. It is built around permits: each request acquires a permit, and we define how many permits are issued per second. Requests beyond that rate either wait for a permit (with acquire()) or can be rejected immediately (with tryAcquire()). Below is the code snippet:
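A sketch of both styles, again assuming Guava is on the classpath (the rate and class name are illustrative):

```java
import com.google.common.util.concurrent.RateLimiter;

class ThrottledHandler {
    // Issue at most 10 permits per second.
    private final RateLimiter limiter = RateLimiter.create(10.0);

    void handle(Runnable work) {
        // acquire() blocks the caller until a permit becomes available...
        limiter.acquire();
        work.run();
    }

    boolean tryHandle(Runnable work) {
        // ...while tryAcquire() rejects immediately when none is free.
        if (!limiter.tryAcquire()) {
            return false;
        }
        work.run();
        return true;
    }
}
```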

That’s it for now.



Software Engineer at MIQ Digital India Pvt Ltd
