Any experience with Nginx as a load balancer handling traffic?

Sakamoto Ryōma

Verified User
Joined
Jun 4, 2011
Messages
69
I want to know how many concurrent users a 1 CPU/1 GB RAM VPS (Vultr or DigitalOcean cloud) can handle.
I'm not asking how much traffic my app can handle, as that depends on the app; I want to know the average limits of an nginx server acting as a load balancer. So assume the servers behind the load balancer have unlimited resources: how many concurrent users can the nginx load balancer handle? Any experience will help: 1 CPU/1 GB memory, 1 CPU/2 GB, 2 CPU/4 GB, etc.
 
Hello,

I'd say the question isn't quite right. Nginx does not count users, and its performance is not limited by users, but rather by the number of concurrent connections the webserver and the network device can handle. Each NGINX worker can handle up to 512-1024 concurrent connections. Concurrent connections do not necessarily equal the number of concurrent users, and we don't know how many concurrent connections each of your users will open.

Check https://www.nginx.com/blog/tuning-nginx/ for more information.
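For reference, the directives that set this limit look roughly like the following in nginx.conf; the numbers are illustrative, not a tuned recommendation:

```nginx
# nginx.conf (illustrative values, not a tuned recommendation)

# One worker process per CPU core; "auto" detects the core count.
worker_processes auto;

events {
    # Maximum simultaneous connections each worker may hold open.
    worker_connections 1024;
}
```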
 
Basically, I tested an nginx server serving only a static-HTML website with a 500-concurrent-user test from loader.io on a 1 CPU/1 GB memory VPS. At 500 concurrent users the response time was still around 100 ms, but past 500 concurrent users it increased to 2 seconds. What I want to know is: if I use an nginx server in front of other servers as a load balancer, how many concurrent users can it redirect to the app servers (let's assume the app servers have unlimited resources)? I'm planning to build a global network of servers, and some datacenters don't offer autoscaling. The reason I'm asking is that I want to automatically add edge servers in the nearest datacenter that has an API for adding servers, by switching an nginx server into load-balancer mode when the server load passes a certain point.
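To make "load-balancer mode" concrete, a minimal sketch of such a setup could look like the config below; the upstream hostnames are hypothetical placeholders, not real servers:

```nginx
# Minimal reverse-proxy / load-balancer sketch.
# The app-server names below are hypothetical placeholders.
http {
    upstream app_servers {
        server app1.example.internal:8080;
        server app2.example.internal:8080;
        # More backends could be appended here by an autoscaling script.
    }

    server {
        listen 80;

        location / {
            # Forward each request to one of the backends
            # (round-robin by default).
            proxy_pass http://app_servers;
        }
    }
}
```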
 
Last edited:
If I build a 3-meter-wide road, how many cars can use it? Sorry, but your question sounds the same to me. I've never seen a calculator for your case; this is not math. Too many unknown variables.

First of all, nginx as a balancer (even if you have 2+) won't make your application work faster or let it serve more concurrent users if you have only one backend server and no content caching on nginx's side.

Secondly, if a backend server is slow to respond, your load balancer will keep at least 2x the connections open: one to the client and one to the backend for each request.

Thirdly, modern browsers open multiple connections, which means every user will have multiple connections to your server: CSS, JS, images, dynamic content, WebSockets, etc.

The best way to find answers is to set up a test environment and try it. And loader.io is not the best tool for it, I believe.
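If you want to roll your own quick test instead of loader.io, a minimal concurrent load-test sketch in Python could look like this; the target URL is a placeholder you would point at your own server:

```python
# Minimal concurrent load-test sketch (a rough stand-in for loader.io).
# The URL passed to load_test is a placeholder for your own server.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen


def hit(url: str) -> float:
    """Fetch the URL once and return the response time in seconds."""
    start = time.monotonic()
    with urlopen(url) as resp:
        resp.read()
    return time.monotonic() - start


def load_test(url: str, concurrency: int, requests: int) -> float:
    """Fire `requests` total requests with `concurrency` parallel
    workers; return the average response time in seconds."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        times = list(pool.map(hit, [url] * requests))
    return sum(times) / len(times)


# Example usage (placeholder URL):
# avg = load_test("http://127.0.0.1:8080/", concurrency=50, requests=500)
# print(f"average response time: {avg * 1000:.1f} ms")
```

This only measures average latency from one client machine, so it understates what a distributed tool can generate, but it is enough to see the knee in the curve as you raise the concurrency.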
 
"First of all, nginx as a balancer (even if you have 2+) won't make your application work faster or let it serve more concurrent users if you have only one backend server and no content caching on nginx's side."
I will not use the load balancer to make the app faster. Its role is just redirecting traffic to multiple servers. Let's say I have 10 servers, each able to handle as much traffic as the load balancer can pass on. So I'm not after the performance of the load balancer or the app, but rather the limit of a load balancer redirecting traffic to the other servers that do the hard work.

"Thirdly, modern browsers open multiple connections, which means every user will have multiple connections to your server: CSS, JS, images, dynamic content, WebSockets, etc."

As far as I know, HTTP/2 (which is quite standard nowadays) downloads all content over one connection. https://stackoverflow.com/questions/36517829/what-does-multiplexing-mean-in-http-2

"The best way to find answers is to set up a test environment and try it. And loader.io is not the best tool for it, I believe."

You are right, it is best to test, but I thought there might be someone with experience of a similar setup who could give some advice with numbers.
 
Search bots, spam bots, other scanners, old browsers, etc. will still use HTTP/1.x. How many requests will you have over HTTP/1.x? And how many over HTTP/2?

If you are going to handle only HTTP/2, then you can base your estimate on the statement:

- Each NGINX worker can handle up to 512-1024 concurrent connections

And 1 CPU on a server allows 1 worker. See https://www.nginx.com/blog/tuning-nginx/

Thus an NGINX server with 1 CPU would handle up to 512-1024 concurrent connections. That is the theory; a live server might give different results.
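To put a rough number on the proxying case, here is a back-of-the-envelope calculation, assuming (as noted earlier in the thread) that each proxied client request holds two connections open, one on the client side and one on the upstream side:

```python
# Rule-of-thumb capacity estimate for NGINX acting as a reverse proxy.
# Assumption: each proxied client request keeps two connections open,
# one from the client to NGINX and one from NGINX to the backend.

def max_proxied_clients(worker_processes: int, worker_connections: int) -> int:
    """Theoretical upper bound on simultaneously proxied clients."""
    return (worker_processes * worker_connections) // 2

# 1 CPU -> 1 worker; worker_connections at a common value of 1024:
print(max_proxied_clients(1, 1024))  # 512
```

Again, this is only a theoretical ceiling; file-descriptor limits, memory, and backend latency will all pull the real number down.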
 
Yes, I need to set up a test environment and test it myself. When I did the 500-concurrent-user test with loader.io, I checked the server load: CPU was around 20%, but it probably hit the memory limit. That was for serving a static HTML file. In load-balancer mode I expect something better than this, but I don't know how much better.
 