# Using Nginx As HTTP Load Balancer

Last Updated: 05 Mar, 2024

Load balancing is a technique used in modern application architectures to build fault-tolerant systems. A load balancer is a networking component that distributes incoming traffic across multiple backend servers so that no single server is overloaded, keeping the overall system responsive and fault-tolerant. Nginx is an enterprise-grade web server that is widely used across the industry. In this article, we will see how to configure Nginx for HTTP load balancing.

## Primary Terminologies

- **Load Balancer:** A device or service that distributes incoming network traffic across multiple servers. The goal is to ensure no single server bears too much demand, preventing performance degradation and improving overall availability and reliability.
- **Upstream Servers:** The backend servers that receive traffic from the load balancer. Nginx distributes incoming requests among these servers based on the configured load-balancing algorithm.

## Using Nginx as an HTTP Load Balancer

**Note:** For this article, we will create virtual machines in Azure to use as servers. You can also use local machines for the configuration.

### Step 1: Create and set up servers

Create three virtual machines in Azure. One machine will act as the load-balancing server, while the other two will act as backend servers. Allow inbound access on port 22 (SSH) and port 80 (HTTP), and set a username and password. Leave everything else at its defaults, then review and create. Once all three machines are ready, proceed with the next steps.

### Step 2: Install and configure Nginx

SSH into each machine using its public IP address, then update the package index on each machine:

```
sudo apt update
```

Once the update is done, install the Nginx server:

```
sudo apt install nginx
```

You can check the status of Nginx using:

```
systemctl status nginx
```

You can also verify the installation by opening the public IP address in a browser.
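The browser check above can also be automated. Here is a minimal Python sketch that probes a host and port over TCP; the helper name `is_reachable` and the commented-out example addresses are illustrative, not part of Nginx:

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (substitute your backends' public IPs):
# for ip in ["<PUBLIC IP OF SERVER 1>", "<PUBLIC IP OF SERVER 2>"]:
#     print(ip, "up" if is_reachable(ip, 80) else "down")
```

A probe like this only confirms that something is listening on port 80; the browser check remains useful to confirm it is actually Nginx serving the page.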
You should see the Nginx default landing page.

### Step 3: Configure the Nginx web pages

Let's add some identifying text to each backend's web page so we can tell the machines apart. Run the following commands as the root user; switch to root with:

```
sudo su
```

Go to the /var/www/html directory and open the index page that is available there by default:

```
nano index.nginx-debian.html
```

Remove the extra lines from the body of the HTML and put in a short message that identifies the server:

```
<h1>Hello From Server 1</h1>
```

Configure both backend machines this way. Now open the public IP of each machine in a browser; you should see the message you configured.

### Step 4: Configure the load balancer

On the machine where you want to configure the load balancer, go to /etc/nginx and open nginx.conf in your favourite editor:

```
nano nginx.conf
```

In the http block, comment out the following two lines:

```
include /etc/nginx/conf.d/*.conf;
include /etc/nginx/sites-enabled/*;
```

Now add an upstream block named backend, which specifies the load-balancing algorithm and the backend servers. We will use the least_conn algorithm for load balancing. The block should look like this:

```
upstream backend {
    least_conn;
    server <PUBLIC IP OF SERVER 1>;
    server <PUBLIC IP OF SERVER 2>;
}
```

Next, add a server block that passes the load balancer's traffic to the backend servers:

```
server {
    listen 80;
    server_name <PUBLIC IP OF LOAD BALANCER>;

    location / {
        proxy_pass http://backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Save and close the configuration file, then restart Nginx:

```
systemctl restart nginx
```

### Step 5: Test the load balancer

Now open the load balancer's IP in a browser and you should see the Server 1 message. Refresh the page and you should see the Server 2 message. The messages will keep alternating as the load balancer splits the traffic across servers 1 and 2.

## Conclusion

Thus, we have configured an HTTP load balancer with the help of the Nginx server.
We split the traffic across two backend servers; more can be added by extending the upstream block. Further configuration can be applied to adapt the load balancer for other purposes.

## FAQs

**Are there alternatives to Nginx for HTTP load balancing?**

Yes, there are alternative load-balancing solutions, including HAProxy, Apache HTTP Server with mod_proxy, and cloud-specific load balancers provided by cloud service providers (e.g., AWS Elastic Load Balancing, Azure Load Balancer).

**Can I use Nginx as a load balancer in a microservices architecture?**

Yes, Nginx is well suited for load balancing in a microservices environment. It can distribute traffic among multiple microservices, providing flexibility and scalability in handling diverse workloads.

**How do I monitor and troubleshoot Nginx load balancing?**

Nginx provides various log files for monitoring and diagnostics. Additionally, external monitoring tools, Nginx status modules, and the error logs can be valuable for identifying and resolving issues. Regularly reviewing logs and metrics is essential for effective troubleshooting.

**Is Nginx suitable for large-scale deployments and high-traffic websites?**

Yes, Nginx is renowned for its ability to handle a large number of concurrent connections and high traffic volumes. It is widely used by websites and applications with high traffic demands due to its efficiency and low resource usage.

**How does Nginx handle load balancing?**

Nginx uses load-balancing algorithms (e.g., round-robin, least_conn, ip_hash) to distribute incoming requests among a group of backend servers defined in the configuration. The proxy_pass directive is commonly used to forward requests to the specified upstream group.

Author: deepcodr
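The algorithms named in the last answer can be sketched in a few lines of Python. This is a simplified illustration, not Nginx's implementation: for example, real ip_hash hashes only the first three octets of an IPv4 address, and real least_conn also accounts for server weights. The class name and backend addresses below are made up:

```python
import hashlib
from itertools import cycle

backends = ["10.0.0.4", "10.0.0.5"]  # hypothetical backend addresses

# Round-robin: hand requests to each backend in turn.
round_robin = cycle(backends)

# ip_hash-style stickiness: the same client IP always maps to the same backend.
def pick_by_ip(client_ip: str, servers: list) -> str:
    digest = hashlib.md5(client_ip.encode()).digest()
    return servers[digest[0] % len(servers)]

# least_conn: route each request to the backend with the fewest active connections.
class LeastConnBalancer:
    def __init__(self, servers):
        self.active = {s: 0 for s in servers}

    def acquire(self) -> str:
        server = min(self.active, key=self.active.get)  # ties -> first listed
        self.active[server] += 1
        return server

    def release(self, server: str) -> None:
        self.active[server] -= 1
```

With short, evenly sized requests, least_conn behaves much like round-robin, which is why refreshing the browser in Step 5 alternates between the two servers; it pays off when some requests are long-lived, while the hash scheme trades even spread for session stickiness.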