
VISVESVARAYA TECHNOLOGICAL UNIVERSITY

“Jnana Sangama”, Belagavi-590018, Karnataka, India

A TECHNICAL SEMINAR REPORT ON

“SERVERLESS COMPUTING: THE FUTURE OF CLOUD”
Submitted in partial fulfilment of the requirements
For the Eighth Semester Bachelor of Engineering Degree
SUBMITTED BY
Monica K (1IC21CD002)

Under the guidance of


Dr. Kaipa Sandhya
Professor & Head
Dept. of CSE (CD)

IMPACT COLLEGE OF ENGINEERING AND APPLIED SCIENCES


Sahakarnagar, Bangalore-560092

2024-2025
IMPACT COLLEGE OF ENGINEERING AND APPLIED SCIENCES
Sahakarnagar, Bangalore-560092

DEPARTMENT OF DATA SCIENCE

CERTIFICATE
This is to certify that the Technical Seminar entitled "Serverless Computing: The Future of
Cloud", carried out by Monica K (1IC21CD002), a bonafide student of Impact College of
Engineering and Applied Sciences, Bangalore, has been submitted in partial fulfilment of the
requirements of the VIII semester Bachelor of Engineering degree in Computer Science &
Engineering (Data Science) as prescribed by VISVESVARAYA TECHNOLOGICAL
UNIVERSITY during the academic year 2024-2025.

Signature of the Guide Signature of the HoD Signature of the Principal

Dr. Kaipa Sandhya Dr. Kaipa Sandhya Dr. Jalumedi Babu


Prof. & Head Prof. & Head ICEAS, Bangalore
Dept. of CSE (CD) Dept. of CSE (CD)
ICEAS, Bangalore. ICEAS, Bangalore.

Name of Examiner Signature with date

1. 1.

2. 2.
ACKNOWLEDGEMENT
The satisfaction and euphoria that accompany the successful completion of any task would
be incomplete without the mention of the people who made it possible and whose constant
encouragement and guidance crowned my efforts with success.

I am proud to be part of the Impact College of Engineering and Applied Sciences family, the
institution which stood by us in our endeavor.

I am grateful to our guide Dr. Kaipa Sandhya, Head of the Department of Computer Science
and Engineering (Data Science), Impact College of Engineering and Applied Sciences,
Bangalore, for guiding us and correcting our various documents with attention and care.
She has taken a lot of effort to go through the document and make the necessary corrections
as and when needed. She has also been a source of inspiration and of invaluable help in
channelizing our efforts in the right direction.

I express my deep and sincere thanks to our Management and Principal, Dr. Jalumedi
Babu for their continuous support.

I would like to thank the faculty members and supporting staff of the Department of
Computer Science and Engineering (Data Science), ICEAS, for providing all the support
for completing the Technical Seminar.

Finally, I am grateful to my parents and friends for their unconditional support and help
during the course of my Technical Seminar.

Monica K(1IC21CD002)

ABSTRACT

Serverless computing is a cloud computing execution model that allows developers to build and
run applications without managing infrastructure. In this model, cloud providers automatically
allocate resources and execute code in response to events, enabling scalability, cost efficiency,
and ease of deployment. Unlike traditional cloud computing, serverless architectures eliminate
the need for server provisioning and maintenance, allowing developers to focus solely on
writing code. This approach is widely used in real-time applications such as IoT, machine
learning, and API backends. Popular platforms like AWS Lambda, Azure Functions, and
Google Cloud Functions offer seamless integration with cloud services, enhancing performance
and reliability. However, serverless computing also presents challenges such as cold start
latency, vendor lock-in, and limited execution time.

CONTENTS

ACKNOWLEDGEMENT i
ABSTRACT ii

CHAPTER No. TITLE PAGE NO.

1 INTRODUCTION 1
2 BACKGROUND AND EVOLUTION OF SERVERLESS COMPUTING 4
2.1 EVOLUTION OF CLOUD 5
3 HISTORY 9
4 ARCHITECTURE OF SERVERLESS COMPUTING 12
5 WORKING MECHANISM 14
6 TRADITIONAL COMPUTING VS. SERVERLESS COMPUTING 17
7 ADVANTAGES OF SERVERLESS COMPUTING 19
8 DISADVANTAGES OF SERVERLESS COMPUTING 21
9 USE CASES OF SERVERLESS COMPUTING 22
10 REAL TIME EXAMPLES OF SERVERLESS COMPUTING 23
11 FUTURE TRENDS IN SERVERLESS COMPUTING 24

CONCLUSION 25
REFERENCES 26
FIGURE NO. FIGURE NAME PAGE NO.

Fig 1.1 Serverless Computing 2
Fig 2.1 Evolution of Cloud 7
Fig 4.1 Serverless Architecture 12
Fig 5.1 Working Mechanism 15
Fig 6.1 Traditional Computing vs. Serverless Computing 17


Serverless Computing: The Future of Cloud

CHAPTER 1
INTRODUCTION
Cloud computing is the delivery of computing services including servers, storage, databases,
networking, software, and analytics over the internet (the “cloud”). It allows individuals and businesses
to access and use computing resources on demand without owning or maintaining physical
infrastructure.

Serverless computing is a modern cloud computing model that allows developers to build and deploy
applications without managing infrastructure. Unlike traditional computing, where servers must be
provisioned, maintained, and scaled manually, serverless computing abstracts infrastructure
management, enabling automatic resource allocation based on demand. Cloud providers, such as AWS
Lambda, Google Cloud Functions, Azure Functions, and IBM Cloud Functions, handle execution,
scaling, and maintenance, allowing developers to focus solely on writing code.

The term "serverless" does not mean that servers are absent; rather, it signifies that the underlying
infrastructure is fully managed by cloud providers, removing the operational burden from users.
Serverless computing is event-driven, meaning applications run only when triggered by events such as
HTTP requests, database changes, or file uploads. This model follows a pay-per-use pricing structure,
where organizations only pay for the actual execution time of their functions, leading to significant cost
savings.
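The pay-per-use pricing described above can be sketched with some illustrative arithmetic. In the sketch below, billing follows the common FaaS pattern of charging per GB-second of compute plus a small per-request fee; the actual rates used are hypothetical placeholders, not current pricing from any provider.

```python
# Illustrative pay-per-use cost model for a FaaS platform.
# Both rates below are hypothetical placeholders, not real provider pricing.
PRICE_PER_GB_SECOND = 0.0000166667   # hypothetical $ per GB-second of compute
PRICE_PER_REQUEST = 0.0000002        # hypothetical $ per invocation

def monthly_cost(invocations, avg_duration_s, memory_gb):
    """Cost = compute time billed in GB-seconds + a per-request fee."""
    gb_seconds = invocations * avg_duration_s * memory_gb
    return gb_seconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST

# 1 million invocations, 200 ms each, 0.5 GB of memory:
cost = monthly_cost(1_000_000, 0.2, 0.5)
print(round(cost, 2))  # about $1.87 under these assumed rates
```

The key point the arithmetic makes is that an idle function costs nothing: with zero invocations the bill is zero, unlike an always-on server.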

Serverless computing offers several advantages, including automatic scalability, reduced operational
complexity, and cost efficiency. However, it also has challenges, such as cold start latency, limited
execution time per function, and vendor lock-in. Despite these limitations, serverless computing has
become increasingly popular for building microservices, APIs, real-time applications, and event-driven
workflows. By eliminating infrastructure management, serverless computing enhances agility and
accelerates software development, making it a preferred choice for modern cloud-native applications.

The world of cloud computing is undergoing a transformation, with serverless architecture emerging as a
game-changer. Serverless computing is a model in which cloud providers handle the underlying
infrastructure, including provisioning, scaling, and maintenance. Instead of setting up and maintaining
physical or virtual machines, providers like AWS, Google Cloud, and Microsoft Azure handle everything
for you. Developers write code in small, independent units called "functions," which execute only when
triggered by specific events. This means developers can focus on writing code and building applications
rather than worrying about server management.

Dept. of CSE (CD), ICEAS 2024-2025 1



With serverless computing:

• You write your code, and the cloud provider runs it whenever needed.

• You don’t have to worry about setting up, maintaining, or scaling servers.

• The system automatically adjusts resources based on demand.

• You only pay when your code runs, making it cost-effective.

With serverless computing, applications respond to events, such as user requests, database updates, or
file uploads, and execute only when needed. This event-driven model ensures that resources are
efficiently used, reducing costs and improving scalability. Since companies only pay for actual execution
time, serverless computing is highly cost-effective compared to traditional cloud models, where
businesses must pay for idle server capacity.
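The event-driven model described above can be made concrete with a minimal sketch of a function, loosely following the AWS Lambda Python handler convention of `handler(event, context)`. The event shape used here (an API-gateway-style HTTP request) is illustrative only, and the local call at the bottom stands in for what the platform would do when a trigger fires.

```python
import json

# A minimal event-driven function. On a real platform the cloud provider
# invokes this handler when a trigger (e.g., an HTTP request) occurs.
def handler(event, context=None):
    # Pull an optional "name" query parameter from the illustrative event.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local simulation of a trigger: we call the handler directly with a
# sample event, the way a provider would on an incoming request.
sample_event = {"queryStringParameters": {"name": "cloud"}}
response = handler(sample_event)
print(response["statusCode"])        # 200
print(json.loads(response["body"]))  # {'message': 'Hello, cloud!'}
```

Nothing runs between invocations; the code exists only as a function definition until an event arrives, which is exactly why billing can be per-execution.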

Fig 1.1: Serverless Computing

When considering the "main groups" in serverless computing, it's most accurate to categorize them by
their primary function and influence within the ecosystem.

• Cloud Providers: Each cloud provider offers unique serverless computing solutions, enabling
businesses to build scalable and cost-effective applications. Choosing the right provider depends
on factors like integration needs, pricing, and ecosystem compatibility.

• Open-Source Communities: Open-source communities and projects support serverless
computing by providing frameworks, tools, and platforms for developers who want flexibility
beyond proprietary cloud provider solutions. They play a vital role in advancing serverless
computing by offering vendor-neutral, flexible, and scalable solutions.


• Industry and Professional Organizations: Industry and professional organizations play a
significant role in advancing serverless computing by setting standards, conducting research,
and fostering collaboration. These organizations help shape the future of cloud computing,
including serverless technologies, and are crucial to the development, adoption, and
standardization of serverless computing. They help businesses and developers stay up to date
with best practices, security guidelines, and emerging trends in cloud computing.


CHAPTER 2

BACKGROUND AND EVOLUTION OF SERVERLESS COMPUTING

Serverless computing evolved as a response to the limitations of traditional cloud services. It emerged from
the need for a more efficient, scalable, and cost-effective computing paradigm, one that allows developers to
focus solely on application logic while cloud providers handle all infrastructure-related tasks. Unlike
traditional models, where users must provision, configure, and maintain servers, serverless computing
abstracts away these complexities, automatically scaling resources based on demand and charging users
only for the execution time of their applications.
Before serverless computing, cloud providers primarily offered two models:
• Infrastructure as a Service (IaaS): IaaS provides virtualized computing resources over the
internet, allowing users to rent servers, storage, and networking infrastructure on a pay-as-you-go
basis. With IaaS, developers have full control over virtual machines, operating systems, and
applications but are responsible for managing scaling, security, and maintenance. While this model
eliminated the need for physical hardware, it still required significant administrative overhead, as
developers had to configure load balancers, monitor performance, and handle system failures
manually. Examples include Amazon EC2, Microsoft Azure Virtual Machines, and Google
Compute Engine.
• Platform as a Service (PaaS): PaaS provides a higher level of abstraction than IaaS by offering a
fully managed environment for application development and deployment. Cloud providers manage
the underlying infrastructure, operating systems, and runtime environments, allowing developers to
focus on writing code. PaaS simplifies application hosting by handling scalability, operating
system patches, and database management. However, developers still need to configure application
scaling, integrate third-party services, and manage dependencies. Examples include Google App
Engine, Microsoft Azure App Services, and AWS Elastic Beanstalk.
• The Shift to Serverless Computing: While IaaS and PaaS significantly reduced the need for
physical hardware management, they still required developers to allocate resources, configure
scaling rules, and maintain application infrastructure. This gap led to the development of Function
as a Service (FaaS), a core component of serverless computing that enables developers to execute
individual functions in response to events without provisioning or managing servers. The
introduction of serverless computing further simplified cloud application development by
eliminating the need for managing servers altogether.

2.1 Evolution of Cloud
Serverless computing has evolved significantly over the years, building upon traditional cloud models and
transforming the way applications are deployed and managed. The key milestones in this evolution, which
show how serverless computing has continuously advanced toward greater efficiency, automation, and
scalability, are as follows:
• 2006 – AWS EC2 and the Start of Modern Cloud Computing: Amazon Web Services (AWS)
launched Amazon EC2 (Elastic Compute Cloud), introducing the pay-as-you-go model for virtual
servers. This eliminated the need for businesses to maintain physical infrastructure, allowing them
to scale resources based on demand. However, EC2 still required users to manage virtual
machines, operating systems, and scaling configurations manually. While EC2 significantly
reduced the complexity of hardware provisioning, it still placed the burden of server management
and maintenance on developers and IT teams. Businesses had to predict resource requirements,
configure load balancers, and manually set up redundancy mechanisms to ensure uptime and
reliability. These challenges paved the way for the emergence of more abstracted cloud services,
such as Platform as a Service (PaaS) and eventually Serverless Computing, which further
simplified infrastructure management and improved development efficiency.
• 2008 – Google App Engine and the Rise of PaaS: Google introduced Google App Engine, an
early Platform as a Service (PaaS) solution that simplified cloud application deployment.
Developers could deploy applications without managing infrastructure, but they still needed to
configure scaling and optimize performance manually, limiting full automation in cloud
computing. GAE provided developers with a fully managed platform to build, deploy, and scale
applications without having to manage the underlying infrastructure. Unlike Infrastructure as a
Service (IaaS) offerings like Amazon EC2, which required users to configure and maintain virtual
machines, App Engine abstracted infrastructure complexities by automatically provisioning
resources, handling load balancing, and managing runtime environments.
• 2014 – AWS Lambda and the Birth of Serverless Computing: A major breakthrough in cloud
computing came in 2014 when AWS introduced AWS Lambda, marking the rise of Function as a
Service (FaaS) and a fundamental shift toward true serverless computing. AWS Lambda
revolutionized cloud application development by allowing developers to run individual functions
in response to specific events without provisioning, managing, or maintaining servers. This event-
driven computing model significantly reduced operational complexity and infrastructure costs, as
users were billed only for the compute time consumed, eliminating the need for always-on servers.

• 2015-2016 – Expansion of Serverless with Google & Microsoft: Serverless computing gained
significant momentum with the introduction of Google Cloud Functions (2016) and Microsoft
Azure Functions (2016), both of which provided event-driven execution models similar to AWS
Lambda. These services allowed developers to run code in response to events without worrying
about server management, enabling greater scalability, flexibility, and cost efficiency. As a result,
competition in the Function as a Service (FaaS) market intensified, driving innovation and
accelerating adoption across various industries.
• 2017-2018 – Emergence of Serverless Databases & Storage: Advancements in serverless
computing led to the introduction of serverless databases and storage solutions, enabling fully
serverless architectures. Services like AWS Aurora Serverless allowed databases to automatically
scale based on workload demand, eliminating manual capacity planning. Additionally, cloud
providers improved storage solutions like AWS S3 and Firebase Storage, which seamlessly
integrated with serverless applications. These developments enabled businesses to build efficient,
cost-effective backend systems without managing infrastructure.
• 2019-2021 – Integration with Kubernetes & Edge Computing: Serverless computing expanded
beyond cloud environments with integration into Kubernetes and edge computing. Technologies
like Knative and OpenFaaS allowed organizations to run serverless workloads on Kubernetes,
enabling hybrid cloud strategies across on-premises and multi-cloud environments. Meanwhile,
edge computing services like Cloudflare Workers and AWS Lambda@Edge allowed functions to
process data closer to users, reducing latency and improving performance. While traditional
serverless solutions (e.g., AWS Lambda, Google Cloud Functions, and Azure Functions) operate
within a specific cloud provider’s ecosystem, enterprises sought greater control, portability, and
multi-cloud capabilities. To address this, technologies like Knative, OpenFaaS, and Kubeless
extended serverless computing into Kubernetes-based environments.
• 2022-Present – AI-Powered Automation & Future Trends: The latest advancements in
serverless computing focus on AI-powered automation and machine learning integration. Cloud
providers now offer intelligent auto-scaling mechanisms that predict workload demands,
improving efficiency and reducing cold start latency. Serverless computing is also playing a key
role in real-time data processing, AI inference, and IoT applications, enabling businesses to build
highly scalable, event-driven applications with minimal manual intervention. One of the major
challenges in traditional serverless computing has been cold start latency—the delay when a
function is invoked after being idle for a period of time. AI-powered auto-scaling mechanisms
have emerged to address this issue by using predictive scaling to anticipate workload demands and
proactively provision resources.


Fig 2.1: Evolution of Cloud

The transition from traditional cloud models to serverless computing has been driven by several key
factors, including cost efficiency, operational simplicity, scalability, and faster development cycles.
Unlike traditional Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) models, where
developers were responsible for managing servers, scaling, and infrastructure configurations, serverless
computing abstracts these complexities and allows developers to focus purely on application logic. With
advancements in AI-driven automation, hybrid cloud deployments, and decentralized computing,
serverless is becoming a foundational component of cloud-native application development.
Organizations are adopting serverless for a wide range of applications, from real-time analytics to
scalable backend systems, ensuring a future-proof and cost-effective computing paradigm.

• Pre-Cloud (Traditional Computing): Before cloud computing, organizations managed


physical servers in on-premises data centers, requiring extensive hardware investments,
maintenance, and manual provisioning. These traditional setups demanded dedicated IT teams
to handle server administration, network security, and software updates, leading to high
operational costs and complexity.

• Cloud 1.0 (Virtualization & Cloud Computing): Cloud computing introduced virtual
machines (VMs), allowing users to rent computing resources instead of owning physical servers.
This shift eliminated the need for on-premises infrastructure, reducing capital expenditure
(CapEx) while offering greater flexibility and scalability. Organizations could provision VMs
on-demand, paying only for the resources they used, rather than investing in expensive hardware
that required ongoing maintenance.


• Serverless Computing (Cloud 2.0): Serverless computing eliminates infrastructure


management by running event-driven functions that automatically scale based on demand.
Unlike traditional computing models, where developers must provision and maintain servers,
serverless architectures allow cloud providers to handle the provisioning, execution, scaling, and
maintenance of resources. This significantly reduces operational complexity, allowing
developers to focus entirely on writing and deploying code.

Serverless computing represents a transformative shift in cloud technology, offering a more efficient,
cost-effective, and scalable alternative to traditional computing models. By eliminating the need for
server management, organizations can focus on innovation and accelerate application development
without worrying about infrastructure maintenance. This transition from on-premises data centres (Pre-
Cloud) to Virtualized Cloud Computing (Cloud 1.0), and ultimately to Serverless Architectures (Cloud
2.0) highlights the continuous evolution of computing towards greater automation, flexibility, and cost
savings. With the growing adoption of AI-driven automation, hybrid cloud strategies, and edge
computing, serverless architectures are set to become even more integral to modern cloud-native
applications. Businesses leveraging serverless computing benefit from faster deployment cycles,
seamless scalability, and optimized resource utilization, making it a key driver of digital
transformation. As cloud providers continue to enhance serverless offerings, addressing challenges
such as cold start latency and vendor lock-in, the future of serverless computing looks promising,
paving the way for next-generation applications and services.


CHAPTER 3
HISTORY

The term 'serverless' can be traced to its original meaning of not using servers, typically referring to peer-
to-peer (P2P) software or client-side-only solutions. Serverless appears to be the natural progression
following recent advancements in, and adoption of, VM and container technologies, where each step up the
abstraction layers led to more lightweight units of computation in terms of resource consumption, cost, and
speed of development and deployment. Serverless platforms can be considered an evolution of Platform-as-
a-Service (PaaS) as provided by platforms such as Cloud Foundry, Heroku, and Google App Engine
(GAE).
Serverless computing has evolved alongside cloud computing, transforming how applications are built and
deployed. Below is a timeline of its development, from the early days of cloud computing to modern
serverless architectures.
• The pre-Serverless Era (2000s – 2013): Before the advent of serverless computing, cloud
providers primarily relied on Infrastructure as a Service (IaaS) and Platform as a Service (PaaS)
models, which required developers to manage virtual machines, networking, and scaling. In 2006,
Amazon Web Services (AWS) launched Amazon EC2 (Elastic Compute Cloud), marking the
beginning of cloud computing by allowing businesses to rent virtual servers on demand. Later, in
2008, Google introduced Google App Engine, an early form of PaaS, which simplified cloud
application deployment by providing a managed environment. Although these cloud services
significantly reduced the need for physical infrastructure, developers were still responsible for
managing servers, configuring scaling, and optimizing resource usage manually, leading to the
evolution of serverless computing as a fully managed cloud model.
• Birth of Serverless Computing (2014 – 2016): The concept of serverless computing gained
mainstream attention with the introduction of Function as a Service (FaaS) platforms by major
cloud providers. In 2014, AWS launched Lambda, allowing developers to run functions in
response to events without provisioning servers, marking the first major step toward serverless
computing. Following this, in 2016, Google introduced Cloud Functions, offering a direct
competitor to AWS Lambda and accelerating the adoption of event-driven applications. In the
same year, Microsoft Azure Functions entered the market, expanding serverless computing into the
Microsoft cloud ecosystem. This period saw rapid adoption as organizations recognized the cost
efficiency, scalability, and operational simplicity that serverless architectures provided.

• Expansion and Adoption (2017 – 2020): With the increasing adoption of serverless computing,
new advancements emerged to enhance its capabilities. In 2017, the introduction of the Serverless
Framework, an open-source tool, simplified serverless application development, making it easier
for developers to build and deploy applications. Additionally, AWS launched Aurora Serverless,
enabling automatic scaling for cloud databases without manual intervention. By 2018, companies
began embracing multi-cloud serverless solutions to avoid vendor lock-in, ensuring flexibility
across different cloud providers. This period also saw the rise of edge computing, with Cloudflare
Workers extending serverless capabilities to distributed environments, reducing latency by
running functions closer to users. In 2019, serverless computing further evolved with Kubernetes
integration, as frameworks like Knative and OpenFaaS enabled serverless workloads on
Kubernetes, combining the benefits of containerization and serverless architecture. During this
time, hybrid serverless computing also gained popularity, especially in enterprise environments,
offering greater flexibility and control over cloud and on-premises workloads.
• Serverless in the AI and Edge Era (2021 – Present): Serverless computing has now become a
crucial component of modern cloud architecture, seamlessly integrating with AI, machine
learning, and edge computing. In 2021, cloud providers introduced serverless AI inference,
enabling real-time machine learning applications without requiring developers to manage
infrastructure. By 2022, serverless containers gained further traction through offerings such as
AWS Fargate and Google Cloud Run, which allow containerized applications to run without server
management. During this period, microservices-based applications increasingly adopted
serverless computing, enhancing scalability and efficiency. From 2023 to the present, serverless
edge computing has become a reality with platforms like Cloudflare Workers, Fastly
Compute@Edge, and AWS Lambda@Edge, reducing latency by running functions closer to
users. Additionally, AI-powered automation in serverless computing has further optimized
scaling and execution performance, making applications more efficient and intelligent.
• The Future of Serverless Computing: The future of serverless computing is expected to bring
greater intelligence, decentralization, and expanded use cases. AI-driven auto-scaling will
enhance workload management, allowing cloud platforms to dynamically allocate resources
based on real-time demands, optimizing performance and cost. Additionally, decentralized
serverless computing will emerge, enabling applications to run on blockchain-based decentralized
networks, reducing reliance on traditional cloud providers. Enterprises will also leverage hybrid
and multi-cloud serverless architectures, combining serverless computing with on-premises
infrastructure and multiple cloud providers to increase flexibility and avoid vendor lock-in.

Serverless computing has come a long way from traditional cloud infrastructure models like IaaS and
PaaS. It has revolutionized the way applications are developed, deployed, and scaled. As AI, edge
computing, and hybrid cloud architectures evolve, serverless computing is set to become an even more
dominant force in modern cloud environments. From its early days with IaaS and PaaS to its
transformation into serverless computing, the cloud industry has undergone significant evolution. Today,
serverless computing is a fundamental part of modern cloud architecture, driving innovation in AI,
machine learning, IoT, and edge computing. With advancements in multi-cloud, hybrid serverless, and
AI-driven automation, serverless computing is set to redefine the future of cloud technology.


CHAPTER 4

ARCHITECTURE OF SERVERLESS COMPUTING

The architecture of serverless computing is designed to eliminate the need for developers to manage
infrastructure while ensuring scalability, high availability, and event-driven execution. It consists of
multiple components that work together to provide a fully managed computing environment.

Fig 4.1: Serverless Architecture

• Core Components of Serverless Architecture: Serverless computing relies on several key


components to deliver a fully managed and event-driven environment.

o Function as a Service (FaaS): The core of serverless computing, where code runs in
response to events without managing servers. Developers write functions that execute
automatically when triggered. Examples include AWS Lambda, Google Cloud Functions,
and Azure Functions, used for API processing, automation, and real-time data
transformations.

o Backend as a Service (BaaS): Provides managed backend services like authentication,


databases, storage, and messaging, eliminating the need for server-side management.
Examples include Firebase Authentication (auth), AWS S3 (storage), and Firebase
Firestore (database), commonly used in web and mobile app development.


o Event-Driven Execution Model: Functions are triggered by events such as HTTP requests
(AWS API Gateway), database changes (AWS DynamoDB Streams), file uploads (AWS
S3), and scheduled jobs (Google Cloud Scheduler). This enables real-time data
processing, automation, and task scheduling.

o Serverless Databases and Storage: Automatically scalable databases that eliminate the
need for manual provisioning. Examples include AWS DynamoDB, Firebase Firestore,
and AWS Aurora Serverless, ideal for applications requiring dynamic database
workloads.

o API Gateway: A managed API service that routes requests to serverless functions,
handling authentication, rate limiting, and monitoring. Examples include AWS API
Gateway, Google Cloud Endpoints, and Azure API Management, commonly used for
building RESTful and GraphQL APIs.
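To make the FaaS model concrete, the sketch below shows a minimal AWS Lambda-style handler in Python that responds to an API Gateway HTTP event. The event shape mimics, but simplifies, the real API Gateway payload; field names here are illustrative, not a definitive implementation.

```python
import json

def lambda_handler(event, context):
    """Minimal FaaS handler: invoked by the platform when an HTTP event fires.

    The platform passes the triggering event and a runtime context object;
    the function returns an HTTP-style response dict.
    """
    # Extract a query parameter from the (illustrative) event payload.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# The handler can be invoked locally for testing, without any cloud account:
response = lambda_handler({"queryStringParameters": {"name": "cloud"}}, None)
```

Because the handler is a plain function of its event, it can be unit-tested locally and deployed unchanged to any FaaS platform that uses this calling convention.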

• Serverless Execution Flow: Serverless computing follows an event-driven execution flow. A user action (e.g., an API request or file upload) generates an event at an event source (e.g., API Gateway or a database change stream). This event invokes a serverless function to process the request, and the cloud provider automatically scales the function as needed. If required, the function interacts with serverless databases or storage (e.g., AWS DynamoDB, Google Cloud Storage). Finally, a response is sent back to the user or another service, ensuring seamless and efficient execution.
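The execution flow above can be sketched as a toy in-memory event router: a trigger type is mapped to a function, an event source publishes an event, and the router invokes whichever function is registered for that trigger. All names here are illustrative; this is a simplified model of what the cloud provider does, not a real cloud API.

```python
# Toy model of the serverless execution flow:
# event source -> router -> function -> response.
handlers = {}

def on_event(event_type):
    """Register a function as the handler for one event type (the 'trigger')."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

def publish(event_type, payload):
    """Simulate the provider routing an incoming event to its mapped function."""
    fn = handlers.get(event_type)
    if fn is None:
        return {"status": "dropped", "reason": f"no handler for {event_type}"}
    return {"status": "ok", "result": fn(payload)}

@on_event("s3:ObjectCreated")
def process_upload(payload):
    # The function body is where business logic runs, on demand only.
    return f"processed {payload['key']}"

result = publish("s3:ObjectCreated", {"key": "photos/cat.png"})
```

The key property this models is that no function code runs until an event arrives, and unmatched events never consume compute.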

• Serverless Computing Deployment Models:

o Fully Managed Serverless: Cloud providers handle execution, scaling, and infrastructure,
offering zero server management and high availability. Examples include AWS Lambda
and Google Cloud Functions. Challenges include cold starts and vendor lock-in.

o Serverless Containers: Enables running containerized applications without server management using services like AWS Fargate and Google Cloud Run. It offers flexibility but requires more configuration than traditional FaaS.

o Hybrid Serverless: Integrates serverless with on-premises or cloud environments using tools like Knative and OpenFaaS. It prevents vendor lock-in but demands Kubernetes expertise.


CHAPTER 5
WORKING MECHANISM

Serverless computing operates on an event-driven model, where code is executed only in response to
specific triggers, eliminating the need for manual infrastructure management. This paradigm optimizes
resource utilization and cost efficiency by allocating compute power only when required. The typical
workflow of a serverless function execution involves several key stages:
• User Interaction: The process begins when a user performs an action that generates an event.
This could be making an API request, uploading a file to a cloud storage service, modifying a
database record, or triggering a scheduled job. These interactions serve as catalysts for function
execution.
• Event Trigger: The user's action generates an event that is captured by an event source. Common
event sources include API Gateway (for HTTP requests), cloud storage (for file uploads),
message queues (for asynchronous task processing), databases (for change data capture), and
scheduled triggers (for periodic execution). This event acts as the signal for the cloud provider to
initiate the function execution.
• Function Invocation: Once the cloud provider detects the event, it locates the appropriate
function mapped to the trigger and provisions the required execution environment. Unlike
traditional applications, where servers continuously run, serverless functions are invoked on
demand, reducing idle resource consumption.
• Automatic Scaling: Serverless computing inherently supports automatic scaling, meaning the
cloud provider dynamically adjusts the number of function instances based on workload demand. If
multiple events occur simultaneously, multiple instances of the function are spawned in parallel,
ensuring high availability and responsiveness. This elasticity prevents resource over-provisioning
and under-utilization.
• Database/Storage Interaction: If the function requires data access, it interacts with a serverless
database or storage system to fetch or store information. Popular choices include AWS
DynamoDB, Google Firestore, or Azure Cosmos DB for databases, and Amazon S3, Google Cloud
Storage, or Azure Blob Storage for file storage. These managed services further enhance scalability
and reduce administrative overhead.
• Response Processing: After processing the event, the function generates a response. This response
may be sent back to the user, logged for analytics, or trigger another event for further processing. In
cases where the function is part of a larger workflow, it may pass data to another microservice,
message queue, or API.

• Billing & Execution Termination: Once execution completes, the function shuts down
automatically, and the user is billed only for the exact compute time consumed. Unlike traditional
hosting models where resources must be provisioned in advance, serverless computing follows a
pay-per-use model, ensuring cost-effectiveness.
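The pay-per-use billing described above is typically computed from memory-time (GB-seconds) plus a small per-request fee. The sketch below uses illustrative default prices resembling published AWS Lambda rates; actual rates vary by provider, region, and free-tier allowances.

```python
def monthly_cost(invocations, avg_ms, memory_mb,
                 price_per_gb_s=0.0000166667, price_per_request=0.0000002):
    """Estimate monthly serverless cost (illustrative prices, no free tier).

    Billing = (GB-seconds consumed) * per-GB-second price
            + invocations * per-request price.
    """
    gb_seconds = invocations * (avg_ms / 1000.0) * (memory_mb / 1024.0)
    return gb_seconds * price_per_gb_s + invocations * price_per_request

# Example: one million 200 ms invocations per month at 512 MB of memory.
cost = monthly_cost(1_000_000, avg_ms=200, memory_mb=512)
```

For this workload the estimate comes to roughly $1.87 per month, which illustrates why sporadic, short-lived workloads are far cheaper serverless than on an always-on server.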

Fig 5.1: Working Mechanism

Fig 5.1 illustrates the working mechanism of serverless computing as a step-by-step
process:

• Writing a Function – Developers begin by writing a small, modular function that is designed to
perform a specific task. This function is typically stateless and optimized for rapid execution. It
can be written in various programming languages, including Python, Node.js, Java, Go, and C#,
depending on the cloud provider’s supported environments.

• Defining an Event – Developers define the conditions or triggers that will invoke the
function, such as an HTTP request, database update, or file upload.

• Triggering an Event – When the predefined event occurs, it acts as a signal to invoke the
function. The cloud provider listens for these triggers and automatically spins up the
necessary computing resources to execute the function. This event-driven approach ensures
that functions run only when needed, avoiding unnecessary resource usage.

• Deploying a Function – Unlike traditional deployment, where servers must be pre-provisioned
and configured, serverless platforms handle deployment automatically. The function is
packaged and stored in the cloud, ready for execution whenever triggered.


• Executing the Function – The function runs within a fully managed runtime environment. It
processes input parameters, performs computations, interacts with other cloud services if
needed, and generates an output. Serverless platforms ensure that functions execute quickly and
efficiently, although cold start latency may occur if a function is invoked after a period of
inactivity.

• Passing the Result – Once the function execution is complete, the result is processed and sent
back to the user or another system for further processing.
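The first steps above (write a function, define its trigger, let the event invoke it) can be illustrated with a handler for a storage-upload event. The record layout below mimics, but greatly simplifies, the AWS S3 event notification format; the function name and processing logic are hypothetical.

```python
def handle_upload(event, context=None):
    """Invoked when a file-upload event fires; returns the objects it processed."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Real business logic would run here, e.g. generating a thumbnail
        # or indexing the uploaded file.
        processed.append(f"{bucket}/{key}")
    return {"processed": processed}

# Simulated upload event, in a shape resembling an S3 notification:
event = {"Records": [
    {"s3": {"bucket": {"name": "uploads"}, "object": {"key": "report.pdf"}}}
]}
result = handle_upload(event)
```

Iterating over `Records` matters because a single trigger delivery may batch several upload events into one invocation.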
By following this workflow, serverless computing delivers high efficiency, scalability, and cost
optimization. The model is widely used for API-driven applications, data processing pipelines, IoT event
handling, and other event-based architectures, since it removes infrastructure concerns, enables
event-driven workflows, and optimizes resource utilization. Organizations can thus rapidly develop,
deploy, and scale applications while focusing on business logic rather than server management.


CHAPTER 6

TRADITIONAL COMPUTING VS. SERVERLESS COMPUTING

Fig 6.1: Traditional Computing vs. Serverless Computing

Traditional computing follows a server-centric model in which organizations own or rent physical or
virtual servers, provision resources, configure infrastructure, and handle software updates, security
patches, and scaling manually. Because capacity must be allocated in advance to handle peak traffic,
resources are often over-provisioned, resulting in wasted capacity and higher costs even during periods of
low usage. Businesses must also invest in dedicated IT teams to monitor and maintain the servers, ensure
uptime, and manage security and compliance, which makes infrastructure management complex. Scaling
such infrastructure is time-consuming and requires manual intervention, so it adapts poorly to sudden
spikes in demand. However, traditional computing provides full control over hardware, networking, and
security, making it well suited to industries with strict regulatory requirements, such as banking and
healthcare.

In contrast, serverless computing eliminates server management entirely: cloud providers such as AWS
Lambda, Google Cloud Functions, and Azure Functions handle infrastructure, scaling, and maintenance
automatically. Resources are allocated dynamically based on real-time demand, so applications scale up
or down seamlessly without human intervention. Because users are billed only for the compute time
actually consumed rather than for idle server capacity, this pay-per-use model significantly reduces
operational costs. Serverless architectures let businesses focus on application development rather than
infrastructure management, improving agility, cost efficiency, and scalability, and accelerating
deployment, which makes them well suited to microservices, APIs, and event-driven applications.
However, serverless computing also brings challenges such as cold start latency, limited execution time
per function, and potential vendor lock-in, since applications become tightly coupled to specific cloud
services.

Serverless computing thus simplifies application deployment by eliminating server management,
reducing costs, and enabling automatic scaling, which makes it the preferred choice for many modern
applications. The choice between the two models, however, depends on the application’s needs.
Traditional computing is best suited to workloads requiring consistent uptime, predictable performance,
and full infrastructure control, whereas serverless computing excels where scalability, cost efficiency,
and rapid development matter most. For example, a banking application handling real-time transactions
may benefit from the stability and control of traditional infrastructure, whereas an e-commerce platform
with fluctuating traffic gains from serverless computing’s ability to scale dynamically. Businesses must
therefore weigh factors such as performance, budget, security, and compliance requirements to determine
the most suitable computing model.


CHAPTER 7

ADVANTAGES OF SERVERLESS COMPUTING

Serverless computing offers numerous benefits, from automatic scaling and cost efficiency to improved
security and seamless cloud integration. By eliminating the need for server management, businesses can
focus on innovation, reduce operational overhead, and enhance agility. As more organizations adopt
serverless architectures, the future of cloud computing is expected to be increasingly event-driven and
function-based.
• No Server Management: Serverless computing eliminates the need for developers to manage
physical or virtual servers. Traditionally, organizations had to set up and maintain servers, handle
software updates, security patches, and ensure system availability. With serverless computing,
cloud providers such as AWS, Microsoft Azure, and Google Cloud handle all infrastructure
management, allowing developers to focus solely on application development.
• Automatic Scaling: One of the biggest advantages of serverless computing is its ability to scale
automatically. Unlike traditional architectures where businesses must pre-allocate server capacity,
serverless functions scale dynamically based on real-time demand. If traffic increases, the cloud
provider automatically provisions more function instances. When demand decreases, it scales
down, ensuring optimal performance without over-provisioning resources.
• Cost Efficiency: With traditional computing models, organizations must pay for servers even when
they are not fully utilized. Serverless computing follows a pay-as-you-go model, where users are
charged only for the exact execution time of their functions. Since there are no costs for idle
resources, businesses can significantly reduce their cloud expenses. This cost-efficient pricing
model makes serverless ideal for startups and enterprises looking to optimize IT budgets.
• Faster Development & Deployment: By removing the burden of infrastructure management,
serverless computing accelerates software development. Developers can deploy applications
quickly by writing code and deploying it directly without configuring backend servers. Continuous
integration and continuous deployment (CI/CD) pipelines can be easily integrated with serverless
environments, further speeding up the development lifecycle.
• High Availability & Fault Tolerance: Cloud providers ensure high availability and fault tolerance
by distributing serverless functions across multiple data centres. If one function instance fails,
another is automatically provisioned, maintaining service continuity. This redundancy ensures that
applications remain available even in the event of a hardware failure.

• Event-Driven Execution: Serverless computing operates on an event-driven model, meaning
functions execute only when a specific event triggers them. Common triggers include HTTP
requests (via API Gateway), file uploads to cloud storage, database modifications, or scheduled
tasks. This design reduces unnecessary processing and optimizes resource consumption.
• Better Resource Utilization: Unlike traditional computing models where resources must be pre-
allocated to handle peak loads, serverless computing optimizes resource utilization. Functions are
executed on-demand and terminated after completion, ensuring that cloud resources are efficiently
allocated based on actual usage rather than estimations.
• Security & Compliance: Cloud providers handle security measures such as encryption,
authentication, and compliance with industry regulations (e.g., GDPR, HIPAA). Since serverless
functions are stateless and execute in isolated containers, security risks associated with long-
running servers are minimized.
• Seamless Integration with Cloud Services: Serverless functions integrate effortlessly with various
cloud services, including databases, storage, and messaging systems. For example, AWS Lambda
can work with Amazon DynamoDB (database), Amazon S3 (storage), and Amazon SNS
(messaging). This allows developers to build complex workflows without having to manage
additional backend infrastructure.
• Support for Multiple Programming Languages: Most serverless platforms support a wide range
of programming languages, including Python, JavaScript (Node.js), Go, Java, C#, and more. This
flexibility allows developers to write functions in their preferred language, making serverless
computing accessible to a broader audience.


CHAPTER 8

DISADVANTAGES OF SERVERLESS COMPUTING

While serverless computing offers scalability, cost efficiency, and ease of deployment, it also comes with
trade-offs such as cold starts, vendor lock-in, and debugging complexities. It is most suitable for event-
driven applications, short-lived workloads, and applications that require automatic scaling. However, for
high-performance computing, stateful applications, and long-running processes, alternative solutions such
as containers, virtual machines, or hybrid architectures may be more appropriate.
• Cold Start Latency: Serverless functions are stateless and execute on demand. When a function is
triggered after a period of inactivity, the cloud provider needs to initialize a new instance before
execution. This initialization time, known as a cold start, causes latency, which can impact
performance. Cold starts are especially problematic for applications requiring real-time responses,
such as financial transactions or interactive user interfaces.
• Vendor Lock-in: Serverless computing relies on proprietary cloud provider services (AWS
Lambda, Azure Functions, Google Cloud Functions). Applications built using one provider’s
ecosystem may require significant modifications to migrate to another platform.
• Limited Execution Time: Most serverless platforms impose execution time limits on functions. If
a function takes longer than the provider’s limit, it gets automatically terminated.
• Higher Costs for Long-running Workloads: Although serverless computing follows a pay-as-
you-go model, it may become expensive for applications that run continuously for extended
periods. Serverless is cost-efficient for short, event-driven tasks but can be more expensive than
dedicated servers for sustained workloads.
• Security Concerns: Serverless applications face unique security risks due to their event-driven
nature and multi-tenant execution environments.
• Limited Control Over Environment: Since the cloud provider manages the execution
environment, developers have limited control over configuration settings, runtime versions, and
system updates. This can lead to compatibility issues.


CHAPTER 9

USE CASES OF SERVERLESS COMPUTING

Serverless computing has transformed how applications are built and deployed by eliminating the need for
infrastructure management. It is widely adopted across industries for its scalability, cost-effectiveness, and
event-driven execution. Various use cases demonstrate how businesses leverage serverless architectures to
optimize performance, automate workflows, and handle real-time data processing.
• Web and Mobile Applications: Serverless computing is ideal for web and mobile apps that require
scalability. It allows developers to run backend logic without managing servers, handling
authentication, database interactions, and business logic efficiently.
Example: Netflix and Airbnb use serverless computing for handling user requests dynamically.
• API Backend Development: Serverless architectures are widely used for developing RESTful and
GraphQL APIs, where functions execute only when an API request is made.
Example: AWS Lambda with API Gateway enables a fully serverless API backend without
provisioning or managing infrastructure.
• IoT (Internet of Things) Applications: IoT devices generate vast amounts of data that need real-
time processing. Serverless functions can handle event-driven IoT tasks, processing sensor data
and sending notifications.
Example: Smart home devices use AWS Lambda to trigger actions based on user input or sensor
data.
• Machine Learning Model Inference: Serverless computing can run AI models for tasks like image
recognition, sentiment analysis, and recommendation systems, executing only when needed.
Example: A serverless function processes an uploaded image and returns detected objects using a
machine learning model.
• Microservices Architecture: Serverless computing is an excellent fit for microservices, where
individual functions handle different parts of an application.
Example: A shopping website can have separate serverless functions for checkout, payment
processing, and order tracking.
• Chatbots and Virtual Assistants: Serverless platforms provide scalable backend logic for chatbots
and voice assistants, processing user queries dynamically.
Example: Amazon Alexa skills and Slack bots use serverless functions to process conversations and
provide responses.


CHAPTER 10

REAL TIME EXAMPLES OF SERVERLESS COMPUTING

Serverless computing enables applications to execute tasks instantly without managing servers, making it
ideal for real-time processing. Cloud platforms like AWS Lambda, Google Cloud Functions, and Azure
Functions handle event-driven workloads efficiently, ensuring scalability and cost-effectiveness.
• Netflix – Video Processing and Automation: Netflix uses AWS Lambda to automate video
encoding workflows, ensuring that new content is processed efficiently. When a video file is
uploaded, a serverless function automatically triggers encoding, optimizing it for multiple devices.
This helps scale video processing without managing infrastructure.
• Airbnb – Real-Time Notifications: Airbnb uses serverless functions to send real-time notifications
to users. For instance, when a booking is confirmed or a host responds, an event-driven function
triggers an instant notification via email or mobile push notifications, ensuring seamless user
communication.
• Coca-Cola – Smart Vending Machines: Coca-Cola uses serverless computing to manage
transactions in its vending machines. When a customer makes a purchase using a mobile payment
app, a serverless function processes the transaction and updates inventory data in real time. This
eliminates the need for maintaining dedicated servers for vending machine operations.
• Capital One – Secure Banking APIs: Capital One leverages AWS Lambda to manage secure
banking transactions through APIs. Whenever a customer accesses their bank account or performs a
transaction, serverless functions handle authentication, fraud detection, and data retrieval, ensuring
high availability and security.
• Spotify – Personalized Playlists: Spotify uses serverless architecture to generate personalized
playlists based on user preferences. When a user listens to songs, an event triggers a function that
analyses listening history and updates recommendations dynamically, offering a seamless music
experience.
These examples highlight how serverless computing enhances scalability, efficiency, and cost-effectiveness
across different industries.


CHAPTER 11

FUTURE TRENDS IN SERVERLESS COMPUTING

The future of serverless computing is intelligent, secure, and highly scalable. AI-driven auto-scaling, hybrid
cloud adoption, edge computing, enhanced security, and better debugging tools will drive its evolution.
These advancements will make serverless computing more efficient, flexible, and suitable for a broader
range of applications, from real-time IoT processing to enterprise-scale automation.
• AI-Driven Auto-Scaling: Traditional serverless scaling relies on predefined rules, but the future of
serverless computing will involve AI and machine learning (ML)-driven auto-scaling mechanisms.
These intelligent systems will analyse past usage patterns, predict traffic surges, and proactively
allocate resources. This will reduce latency and prevent cold start issues, making applications more
responsive and cost-efficient.
• Hybrid and Multi-Cloud Adoption: As organizations move toward multi-cloud and hybrid-cloud
strategies, serverless computing is expected to evolve to support cross-cloud portability. Businesses
will deploy serverless functions across multiple cloud providers (AWS, Azure, Google Cloud) or
even on-premises environments to avoid vendor lock-in, improve redundancy, and optimize costs.
• Edge Computing Integration: Edge computing and serverless computing will become more
tightly integrated, allowing functions to run closer to end users rather than relying on centralized
cloud data centres. This reduces latency and bandwidth usage, making serverless ideal for real-time
applications like IoT, AR/VR, and autonomous systems.
• Security Enhancements: As serverless computing expands, security concerns such as function
isolation, data privacy, and access control become increasingly important. Future advancements
will focus on stronger runtime security to prevent unauthorized access, ensuring that serverless
functions operate within a secure execution environment. Additionally, more granular permissions
will be implemented to regulate how functions interact with each other, minimizing the risk of
unintended data exposure. Improved API security measures will also be developed to protect
against injection attacks and unauthorized API calls, enhancing overall system integrity and
reducing vulnerabilities in serverless applications.
• Better Tooling and Debugging: Future advancements will enhance monitoring, logging, and
debugging tools, providing deeper insights into function execution. Improved solutions like AI-
driven anomaly detection, distributed tracing (e.g., AWS X-Ray, OpenTelemetry), and real-time
monitoring dashboards will help developers track errors, optimize performance, and improve
system reliability.


CONCLUSION

Serverless computing has transformed the way applications are developed, deployed, and managed by
shifting the responsibility of infrastructure management to cloud providers. This approach allows
developers to focus on writing code without worrying about provisioning, scaling, or maintaining servers.
By leveraging automatic scaling, serverless architectures efficiently handle variable workloads, ensuring
optimal resource utilization and cost savings.
Despite its advantages, serverless computing comes with challenges such as cold start latency, vendor
lock-in, and limitations in debugging and monitoring. Organizations must carefully assess these factors
before adopting a serverless model, particularly for applications requiring low-latency performance or
long-running processes.
As technology advances, future trends like AI-driven auto-scaling, hybrid and multi-cloud adoption,
enhanced security mechanisms, and improved observability will further refine serverless computing. With
its growing capabilities and widespread adoption, serverless computing is poised to remain a crucial
component in modern cloud-native application development, enabling businesses to innovate faster and
operate more efficiently.



