SDA Assignment 01
Background:
Like many other buzzwords in the industry, it is difficult to reliably trace the origin of the
word serverless. What matters more is appreciating the benefits it can bring to our applications.
Serverless architecture may sound peculiar at first, but it makes sense once we understand it
better. Typically, when we develop an application, we need servers to host and serve it. For
instance, a Java application bundled as a WAR file requires an application container such as
Tomcat, running on a host such as a Linux machine, possibly with virtualization. On top of that,
there are further considerations such as provisioning the infrastructure for high availability and
fault tolerance.
This means that even before we can serve the first request, we have to go through a lot of
preparation. And it does not stop there: we have to keep managing this infrastructure to keep it
serviceable. What if we did not have to bother with all these tasks that are not strictly related to
application development?
This is the fundamental premise of serverless. Serverless architecture is a software design
pattern in which we host our applications on a third-party service, eliminating the need to
manage the hardware and software layers required to run them. We do not even have to worry
about scaling the infrastructure to match the load.
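To make this concrete, here is a minimal sketch of what the application developer is left to write: a single function that the platform invokes per request. It is modeled on the AWS Lambda Python handler signature (event, context), but the event shape and the local call at the end are illustrative assumptions, not part of any real API contract.

```python
# A minimal serverless function: the platform owns the server, the OS,
# scaling and routing; we only supply this handler.

def handler(event, context):
    # The provider passes request data in "event"; we return a response.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Called locally here only to illustrate the call shape; in production the
# cloud provider constructs the event and invokes the function for us.
result = handler({"name": "serverless"}, None)
print(result["body"])
```

Everything outside this function (provisioning, patching, load balancing) is the provider's concern, which is exactly the premise described above.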
Cost efficiency
Serverless computing charges for the resources actually used rather than for pre-purchased
capacity. You do not pay for idle capacity or manage servers, and you avoid the wastage during
off-peak times that is typical of traditional server-based architectures. This pay-per-use model
results in cost savings for variable workloads.
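The pay-per-use point can be illustrated with a back-of-the-envelope calculation. All prices below are assumed, illustrative figures rather than real provider rates; the point is the shape of the comparison, not the exact numbers.

```python
# Pay-per-use vs. always-on: illustrative monthly cost comparison.
PRICE_PER_GB_SECOND = 0.0000166667   # assumed per-GB-second function price
SERVER_PRICE_PER_HOUR = 0.10         # assumed always-on VM price

def serverless_monthly_cost(invocations, avg_duration_s, memory_gb):
    # Billed only for the compute actually consumed.
    gb_seconds = invocations * avg_duration_s * memory_gb
    return gb_seconds * PRICE_PER_GB_SECOND

def server_monthly_cost(hours=730):
    # Billed for every hour, busy or idle.
    return SERVER_PRICE_PER_HOUR * hours

# A low-traffic workload: 100,000 requests/month, 200 ms each, 512 MB.
fn_cost = serverless_monthly_cost(100_000, 0.2, 0.5)
vm_cost = server_monthly_cost()
print(f"serverless: ${fn_cost:.2f}/month, always-on: ${vm_cost:.2f}/month")
```

For this variable, low-traffic workload the pay-per-use bill is a small fraction of the always-on one; at sustained high traffic the comparison can flip, which is why the benefit is tied to variable workloads.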
Operational efficiency
The serverless model simplifies the infrastructure management tasks, such as server
provisioning, patching and maintenance, which enables developers to focus on building
application features. It optimizes the workflow and makes deployment and updates faster
because the cloud provider handles server management complexities.
Scalability
The automatic scaling feature is advantageous for handling unpredictable or fluctuating traffic
patterns, as it ensures that the application remains responsive without manual intervention.
Additionally, serverless computing can dynamically adjust to sudden spikes in traffic, such as
during major events or sales, to ensure consistent performance.
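The scaling behaviour described above can be reasoned about with a simple rule of thumb (Little's law): the number of concurrently running function instances is roughly the request rate multiplied by the average duration per request. The traffic numbers below are illustrative assumptions.

```python
# Rough sizing sketch: concurrency ~= request rate x average duration.
# The platform provisions instances automatically; this only estimates
# how many it would run at a given load.

def estimated_concurrency(requests_per_second, avg_duration_s):
    return requests_per_second * avg_duration_s

# Normal traffic vs. a sudden spike during a major event or sale:
normal = estimated_concurrency(50, 0.3)     # 50 req/s, 300 ms each
spike = estimated_concurrency(2_000, 0.3)   # 40x traffic surge
print(normal, spike)
```

The jump from roughly 15 to 600 concurrent instances happens without any manual intervention, which is the scalability advantage the section describes.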
Simplified back-end code lets developers concentrate on their core product, often leading to
better quality and more innovative features. Serverless architectures are also microservices-
friendly: they make it easy to develop, deploy and manage small, independent, modular pieces of
code that complement microservices patterns.
Major cloud providers offer integrated services that work seamlessly with serverless computing,
including databases and machine learning capabilities. This enables the creation of feature-rich
applications.
Performance issues
When a function remains unused for a certain period, it enters a dormant state. The first request
after this period may then experience a delay in response time, referred to as a cold start,
because the platform needs to allocate resources and start the function from scratch. For
applications with critical response-time requirements, serverless computing may therefore not be
the best choice. Latency variability, caused mainly by cold starts and resource allocation, can
also be an issue.
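A common way to soften the cold-start penalty is to do expensive setup once at module load time, so that warm invocations reuse it. The sketch below simulates this with a hypothetical load_model function standing in for loading SDK clients, configuration, or models; the timings are illustrative.

```python
# Cold-start mitigation sketch: pay the setup cost once per container.
import time

def load_model():
    time.sleep(0.05)  # stand-in for slow setup (clients, config, models)
    return {"ready": True}

MODEL = load_model()  # runs once, on the cold start of this container

INVOCATIONS = 0

def handler(event, context):
    global INVOCATIONS
    INVOCATIONS += 1
    # Warm invocations skip load_model entirely and reuse MODEL.
    return {"invocation": INVOCATIONS, "model_ready": MODEL["ready"]}

first = handler({}, None)   # setup already done before this call
second = handler({}, None)  # warm: no setup cost at all
```

This reduces, but does not eliminate, cold starts: the first request to a fresh container still pays for module load, which is why latency-critical systems may still prefer provisioned capacity.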
Vendor lock-in
Serverless architectures often rely on the services and tools of a single cloud provider. This can
result in vendor lock-in, making it challenging and potentially expensive to migrate to a different
provider later. Additionally, many serverless platforms offer proprietary services that have no
direct equivalent on other platforms, which further complicates potential migration efforts.
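One common hedge against this lock-in is to keep the business logic provider-agnostic and confine provider specifics to thin adapters. The event shapes below are hypothetical stand-ins; each real provider defines its own.

```python
# Lock-in mitigation sketch: core logic has no cloud dependency, and each
# provider gets only a small, disposable adapter.

def shorten_url(long_url: str) -> str:
    # Core logic: no cloud SDKs, trivially portable between providers.
    return "https://sho.rt/" + str(abs(hash(long_url)) % 10_000)

def aws_style_handler(event, context):
    # Adapter: translates one provider's (assumed) event shape.
    url = event["queryStringParameters"]["url"]
    return {"statusCode": 200, "body": shorten_url(url)}

def gcp_style_handler(request):
    # A second adapter for a different provider's (assumed) request shape.
    return shorten_url(request["args"]["url"])
```

Migrating then means rewriting a few lines of adapter, not the application; proprietary managed services (databases, queues) remain the harder part of any migration.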
Limited control and flexibility
When using a serverless approach, you may have limited control over the underlying
infrastructure, including the operating system and hardware. This can become a problem if your
application requires specific environment configurations. Additionally, serverless platforms
often restrict runtime execution, for example through a maximum execution time per function,
and limit the available execution environments, such as the supported programming languages
and versions.
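The maximum-execution-time restriction usually has to be designed around rather than configured away. One standard pattern is to process work in chunks, stop before the deadline, and return the remainder for a follow-up invocation. The time limit and the work performed below are illustrative assumptions.

```python
# Working within a platform execution-time limit (value here is assumed
# and deliberately tiny; real limits are minutes, not fractions of a second).
import time

TIME_LIMIT_S = 0.2

def process_items(items, deadline_margin_s=0.05):
    start = time.monotonic()
    done = []
    remaining = list(items)
    while remaining:
        if time.monotonic() - start > TIME_LIMIT_S - deadline_margin_s:
            break  # return the rest instead of being killed mid-run
        done.append(remaining.pop(0) * 2)  # stand-in for real work
    return done, remaining

done, remaining = process_items([1, 2, 3])
```

A queue or step-function-style orchestrator would then re-invoke the function with the remaining items until the batch is finished.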
Security
Serverless applications can increase the risk of cyberattacks because each function is a potential
attack entry point. Additionally, a serverless application's security largely depends on the
security measures implemented by the cloud provider. While providers generally have extensive
security measures, under the shared responsibility model the application owner remains
responsible for securing the application code and data.
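Since every exposed function is an entry point, validating its input is part of the application owner's share of that responsibility. A minimal sketch follows; the event fields (body, user_id) are hypothetical.

```python
# Input validation at a function entry point: reject malformed requests
# before any business logic runs.

def handler(event, context):
    body = event.get("body")
    if not isinstance(body, dict) or "user_id" not in body:
        return {"statusCode": 400, "body": "invalid request"}
    if not isinstance(body["user_id"], int) or body["user_id"] <= 0:
        return {"statusCode": 400, "body": "invalid user_id"}
    # Only validated input reaches the application logic.
    return {"statusCode": 200, "body": f"hello user {body['user_id']}"}
```

With dozens of small functions, each needs this kind of guard, which is why the per-function attack surface is called out as a drawback.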
Monitoring and debugging
Monitoring and logging can pose challenges due to the distributed nature of serverless
functions. Debugging serverless applications can also be difficult, especially when attempting to
reproduce the exact conditions that led to an issue, given the stateless and ephemeral nature of
serverless functions.
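A common mitigation is structured logging with a correlation ID, so one request can be traced across many short-lived functions. The field names below are assumptions; real platforms typically capture anything written to stdout as log records.

```python
# Structured logging sketch: every log line carries the same correlation
# ID, so logs from separate function invocations can be stitched together.
import json
import uuid

def log(correlation_id, message, **fields):
    record = {"correlation_id": correlation_id, "message": message, **fields}
    print(json.dumps(record))  # platforms usually ship stdout to the log store
    return record

request_id = str(uuid.uuid4())
rec = log(request_id, "order received", function="validate", order_id=42)
log(request_id, "order stored", function="persist", order_id=42)
```

Filtering the log store on correlation_id then reconstructs the path of a single request through the distributed system, which is otherwise what makes serverless debugging hard.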
Diagram:
References:
https://www.baeldung.com
https://www.techtarget.com