Caching

Caching is the process of storing copies of data in a temporary storage area to enable faster access. There are various caching strategies like cache-aside, read-through, write-through, write-back, and write-around. Popular caching mechanisms include content delivery network caching, page caching, object caching, web browser caching, session caching, and database caching. Common caching systems used are Redis and Memcached. Caching can improve performance in use cases involving database queries, transaction logging, interactive user interfaces, and external API requests.


CACHING

Caching is the process of storing copies of data in a temporary storage location so that they can be accessed more quickly.

Common caching strategies are cache-aside, read-through, write-through, write-back, and write-around.
CACHING MECHANISMS

01 Content Delivery Network (CDN) Caching: CDNs cache static content like images, scripts, and stylesheets on servers distributed globally. This reduces latency by serving content from servers closer to the user.

02 Page Caching: Entire HTML pages or fragments can be cached to avoid regenerating them on each request. This is often used in Content Management Systems (CMS).

03 Object Caching: Object caching involves storing the results of expensive function calls or complex calculations in memory. This is common in programming languages like Python and PHP.

04 Web Browser Caching: Web browsers cache resources such as images, stylesheets, and scripts locally. This prevents the need to re-download the same resources when a user revisits a website, improving page load times.

05 Session Caching: Session data can be cached to reduce the load on the server and improve response times. However, caution is needed to ensure data consistency and security.

06 Database Caching: Database caching involves storing frequently accessed query results or data in memory to reduce the need for repeated database queries. Memcached and Redis are popular in-memory caching systems.
CACHING STRATEGIES

01 Cache-Aside: The application first checks for the data in the cache. If the data is available (cache hit), it is returned directly to the application; if not (cache miss), the application retrieves it from the database and stores it in the cache.

02 Write-Through: Here the cache is responsible for handling the data. When the application wants to write data, it writes to the cache first, and at the same time the cache updates the data in the database.
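A minimal sketch of a cache-aside read and a write-through write, using plain Python dicts to stand in for the cache and the database (all names here are illustrative, not a real cache API):

```python
cache = {}                       # stands in for an in-memory cache such as Redis
database = {"user:1": "Alice"}   # stands in for the primary data store

def cache_aside_read(key):
    """Cache-Aside: the application checks the cache first; on a miss it
    loads from the database and populates the cache itself."""
    if key in cache:
        return cache[key]        # cache hit
    value = database.get(key)    # cache miss: go to the database
    cache[key] = value           # populate the cache for next time
    return value

def write_through(key, value):
    """Write-Through: every write goes to the cache and the database
    together, so the two stay consistent."""
    cache[key] = value
    database[key] = value

print(cache_aside_read("user:1"))  # miss on the first call, hit afterwards
write_through("user:2", "Bob")     # both stores now hold user:2
```

The key design difference: in cache-aside the application owns the miss-handling logic, while in write-through the cache layer keeps the database in sync on every write.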

03 Write-Back / Write-Behind: Similar to write-through, but with one main difference: data written to the cache is asynchronously updated in the main database. In other words, the application first writes data to the cache, and the cache writes it to the database after some delay.

04 Read-Through: The cache system manages data retrieval from the database on behalf of the application. The application interacts only with the cache, which acts as an intermediary between the application and the database.
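Write-back can be sketched with a set of "dirty" keys whose database writes are deferred (dicts stand in for the cache and database; in a real system the flush would run asynchronously in the background, not be called by hand):

```python
cache = {}
database = {}
dirty_keys = set()  # keys written to the cache but not yet flushed

def write_back(key, value):
    """Write-Back: write only to the cache and mark the key dirty;
    the database is updated later."""
    cache[key] = value
    dirty_keys.add(key)

def flush_to_database():
    """Push dirty entries to the database (the 'after some delay' step)."""
    for key in list(dirty_keys):
        database[key] = cache[key]
        dirty_keys.discard(key)

write_back("score:1", 100)
print("score:1" in database)  # False: the database has not seen the write yet
flush_to_database()
print(database["score:1"])    # 100: now flushed
```

The trade-off is visible in the code: writes acknowledge immediately, but anything in dirty_keys is lost if the cache dies before a flush.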
USE CASES OF CACHING STRATEGIES
Cache-Aside: Database Queries
• Scenario: In applications that heavily rely on database queries, Cache-Aside can be used to cache the results of frequently
executed queries.
• Use Case: Store the result sets of read-heavy database queries in the cache. Subsequent requests can check the cache first
before hitting the database, reducing the load on the database and improving response times.

Write-Through: Transaction Logging


• Scenario: Financial applications or systems where every transaction must be recorded accurately.
• Use Case: Write-Through caching can be employed to log every transaction both to the cache and the database. This
ensures that the system maintains an accurate and complete record of transactions, even in the event of a cache eviction or
failure.

Write-Back: Interactive User Interfaces


• Scenario: Applications with interactive user interfaces that require quick response times.
• Use Case: Write-Back caching can be beneficial in applications where users expect immediate feedback, such as social
media platforms or collaboration tools. Acknowledging writes quickly allows the application to maintain a responsive
user experience.

Read-Through: External API Requests


• Scenario: Applications that make requests to external APIs to retrieve data.
• Use Case: Read-Through caching can be used to cache the responses from external APIs. If the requested data is present
in the cache, it is returned immediately. If not, the application fetches the data from the external API, stores it in the cache,
and then returns the data to the application.
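The external-API use case can be sketched as a small read-through cache that owns the fetching logic; the fetch function below is a hypothetical stand-in for a real HTTP request:

```python
class ReadThroughCache:
    """Read-Through: callers only talk to the cache; on a miss the cache
    itself fetches the data via the supplied loader and stores it."""
    def __init__(self, fetch_fn):
        self._fetch = fetch_fn
        self._store = {}

    def get(self, key):
        if key not in self._store:               # cache miss
            self._store[key] = self._fetch(key)  # cache fetches on the app's behalf
        return self._store[key]

api_calls = []  # records how often the "external API" is actually hit

def fake_external_api(key):
    """Stands in for a real HTTP request to an external API."""
    api_calls.append(key)
    return {"id": key, "data": f"payload-{key}"}

api_cache = ReadThroughCache(fake_external_api)
api_cache.get("42")
api_cache.get("42")
print(len(api_calls))  # the external API was called only once
```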

CONTENT DELIVERY NETWORK (CDN)

01 A content delivery network (CDN) is a geographically distributed group of servers that caches content close to end users. These servers are also known as edge servers.

02 Popular CDN providers include Amazon CloudFront, Fastly, Google Cloud CDN, and Azure CDN.

03 Azure CDN has a network of POPs (Points of Presence) strategically located around the world. These POPs are responsible for caching and delivering content. Azure Content Delivery Network (CDN) reduces load times, saves bandwidth, and improves responsiveness.

04 Amazon CloudFront also operates using a network of edge locations, which are distributed globally; requests are automatically routed to the nearest edge location. Amazon CloudFront is a fast content delivery network (CDN) service that securely delivers data, videos, applications, and APIs to customers globally with low latency and high transfer speeds.

COOKIES, LOCAL STORAGE, SESSION STORAGE

REDIS AND MEMCACHED


REDIS AND MEMCACHED ARE BOTH POPULAR, OPEN-SOURCE, IN-MEMORY KEY-VALUE STORES USED FOR CACHING AND DATA STORAGE

DATA STRUCTURES:
• Redis supports a variety of data structures, including strings, hashes, lists, sets, and more. This makes it versatile for a wide range of use cases.
• Memcached primarily deals with plain key-value pairs without support for complex data structures. It is optimized for simplicity and speed.

PERSISTENCE:
• Redis offers options for data persistence, allowing you to persist data to disk. This makes Redis suitable for use as both a cache and a primary data store.
• Memcached does not provide built-in support for data persistence, meaning data is stored only in memory and may be lost upon restart.

USE CASES:
• Redis is commonly used for caching, real-time analytics, messaging systems, leaderboards, and more.
• Memcached is widely used for caching, particularly in scenarios where a simple, fast caching layer is needed to relieve load on databases.
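Both systems expose a simple get/set interface with per-key expiry. The toy in-memory store below mimics that interface (a sketch in plain Python, not the real Redis or Memcached API; real clients talk to a separate server process over the network):

```python
import time

class TTLCache:
    """Toy key-value store with per-key expiry, in the spirit of the
    get/set-with-TTL interface Redis and Memcached provide (illustrative)."""
    def __init__(self):
        self._data = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        self._data[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:  # expired: drop the entry, report a miss
            del self._data[key]
            return None
        return value

ttl_cache = TTLCache()
ttl_cache.set("session:1", "alice", ttl_seconds=0.05)
print(ttl_cache.get("session:1"))  # returned while still fresh
time.sleep(0.1)
print(ttl_cache.get("session:1"))  # None once the TTL has elapsed
```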
HOTSTAR CASE STUDY

01 Content Delivery Network (CDN):


• Hotstar likely utilizes a Content Delivery Network to distribute content across various
geographic locations.
• CDNs use caching to store copies of content in servers strategically placed around the world.

02 Edge Caching:
• Edge caching involves placing caching servers (caches) closer to the end-users at the edge of
the network. This helps in minimizing the distance data needs to travel, improving response
times.

03 Segmented Content Delivery:


• Video content is often divided into smaller segments (chunks).
• Hotstar may cache these video segments at edge servers, allowing for faster delivery to users
by serving segments from the nearest cache.

05 HTTP Caching Headers:


• By using proper HTTP caching headers, the server can instruct clients and intermediate
caches on how to handle the caching of responses. This includes setting expiration times,
validation mechanisms, and cache-control directives.
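As a sketch of how a client or intermediate cache applies such headers, the snippet below parses a Cache-Control max-age directive and decides whether a stored response is still fresh (simplified: real HTTP caching also considers validators like ETag and Last-Modified and many more directives):

```python
def parse_max_age(cache_control: str):
    """Return the max-age value in seconds from a Cache-Control header,
    or None if no max-age directive is present."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            return int(directive.split("=", 1)[1])
    return None

def is_fresh(cache_control: str, age_seconds: float) -> bool:
    """A cached response is fresh if its age is below max-age; no-store
    and no-cache responses are treated as never fresh here (simplified)."""
    if "no-store" in cache_control or "no-cache" in cache_control:
        return False
    max_age = parse_max_age(cache_control)
    return max_age is not None and age_seconds < max_age

print(is_fresh("public, max-age=3600", 120))   # True: well within an hour
print(is_fresh("public, max-age=3600", 7200))  # False: older than max-age
print(is_fresh("no-store", 0))                 # False: must not be cached
```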

06 Dynamic Content Caching:


• For dynamic content, where personalized or frequently changing data is involved, caching
strategies may include techniques like fragment caching, where only specific parts of a page
are cached, or caching based on user sessions.

07 Adaptive Bitrate Streaming (ABR):


• Video streaming services often use adaptive bitrate streaming, where the video is delivered
in multiple quality levels (bitrates). This allows the client to adapt to varying network
conditions. The client may cache certain video segments to ensure smooth playback,
especially in scenarios where network conditions fluctuate.

THANK YOU
