
Parallel & Distributed Computing

By: M. Imran Siddiqui


Parallel Computing

Parallel Computing
Parallel computing refers to the process of executing an application or computation simultaneously on several processors.

It is a kind of computing architecture in which large problems are broken into independent, smaller, usually similar parts that can be processed at the same time.
Continue…
Parallel Computing
It is done by multiple CPUs communicating via shared memory, with the results combined upon completion.

It helps in performing large computations by dividing the large problem among more than one processor.
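A minimal sketch of this idea in Python, assuming the standard multiprocessing module (the summation task, chunk sizes, and partial_sum helper are illustrative choices, not from the slides; separate Python processes do not literally share memory, so this only illustrates dividing the work and combining the results):

```python
# Minimal sketch: divide a large summation among several worker processes
# and combine the partial results when they complete.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, end) -- one independent piece of the problem."""
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]

    with Pool(processes=workers) as pool:
        partials = pool.map(partial_sum, chunks)   # pieces run in parallel

    print(sum(partials))  # combine results upon completion
```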
• Parallel computing involves the simultaneous execution of multiple computations or processes to
solve a problem more efficiently.
• Weather Forecasting: Numerical weather prediction models often use parallel computing to simulate
and predict weather patterns by dividing the computational workload among multiple processors.
• Big Data Processing: Technologies like Apache Hadoop and Apache Spark utilize parallel processing
to analyze large datasets by breaking them into smaller chunks and processing them simultaneously
on multiple nodes.
• Image Rendering: Parallel computing is used in graphics processing units (GPUs) to render complex
scenes in video games or simulations by dividing the workload among numerous processing cores.
• Distributed Machine Learning: Training machine learning models on distributed systems involves
parallel processing across multiple nodes to handle large datasets efficiently.
• Parallel Databases: Systems like Google's Bigtable or Apache Cassandra distribute data across
multiple nodes, allowing for parallel processing of queries.
Distributed Computing
Distributed Computing
In distributed computing we have multiple autonomous computers which appear to the user as a single system.

In distributed systems there is no shared memory, and computers communicate with each other through message passing. In distributed computing, a single task is divided among different computers.
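A toy sketch of the message-passing style in Python, using a multiprocessing Queue on one machine as a stand-in for messages exchanged over a network between autonomous computers (the word-count task and the worker function are illustrative assumptions):

```python
# Toy sketch: workers keep no shared state; they report results only by
# sending messages to a coordinator, mimicking distributed message passing.
from multiprocessing import Process, Queue

def worker(worker_id, text_chunk, outbox):
    """Count words in one chunk and send the result back as a message."""
    count = len(text_chunk.split())
    outbox.put((worker_id, count))          # message passing, no shared memory

if __name__ == "__main__":
    chunks = ["parallel and distributed computing",
              "message passing between autonomous computers",
              "a single task divided among different machines"]
    outbox = Queue()
    procs = [Process(target=worker, args=(i, c, outbox))
             for i, c in enumerate(chunks)]
    for p in procs:
        p.start()

    total = sum(outbox.get()[1] for _ in procs)   # collect one message per worker
    for p in procs:
        p.join()
    print("total words:", total)
```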
DIFFERENCE…
Difference between Parallel & Distributed

Parallel Computing:
1. Many operations are performed simultaneously.
2. A single computer is required.
3. Multiple processors perform multiple operations.

Distributed Computing:
1. System components are located at different locations.
2. Uses multiple computers.
3. Multiple computers perform multiple operations.
DIFFERENCE…
Difference between Parallel & Distributed

Parallel Computing:
4. It may have shared or distributed memory.
5. Processors communicate with each other through a bus.
6. Improves the system performance.

Distributed Computing:
4. It has only distributed memory.
5. Computers communicate with each other through message passing.
6. Improves system scalability, fault tolerance and resource sharing.
Parallel Computing:
• Tightly Coupled: The processors in parallel computing are closely connected and often located within the same machine or server.
• High-Speed Interconnect: Communication between processors is fast, as they share a common memory and high-speed interconnect.
• Example: A program is designed to perform a complex calculation. Different parts of the calculation are assigned to different processors, and they work simultaneously on their respective portions. The processors communicate through shared memory.

Distributed Computing:
• Loosely Coupled: Distributed systems are made up of loosely connected processors, possibly located in different physical locations, and they communicate with each other through a network.
• Variable Interconnect Speed: Communication between processors can be slower than in parallel systems due to the use of networks, and the speed depends on the network infrastructure.
• Example: A web server farm is a common example of distributed computing. Multiple servers located in different geographical locations handle user requests. Each server has its own memory and processes requests independently. They communicate with each other through the network to share information and balance the load.
APPLICATIONS…
Applications of Parallel Computing

• Databases and Data mining
• The real-time simulation of systems
• Networked videos and Multimedia
• Science and Engineering
• Collaborative work environments
• Augmented reality, advanced graphics, and virtual reality
• Databases and Data Mining:
– Databases: Parallel computing can be used in databases for tasks like querying and indexing. Multiple processors can simultaneously handle different parts of a large
database, improving query performance.
– Data Mining: In parallel, different algorithms can analyze subsets of a dataset simultaneously, aiding tasks like pattern recognition or anomaly detection. For example,
parallel processing can be applied to find patterns in customer behavior in a large sales database.
• Real-Time Simulation of Systems:
– Parallel computing helps simulate complex systems in real-time. For instance, in a flight simulator, different processors can handle various aspects like aerodynamics,
engine simulation, and weather conditions simultaneously, providing a more realistic and responsive simulation experience.
• Networked Videos and Multimedia:
– In distributed computing, multiple servers across a network can collaboratively deliver videos or multimedia content. Each server can handle a portion
of the streaming or processing, ensuring smooth playback and efficient content distribution. Content delivery networks (CDNs) are a common example.
• Science and Engineering:
– Parallel computing is widely used in scientific simulations. For example, in physics simulations, multiple processors can simultaneously calculate
different aspects of a complex physical system, such as fluid dynamics or particle interactions, to accelerate research and experimentation.
• Collaborative Work Environments:
– In a distributed computing environment, collaborative tools can allow users from different locations to work together in real-time. For instance, in a
shared document editing platform, distributed servers enable multiple users to collaborate on a document simultaneously, with changes synchronized
across the network.
• Augmented Reality, Advanced Graphics, and Virtual Reality:
– Parallel computing is essential for rendering realistic graphics in augmented reality (AR), advanced graphics, and virtual reality (VR) applications.
Multiple processors work together to handle the complex calculations required for rendering detailed and immersive visual experiences. For example,
in a VR game, parallel processing helps maintain a high frame rate for smooth and realistic gameplay.
• Augmented Reality (AR):
• Definition:
– Augmented Reality (AR) refers to the overlaying of digital information or content onto the real-world environment. It
enhances the user's perception of the physical world by adding computer-generated elements, such as images, sounds,
or data.
• User Interaction:
– Users in AR still interact with the real world. Digital content is superimposed on the physical environment, allowing
users to see and interact with both the real and virtual elements simultaneously.
• Environment:
– AR does not replace the real world; instead, it supplements and enhances it. Users can view the real world through
devices like smartphones, tablets, or AR glasses, with digital content integrated into their field of view.
• Examples:
– Pokémon Go is a popular AR application where digital Pokémon characters are overlaid onto the real-world
surroundings through a smartphone's camera.
– AR navigation apps display directional arrows and information on the screen while users navigate the streets.
• Virtual Reality (VR):
• Definition:
– Virtual Reality (VR) creates a completely immersive, computer-generated environment that replaces the real world. Users
wearing VR headsets are transported to a simulated, 3D environment that can be entirely different from their physical
surroundings.
• User Interaction:
– In VR, users are fully immersed in a virtual environment. The physical world is blocked out, and users interact with the
digital environment through specialized VR equipment, such as headsets and controllers.
• Environment:
– VR replaces the real world with a computer-generated environment. Users may feel as though they are in a different place
altogether, and the experience is not influenced by the physical surroundings.
• Examples:
– VR gaming involves users being fully immersed in a digital game environment, interacting with the virtual world and
characters.
– VR simulations are used for training purposes, such as flight simulators for pilots or medical simulations for healthcare
professionals.
• Summary of Differences:
• Reality Interaction:
– AR enhances real-world interactions by overlaying digital content on the physical environment.
– VR creates a virtual environment that completely replaces the real world.
• User Experience:
– AR allows users to see and interact with both real and virtual elements simultaneously.
– VR immerses users entirely in a computer-generated environment, blocking out the real world.
• Devices:
– AR is often experienced through devices like smartphones, tablets, or AR glasses.
– VR requires specialized equipment like VR headsets and controllers for a fully immersive
experience.
• Augmented Reality (AR) Examples:
• Pokémon Go:
– In Pokémon Go, players use their smartphones to explore the real world, and the game's AR feature overlays digital Pokémon characters onto the physical
environment. Players can see these Pokémon as if they exist in the real world through the phone's camera.
• AR Navigation Apps:
– Some navigation apps use AR to provide real-time information about the user's surroundings. For example, AR arrows and directions can be displayed on the
smartphone screen, guiding users through the streets as they walk.
• IKEA Place:
– The IKEA Place app allows users to virtually place furniture in their real-world environment using AR. Users can use their smartphone cameras to see how
different pieces of furniture would look and fit in their homes.
• Virtual Reality (VR) Examples:
• VR Gaming:
– VR gaming provides an immersive experience where users wear VR headsets and are transported to virtual worlds. Games like Beat Saber allow players to
interact with the virtual environment using handheld controllers, slashing through blocks to the beat of the music.
• Google Earth VR:
– Google Earth VR enables users to explore the world in a virtual environment. Users can fly over cities, zoom into specific locations, and experience a sense of
scale that goes beyond what is possible in traditional mapping applications.
• VR Roller Coaster Simulations:
– VR roller coaster simulations provide a thrilling experience where users, wearing VR headsets, feel as if they are riding a roller coaster while sitting in a
stationary chair. The visual and auditory effects create a realistic sensation of speed and drops.
Types of
Parallel Computing
Types…
3 Types of Parallel Computing
• Bit-Level Parallelism
• Instruction-Level Parallelism
• Task Parallelism
Types…
Bit-Level Parallelism
Bit-level parallelism is based on the processor word size. When an operation is performed on data larger than the word size, the operation must be split into a series of instructions; a larger word size therefore reduces the number of instructions the processor must execute.
For example, with an 8-bit processor, an operation on 16-bit numbers must first operate on the 8 lower-order bits and then on the 8 higher-order bits, so two instructions are needed to execute the operation. A 16-bit processor can perform the operation with one instruction.
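A small illustrative sketch of this in Python (real processors do this in hardware, not in software; the values and bit masks are just examples):

```python
# Sketch: adding two 16-bit numbers on an "8-bit" machine takes two steps
# (low byte, then high byte with carry); a 16-bit machine does it in one.
def add16_on_8bit(a, b):
    """Emulate a 16-bit addition using only 8-bit operations."""
    lo = (a & 0xFF) + (b & 0xFF)                          # step 1: low-order 8 bits
    carry = lo >> 8
    hi = ((a >> 8) & 0xFF) + ((b >> 8) & 0xFF) + carry    # step 2: high-order bits
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

def add16_on_16bit(a, b):
    """A 16-bit processor performs the same addition in a single instruction."""
    return (a + b) & 0xFFFF

a, b = 0x1234, 0x0FCD
assert add16_on_8bit(a, b) == add16_on_16bit(a, b) == 0x2201
```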
• Parallel Processing in Cryptography:
– Example: Bit-level parallelism is crucial in cryptographic algorithms like bitwise XOR operations used in
encryption and decryption. In parallel computing, multiple processors can perform XOR operations on
different bits of the data simultaneously, improving the overall processing speed.
• Parallel Bitwise Operations in Image Processing:
– Example: In image processing tasks, bitwise operations (e.g., AND, OR, XOR) are commonly used for tasks like
image masking or merging. Parallelizing these operations allows different processors to handle distinct
portions of the image, enhancing the speed of the overall processing.
• Distributed Binary Search:
– Example: In a distributed computing environment, a binary search algorithm can be parallelized at the bit
level. Each node in the distributed system can handle a specific range of bits, searching for the target value
simultaneously. This approach improves the efficiency of the search process.
• Parallel Compression and Decompression:
– Example: Bit-level parallelism is employed in parallel compression and decompression algorithms, such as
those used in distributed file systems. Multiple processors can work on compressing or decompressing
different sections of a file concurrently, enhancing the overall throughput.
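As a rough illustration of the XOR example above, a sketch in Python assuming the standard multiprocessing module (the single-byte key and the chunking scheme are simplifications, not a real cipher):

```python
# Sketch: XOR-mask a byte string by splitting it into chunks and letting
# several processes work on different chunks at the same time.
from multiprocessing import Pool

KEY = 0x5A  # illustrative single-byte key; real ciphers are far more involved

def xor_chunk(chunk):
    return bytes(b ^ KEY for b in chunk)

if __name__ == "__main__":
    data = bytes(range(256)) * 1000
    workers = 4
    size = len(data) // workers
    chunks = [data[i * size:(i + 1) * size] for i in range(workers)]

    with Pool(processes=workers) as pool:
        masked = b"".join(pool.map(xor_chunk, chunks))    # chunks masked in parallel

    # XOR with the same key again restores the original data.
    assert bytes(b ^ KEY for b in masked) == data
```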
Types…
Instruction-Level Parallelism
In instruction-level parallelism, the processor determines how many instructions can be executed at the same time within a single CPU clock cycle.
In each clock cycle, such a processor can have more than one instruction in progress. The software approach to instruction-level parallelism relies on static parallelism, where the compiler decides which instructions to execute simultaneously; the hardware approach relies on dynamic parallelism, where the processor decides at run time.
Types…
Instruction-Level Parallelism refers to the execution of multiple instructions simultaneously within a single processor. It
involves breaking down a program into smaller instruction-level tasks that can be executed concurrently to improve
overall performance. ILP is primarily concerned with how multiple instructions from a single program can be scheduled
and executed simultaneously. There are various techniques and approaches to exploit ILP:
• Superscalar Processors:
– Definition: Superscalar processors can execute multiple instructions in parallel by having multiple execution units.
– Example: In a superscalar processor, multiple arithmetic or logic operations can be performed simultaneously, exploiting ILP to
improve instruction throughput. For instance, while one execution unit is performing a multiplication, another unit can be executing
an addition operation.
• Vector Processing:
– Definition: Vector processors operate on arrays or vectors of data in a single instruction, enabling parallel processing of multiple
data elements.
– Example: In scientific simulations within parallel computing, vector processing is often used. For instance, in weather simulations,
vector processors can simultaneously perform calculations on arrays representing different geographical points.
• SIMD (Single Instruction, Multiple Data):
– Definition: SIMD architecture allows a single instruction to be applied to multiple data elements simultaneously.
– Example: In distributed computing environments, SIMD can be employed for tasks like image processing. Each node in the
distributed system can apply the same operation to different parts of a large image in parallel, speeding up the overall processing
time.
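A small sketch of the SIMD idea in Python, assuming the NumPy library is available (NumPy's vectorized operations are typically backed by SIMD hardware, though the exact instructions used depend on the CPU and build):

```python
# Sketch: one vectorized operation applied to many data elements at once,
# instead of an explicit element-by-element loop.
import numpy as np

temps_celsius = np.array([12.0, 15.5, 21.3, 18.2, 9.7, 25.1])

# Scalar style: one element per step.
fahrenheit_loop = np.array([c * 9.0 / 5.0 + 32.0 for c in temps_celsius])

# SIMD style: the same arithmetic expressed once over the whole array;
# NumPy (and the underlying SIMD units) process many elements per operation.
fahrenheit_vec = temps_celsius * 9.0 / 5.0 + 32.0

assert np.allclose(fahrenheit_loop, fahrenheit_vec)
```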
Types…
Task Parallelism
Task parallelism is the form of parallelism in which a task is decomposed into subtasks. Each subtask is then allocated to a processor for execution, and the subtasks are executed concurrently.
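A minimal sketch of this in Python, assuming the standard concurrent.futures and statistics modules (the three subtasks and the sample readings are illustrative):

```python
# Sketch: decompose a job into independent subtasks and run them concurrently.
from concurrent.futures import ProcessPoolExecutor
import statistics

def summarize(values):        # subtask 1
    return ("mean", statistics.mean(values))

def extremes(values):         # subtask 2
    return ("min/max", (min(values), max(values)))

def spread(values):           # subtask 3
    return ("stdev", statistics.stdev(values))

if __name__ == "__main__":
    readings = [3.1, 4.7, 2.9, 5.5, 4.2, 3.8]
    subtasks = [summarize, extremes, spread]

    # Each subtask is allocated to a worker and executed concurrently.
    with ProcessPoolExecutor(max_workers=len(subtasks)) as pool:
        futures = [pool.submit(task, readings) for task in subtasks]
        for future in futures:
            name, result = future.result()
            print(name, result)
```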
Types…
• Task: Image Processing
• Sequential Approach:
– In a traditional, sequential image processing algorithm, each pixel is processed one after the other.
• Task-Level Parallelism:
– Divide the image into distinct regions or segments.
– Assign each segment to a separate processor or core.
– Process each segment concurrently.
• Benefits:
• Faster image processing as multiple segments are processed simultaneously.
• Utilizes the capabilities of multi-core processors.

• Task: Simulation of Physical Phenomena
• Sequential Approach:
– Simulating a complex physical phenomenon, like fluid dynamics, sequentially.
• Task-Level Parallelism:
– Divide the simulation domain into smaller subdomains.
– Assign each subdomain to a different processor or node.
– Simulate each subdomain concurrently.
• Benefits:
• Accelerated simulation time as different parts of the domain are simulated simultaneously.
• Effective utilization of a high-performance computing cluster.
Advantages of
Parallel Computing
Advantages…
Advantages of Parallel Computing

• More resources are used to complete the task, which reduces the time and can cut costs. Also, inexpensive components can be used to construct parallel clusters.
• Compared with serial computing, parallel computing can solve larger problems in a shorter time.
• Parallel computing is much better suited than serial computing for simulating, modeling, and understanding complex, real-world phenomena.
Advantages…
Advantages of Parallel Computing
• When local resources are limited, it can take advantage of non-local resources.
• Many problems are so large that solving them on a single computer is impractical or impossible; parallel computing helps to overcome these issues.
• One of the main advantages of parallel computing is that it allows several things to be done at the same time by using multiple computing resources.
• Furthermore, parallel computing makes better use of the hardware, whereas serial computing wastes potential computing power.
Advantages…
• Faster Processing Speed:
– Example: Parallel computing is widely used in scientific simulations. For instance, in weather forecasting models, the workload can be divided
among multiple processors to simulate different regions of the atmosphere concurrently, significantly reducing the time required for complex
calculations.
• Improved Throughput:
– Example: In a parallel database system, multiple processors can handle queries and transactions simultaneously. This leads to improved
throughput, allowing more users to access and retrieve data concurrently, enhancing overall database performance.
• Scalability:
– Example: Parallel computing systems can easily scale to handle larger workloads. In the context of web servers, as the number of users accessing
a website increases, additional servers can be added to the network to distribute the load and maintain responsiveness.
• Resource Utilization:
– Example: Parallelism is crucial in multimedia applications. In video rendering, parallel processing can be employed to simultaneously handle
different frames or segments of a video, making efficient use of computational resources and speeding up the rendering process.
• High Performance for Parallel Algorithms:
– Example: Sorting large datasets is a common operation in many applications. Parallel computing can significantly accelerate sorting algorithms by distributing the sorting task among multiple processors, allowing for faster and more efficient data processing (a brief sketch follows this list).
• Redundancy and Fault Tolerance:
– Example: In distributed systems, redundancy can be achieved by replicating data or tasks across multiple nodes. If one node fails, the system can
still function using the redundant information from other nodes. This enhances fault tolerance and system reliability.
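As mentioned in the sorting item above, a rough Python sketch assuming the standard multiprocessing and heapq modules (the chunk count and data are illustrative):

```python
# Sketch: sort a large list by sorting chunks in parallel, then merging
# the already-sorted chunks into one ordered result.
import heapq
import random
from multiprocessing import Pool

if __name__ == "__main__":
    data = [random.randint(0, 1_000_000) for _ in range(1_000_000)]
    workers = 4
    size = len(data) // workers
    chunks = [data[i * size:(i + 1) * size] for i in range(workers)]

    with Pool(processes=workers) as pool:
        sorted_chunks = pool.map(sorted, chunks)      # chunks sorted in parallel

    merged = list(heapq.merge(*sorted_chunks))        # combine the sorted pieces
    assert merged == sorted(data)
```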
Advantages…
• Diverse Applications:
– Example: Parallel computing is employed in machine learning, particularly in training
deep neural networks. Parallel processing on GPUs accelerates the training process by
simultaneously updating the weights of different parts of the neural network.
• Cost-Effectiveness:
– Example: Cloud computing platforms offer parallel processing capabilities, allowing
users to scale their computing resources based on demand. This flexibility provides cost-
effective solutions for tasks that require varying levels of computational power.
• Scientific and Engineering Simulations:
– Example: Finite Element Analysis (FEA) in engineering simulations involves solving
complex equations to model structural behavior. Parallel computing accelerates FEA
simulations by distributing the computational workload among multiple processors.
Disadvantages of
Parallel Computing
Disadvantages…
Disadvantages of Parallel Computing
• It requires a parallel architecture, which can be difficult to achieve.
• In the case of clusters, better cooling technologies are needed for parallel computing.
• It requires carefully managed algorithms that can be handled by the parallel mechanism.
• Multi-core architectures have high power consumption.
Disadvantages…
Disadvantages of Parallel Computing
• A parallel computing system needs low coupling and high cohesion, which is difficult to achieve.
• The code for a parallelism-based program can only be written by highly skilled and expert programmers.
