This document provides an overview of parallel computing concepts. It discusses serial vs parallel computing and Flynn's taxonomy of parallel architectures. Shared memory and distributed memory architectures are described. Common parallel programming models like message passing and data parallel are presented. Designing parallel programs through automatic and manual parallelization is covered. Domain decomposition and functional decomposition are parallelization techniques. Array processing is given as an example parallel algorithm. The conclusion states that parallel computing provides speed advantages and will be increasingly important in the future.

PARALLEL COMPUTING

PRESENTED BY AISHWARYA PRATAP SINGH, CS 3rd YEAR (0836310002)

Overview
Concepts and Terminology
Parallel Computer Memory Architectures
Parallel Programming Models
Designing Parallel Programs
Parallel Algorithm Examples
Conclusion

Concepts and Terminology: What is Parallel Computing?


Traditionally, software has been written for serial computation. Parallel computing is the simultaneous use of multiple compute resources to solve a computational problem.

SERIAL COMPUTING

PARALLEL COMPUTING

Concepts and Terminology: Why Use Parallel Computing?


Saves wall clock time
Cost savings
Overcoming memory constraints
It's the future of computing

Concepts and Terminology: Flynn's Classical Taxonomy


Distinguishes multi-processor architectures by instruction and data streams:
SISD: Single Instruction, Single Data
SIMD: Single Instruction, Multiple Data
MISD: Multiple Instruction, Single Data
MIMD: Multiple Instruction, Multiple Data

Flynn's Classical Taxonomy: SISD


Serial: only one instruction and data stream is acted on during any one clock cycle.

Flynn's Classical Taxonomy: SIMD


All processing units execute the same instruction at any given clock cycle. Each processing unit operates on a different data element.

Flynn's Classical Taxonomy: MISD


Different instructions operate on a single data element. Very few practical uses for this classification. Example: multiple cryptography algorithms attempting to crack a single coded message.

Flynn's Classical Taxonomy: MIMD


Can execute different instructions on different data elements. Most common type of parallel computer.

Parallel Computer Memory Architectures: Shared Memory Architecture


All processors access all memory as a single global address space. Data sharing is fast. Lacks scalability between memory and CPUs.

Parallel Computer Memory Architectures: Distributed Memory


Each processor has its own memory. Is scalable, no overhead for cache coherency. Programmer is responsible for many details of communication between processors.

Parallel Programming Models


Exist as an abstraction above hardware and memory architectures. Examples:
Shared Memory
Threads
Message Passing
Data Parallel
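As an illustrative sketch (not from the original slides), the shared-memory threads model can be shown with Python's threading module: all threads live in one address space and read and write the same variable, so access must be synchronized.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    # All threads share the same address space, so they all see `counter`.
    global counter
    for _ in range(n):
        with lock:  # the lock prevents lost updates from concurrent writes
            counter += 1

threads = [threading.Thread(target=increment, args=(1000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000
```

Without the lock, the `counter += 1` read-modify-write could interleave between threads and lose updates.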

Parallel Programming Models: Message Passing Model


Tasks exchange data by sending and receiving messages. Typically used with distributed memory architectures. Data transfer requires cooperative operations to be performed by each process; e.g., a send operation must have a matching receive operation. MPI (Message Passing Interface) is the industry-standard interface for message passing.

MESSAGE PASSING MODEL
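A minimal sketch of the cooperative send/receive pairing described above, illustrated here with Python's multiprocessing Pipe rather than MPI itself (the structure mirrors an MPI send/recv pair):

```python
from multiprocessing import Process, Pipe

def worker(conn):
    # This receive pairs with the send in the parent process.
    data = conn.recv()
    conn.send([x * x for x in data])  # send results back; parent must receive
    conn.close()

if __name__ == "__main__":
    parent_end, child_end = Pipe()
    p = Process(target=worker, args=(child_end,))
    p.start()
    parent_end.send([1, 2, 3])   # cooperative: every send...
    result = parent_end.recv()   # ...is matched by a receive
    p.join()
    print(result)  # [1, 4, 9]
```

Each process has its own memory; the only way data moves between them is through these explicit message operations.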

Parallel Programming Models: Data Parallel Model


Tasks perform the same operations on a set of data, each task working on a separate piece of the set. Works well with either shared memory or distributed memory architectures.

DATA PARALLEL MODEL
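A minimal data-parallel sketch (an illustration, not from the original slides): every worker applies the same operation, and the input set is partitioned among them by a process pool.

```python
from multiprocessing import Pool

def square(x):
    # The same operation is applied to every element of the data set.
    return x * x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # map() partitions the input among the worker processes.
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```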

Designing Parallel Programs: Automatic Parallelization


The compiler analyzes the code and identifies opportunities for parallelism. The analysis includes attempting to determine whether the parallelism actually improves performance. Loops are the most frequent target for automatic parallelization.

Designing Parallel Programs: Manual Parallelization


Understand the problem


A Parallelizable Problem:
Calculate the potential energy for each of several thousand independent conformations of a molecule. When done, find the minimum-energy conformation.

A Non-Parallelizable Problem: The Fibonacci Series
Each term depends on the previous two, so all calculations are dependent.
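The parallelizable energy problem above can be sketched as follows. Note that `potential_energy` here is a hypothetical stand-in for a real energy function; the point is that the conformations are independent, so each can be evaluated in parallel with no communication.

```python
from multiprocessing import Pool

def potential_energy(conformation):
    # Hypothetical stand-in for a real energy calculation; each conformation
    # is independent, so evaluations need no communication between workers.
    return sum(x * x for x in conformation)

if __name__ == "__main__":
    conformations = [(1.0, 2.0), (0.5, 0.5), (3.0, 1.0), (0.1, 0.2)]
    with Pool() as pool:
        energies = pool.map(potential_energy, conformations)
    best = min(range(len(energies)), key=energies.__getitem__)
    print(best)  # index of the minimum-energy conformation
```

The Fibonacci series admits no such decomposition: term n cannot start until terms n-1 and n-2 are finished.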

Designing Parallel Programs: Domain Decomposition


Each task handles a portion of the data set.

DOMAIN DECOMPOSITION
In this type of partitioning, the data associated with a problem is decomposed.

Designing Parallel Programs: Functional Decomposition


Each task performs a function of the overall work

Parallel Algorithm Examples: Array Processing


Serial Solution
Perform a function on a 2D array. A single processor iterates through each element in the array.

Possible Parallel Solution
Assign each processor a partition of the array. Each process iterates through its own partition.

PARALLEL SOLUTION OF ARRAY PROCESSING
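The parallel solution above can be sketched as follows (an illustration, assuming a simple element-wise function and a row-wise partitioning of the array):

```python
from multiprocessing import Pool

def process_rows(rows):
    # Each worker iterates only over its own partition of the array.
    return [[x + 1 for x in row] for row in rows]

if __name__ == "__main__":
    array = [[i * 4 + j for j in range(4)] for i in range(8)]  # 8x4 array
    n_workers = 4
    chunk = len(array) // n_workers  # rows per worker
    partitions = [array[i * chunk:(i + 1) * chunk] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        pieces = pool.map(process_rows, partitions)
    result = [row for piece in pieces for row in piece]  # reassemble in order
    print(result[0])  # [1, 2, 3, 4]
```

This is domain decomposition in miniature: the data is split, each worker handles its own portion, and the pieces are reassembled at the end.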

Conclusion
Parallel computing is fast. There are many different approaches and models of parallel computing. Parallel computing is the future of computing.

QUESTIONS ??

THANK YOU
