L12: Principles of Message Passing
Principles of Message-Passing Programming (1/3)
The paradigm is one of the oldest and most widely used
approaches for programming parallel computers.
Its roots can be traced back to the early days of parallel processing,
and it remains widely adopted today.
There are two key attributes that characterize the
paradigm:
it assumes a partitioned address space, and
it supports only explicit parallelization.
Principles of Message-Passing Programming (2/3)
Each data element must belong to one of the
partitions of the space:
data must be explicitly partitioned and placed
(see the partitioning sketch below)
this encourages locality of access
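A minimal sketch (not taken from the slides) of explicit partitioning and placement with MPI: each process allocates and initializes only the block of a conceptually global array that it owns. The global size N and the block distribution are illustrative assumptions.

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const int N = 1000;                              /* global problem size (assumed) */
    int base = N / size, rem = N % size;
    int local_n = base + (rank < rem ? 1 : 0);       /* number of elements owned here */
    int offset  = rank * base + (rank < rem ? rank : rem);  /* first global index owned */

    /* Only the local block lives in this process's (partitioned) address space. */
    double *local = malloc(local_n * sizeof(double));
    for (int i = 0; i < local_n; i++)
        local[i] = (double)(offset + i);             /* initialize from the global index */

    printf("rank %d owns global elements [%d, %d)\n", rank, offset, offset + local_n);

    free(local);
    MPI_Finalize();
    return 0;
}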
All interactions (read-only or read/write) require the
cooperation of two processes:
the process that has the data and
the process that wants to access the data
(see the send/receive sketch below).
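A hedged sketch of such a cooperative interaction using MPI point-to-point calls; the ranks and the value transferred are assumptions made for illustration, and it needs at least two processes.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double value = 0.0;
    if (rank == 0) {
        /* the process that has the data must send it explicitly */
        value = 3.14;
        MPI_Send(&value, 1, MPI_DOUBLE, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        /* the process that wants the data must post a matching receive */
        MPI_Recv(&value, 1, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank 1 received %f\n", value);
    }

    MPI_Finalize();
    return 0;
}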
Principles of Message-Passing Programming (3/3)
Advantage of explicit two-way interactions:
the programmer is fully aware of all the costs of nonlocal
interactions, and is more likely to think about algorithms
(and mappings) that minimize interactions.
The paradigm can be efficiently implemented on a wide
variety of architectures.
Disadvantage:
For dynamic and/or unstructured interactions, the
complexity of the code written in this paradigm
can be very high.
Programming Issues (1/5)
Parallelism is coded explicitly by the programmer:
The programmer is responsible
for analyzing the underlying serial algorithm/application and
identifying ways to decompose the computations and
extract concurrency (see the partial-sum sketch after this list).
Programming using the paradigm tends to be hard and
intellectually demanding.
Properly written message-passing programs can often achieve
very high performance and scale to a very large number of
processes.
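As a hedged illustration of such a decomposition, the sketch below splits a serial sum over N numbers into per-process partial sums that are combined with a reduction; N and the range split are assumed for illustration.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    const long N = 1000000;              /* assumed problem size */
    long lo = rank * N / size;           /* this process's sub-range */
    long hi = (rank + 1) * N / size;

    double partial = 0.0;
    for (long i = lo; i < hi; i++)       /* concurrency extracted from the serial loop */
        partial += (double)i;

    double total = 0.0;
    MPI_Reduce(&partial, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %.0f\n", total);

    MPI_Finalize();
    return 0;
}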
Programming Issues (2/5)
Programs are written using the asynchronous or
loosely synchronous paradigms.
In the asynchronous paradigm:
all concurrent tasks execute asynchronously.
such programs can be harder to reason about and
can have non-deterministic behavior (see the sketch below)
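A hedged sketch of one source of non-determinism: when rank 0 receives with MPI_ANY_SOURCE, the order in which the messages are matched (and printed) can differ from run to run; the message content is a placeholder.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {
        for (int i = 1; i < size; i++) {
            int who;
            MPI_Status status;
            /* matching order depends on message arrival, hence non-deterministic output */
            MPI_Recv(&who, 1, MPI_INT, MPI_ANY_SOURCE, 0, MPI_COMM_WORLD, &status);
            printf("received a message from rank %d\n", status.MPI_SOURCE);
        }
    } else {
        MPI_Send(&rank, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
    }

    MPI_Finalize();
    return 0;
}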
Programming Issues (3/5)
Loosely synchronous programs:
tasks or subsets of tasks synchronize to perform
interactions
between these interactions, tasks execute
completely asynchronously
Since the interaction happens synchronously, it is
still quite easy to reason about the program (a sketch follows below)
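A hedged sketch of the loosely synchronous style: each iteration performs purely local (asynchronous) work and then all tasks synchronize through a collective interaction; the local computation is a placeholder.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double local = (double)rank;
    for (int iter = 0; iter < 3; iter++) {
        local = 0.5 * local + 1.0;       /* local phase: no interaction with other tasks */

        double global;
        /* interaction phase: all tasks synchronize on the collective */
        MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0)
            printf("iteration %d: global value = %f\n", iter, global);
    }

    MPI_Finalize();
    return 0;
}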
Programming Issues (4/5)
The paradigm supports execution of a different
program on each of the p processes.
This provides the ultimate flexibility in parallel
programming, but
makes the job of writing parallel programs
effectively unscalable (see the launch example below).
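As a hedged illustration (the exact syntax is launcher-dependent), MPI launchers typically accept a colon-separated MPMD form such as mpiexec -n 1 ./manager : -n 4 ./worker, which starts different executables within a single parallel job; the program names here are placeholders.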
Programming Issues (5/5)
⇒ most programs are written using the single
program multiple data (SPMD) approach.
In SPMD programs the code executed by different
processes is identical except for a small number of
processes (e.g., the "root" process); a minimal sketch follows below.
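A hedged SPMD sketch: every process runs the same executable, and only the assumed root (rank 0) takes a slightly different path before a broadcast makes the data available everywhere; the parameter value is a placeholder.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    int parameter = 0;
    if (rank == 0)
        parameter = 42;                  /* root-only work (assumed value) */

    /* identical code from here on: every process joins the broadcast */
    MPI_Bcast(&parameter, 1, MPI_INT, 0, MPI_COMM_WORLD);
    printf("rank %d sees parameter %d\n", rank, parameter);

    MPI_Finalize();
    return 0;
}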