
Concurrency Mechanism

Concurrency mechanisms allow segments of a program to execute simultaneously to increase efficiency. There are two fundamental problems in concurrent programming: synchronization of tasks so information can be transferred, and preventing simultaneous updates to shared data. Approaches to concurrency include shared variables, asynchronous message passing, and synchronous message passing. Shared variables allow processes to access common memory but require synchronization methods like test and set. Asynchronous message passing uses buffers between tasks. Synchronous message passing uses rendezvous where tasks synchronize during input/output.


CONCURRENCY MECHANISM

Presented by,
LINA KASIM
S5 BCA
Definition

Two or more segments of a program can execute concurrently if the effect of executing the segments is independent of the order in which they are executed. We refer to these program segments as tasks or processes. On multiple-processor machines, independent code segments can be executed simultaneously to achieve increased processing efficiency. On a single-processor machine, the execution of independent code segments can be interleaved for the same purpose: one task may be able to execute while another is waiting for an external event to occur or for completion of an I/O operation.

FUNDAMENTAL PROBLEMS

The trend toward multiple-processor machines and the increasingly sophisticated applications of computers have resulted in higher-level language constructs for specifying concurrent tasks. Two fundamental problems in concurrent programming are synchronization of tasks so that information can be transferred between them, and prevention of simultaneous updating of data that are accessible to more than one task. These are referred to as the synchronization problem and the mutual exclusion problem, respectively.
FUNDAMENTAL APPROACHES

- Shared variables
- Asynchronous message passing
- Synchronous message passing
SHARED VARIABLE APPROACH
In the shared variable approach to concurrency, multiple
processes have access to a common region of memory. The
simplest form of shared variable communication is the
"test and set" approach. When two tasks are to
synchronize, the first task to reach its synchronization
point will test and then set a shared memory cell to indicate
that it is waiting for the second task. When the second task
reaches its synchronization point, it tests the shared cell
and determines that the first task is waiting. Having
synchronized, the two tasks can exchange information,
reset the shared memory cell, and proceed concurrently
until the next synchronization point is reached. Test and set
also can be used to prevent one task from accessing shared
data while another task is updating that data.
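The test-and-set idea can be sketched in Python. Python has no hardware test-and-set instruction, so in this illustrative fragment the atomicity of the operation is simulated with a threading.Lock; the class name TestAndSetCell and the worker setup are assumptions for the example, not part of any standard API.

```python
import threading

class TestAndSetCell:
    """A shared memory cell with a simulated atomic test-and-set.

    Real hardware provides test-and-set as a single atomic
    instruction; here a Lock stands in for that atomicity."""

    def __init__(self):
        self._flag = False
        self._guard = threading.Lock()

    def test_and_set(self):
        # Atomically return the old value and set the cell to True.
        with self._guard:
            old = self._flag
            self._flag = True
            return old

    def reset(self):
        self._flag = False

cell = TestAndSetCell()
counter = 0  # shared data that must not be updated simultaneously

def worker():
    global counter
    for _ in range(1000):
        # Spin until the cell was previously clear: this task now
        # holds exclusive access to the shared data.
        while cell.test_and_set():
            pass
        counter += 1
        cell.reset()  # release the cell so another task may proceed

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4000: no updates were lost
```

Here the shared cell enforces mutual exclusion: each increment happens while the cell is set, so no two tasks update the counter at the same time.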
ASYNCHRONOUS MESSAGE-
PASSING APPROACH

Asynchronous message passing involves association of buffers with concurrent tasks. A sending task places information in a receiving task's buffer, and items are removed from the buffer by the receiver as needed. Removal of items may be on a first-in first-out basis, or according to a priority scheme. A sending task synchronizes with the receiving task when it encounters a full buffer in the receiver; typically the sender is suspended until the receiver retrieves an item from the full buffer. More generally, the sending task may execute alternative actions while waiting for the receiver to retrieve an item. A receiving task synchronizes with the sending task when it attempts to retrieve information from an empty buffer; the receiver must suspend processing or execute alternative actions until the sender places an item in the buffer.
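A minimal sketch of this buffered scheme, using Python's standard queue.Queue as the bounded buffer between two threads (the names sender and receiver and the buffer size of 2 are illustrative choices for the example):

```python
import queue
import threading

buf = queue.Queue(maxsize=2)  # bounded buffer associated with the receiver
received = []

def sender():
    for i in range(5):
        buf.put(i)  # suspends the sender while the buffer is full

def receiver():
    for _ in range(5):
        item = buf.get()  # suspends the receiver while the buffer is empty
        received.append(item)

t1 = threading.Thread(target=sender)
t2 = threading.Thread(target=receiver)
t1.start(); t2.start()
t1.join(); t2.join()
print(received)  # [0, 1, 2, 3, 4]: items leave the buffer first-in first-out
```

queue.Queue removes items first-in first-out; substituting queue.PriorityQueue would give the priority-based removal scheme mentioned above.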
SYNCHRONOUS MESSAGE PASSING
APPROACH
The method of synchronization used in synchronous message passing is known as a
rendezvous. Both symmetric and asymmetric rendezvous are possible. Symmetric
rendezvous requires each process to know the name of the other process involved in the
rendezvous. A symmetric rendezvous occurs in the following manner. The first task to
reach its input or output statement waits for the other task. When the second task reaches
its complementary output or input statement, both tasks execute the I/O statements in
synchronization, with the output from one task being the input to the other task.
Following completion of synchronized I/O, the tasks proceed concurrently.
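A symmetric rendezvous can be sketched in Python with two one-slot queues: the first task to reach its I/O statement waits, and neither task proceeds past the rendezvous point until the value has been handed over. The helper names rendezvous_send and rendezvous_recv are assumptions for this illustration, not a standard API.

```python
import queue
import threading

data = queue.Queue(maxsize=1)  # carries the output of one task to the other
ack = queue.Queue(maxsize=1)   # lets the receiver release the sender

def rendezvous_send(value):
    data.put(value)
    ack.get()  # wait here until the receiver has taken the value

def rendezvous_recv():
    value = data.get()  # wait here until the sender offers a value
    ack.put(None)       # both tasks now proceed concurrently
    return value

results = []

def task_a():
    rendezvous_send("hello")  # blocks until task_b reaches its input

def task_b():
    results.append(rendezvous_recv())

ta = threading.Thread(target=task_a)
tb = threading.Thread(target=task_b)
ta.start(); tb.start()
ta.join(); tb.join()
print(results)  # ['hello']
```

The acknowledgement queue is what makes this a rendezvous rather than asynchronous message passing: the sender cannot run ahead of the receiver.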

An asymmetric rendezvous is similar to a procedure call in that the invoked task need not know the names of its users. It is also similar to a procedure call in that information can be transferred between tasks using parameter lists and global variables.
Thank you
