Lec6 - Modern Development Concept - Concurrency

Modern development concept – Concurrency
(Understanding concurrency, parallelism and multithreading)
Lecture 6
Concurrency
Concurrency means multiple computations are happening at the same time. Concurrency is everywhere in modern programming, whether we like it or not:
Multiple computers in a network.
Multiple applications running on one computer.
Multiple processors in a computer (today, often multiple processor cores on a single chip).
In fact, concurrency is essential in modern programming:
Web sites must handle multiple simultaneous users.
Graphical user interfaces almost always require background work that does not interrupt the user.
For example, Eclipse compiles your Java code in the background while you continue editing.
Consider that you are given the task of singing and eating at the same time. At any given instant you can either sing or eat, since both involve your mouth. So to do this, you would eat for some time, then sing, and repeat until your food is finished or the song is over. You performed your tasks concurrently.
Concurrency means executing multiple tasks at the same
time but not necessarily simultaneously. In a concurrent
application, two tasks can start, run, and complete in
overlapping time periods i.e. Task-2 can start even before
Task-1 gets completed.
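The overlapping execution described above can be sketched with two Java threads. This is a minimal illustration, not part of the lecture's examples; the class and task names are made up, and the exact interleaving of the output depends on the scheduler.

```java
public class OverlappingTasks {
    // Each task prints a few steps; Thread.yield() hints the scheduler
    // to interleave this task with the other one.
    public static void runTask(String name, int steps) {
        for (int i = 1; i <= steps; i++) {
            System.out.println(name + " step " + i);
            Thread.yield();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(() -> runTask("Task-1", 3));
        Thread t2 = new Thread(() -> runTask("Task-2", 3));
        t1.start();
        t2.start();   // Task-2 starts even before Task-1 has completed
        t1.join();
        t2.join();
    }
}
```

The two tasks run in overlapping time periods, which is exactly the definition of concurrency given above.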
In the computer science world, the way concurrency is achieved differs across processors. In a single-core environment (i.e., your processor has a single core), concurrency is achieved via a process called context switching. In a multi-core environment, concurrency can be achieved through parallelism.

Execution of tasks in a single core environment. Tasks are context switched between one another.
What is Context Switch?
A context switch (also called process switch or a task
switch) is the switching of the CPU from one process or
thread to another.
Context switching can be described as the kernel (i.e.,
the core of the operating system) performing the
following activities with regard to processes (including
threads) on the CPU:

(1) suspending the progression of one process and storing the CPU's state (i.e., the context) for that process somewhere in memory,
(2) retrieving the context of the next process from memory and restoring it in the CPU's registers, and
(3) returning to the location indicated by the program counter (i.e., returning to the line of code at which the process was interrupted) in order to resume the process.
Parallelism
Consider that you are given two tasks: cooking and speaking to your friend over the phone. You could do these two things simultaneously, cooking while you speak over the phone. Now you are doing your tasks in parallel.
Parallelism means performing two or more tasks
simultaneously. Parallel computing in computer science
refers to the process of performing multiple calculations
simultaneously.

Two tasks are being performed simultaneously over the same time period.
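Parallel computation can be sketched with Java's parallel streams, which split work across processor cores. This is an illustrative example added here, not one from the lecture; the class name is made up.

```java
import java.util.stream.LongStream;

public class ParallelSum {
    // The range is split into chunks; each core sums its chunk
    // simultaneously, and the partial sums are combined.
    public static long sum(long n) {
        return LongStream.rangeClosed(1, n).parallel().sum();
    }

    public static void main(String[] args) {
        System.out.println(sum(1_000_000)); // 500000500000
    }
}
```

On a single-core machine the same code still runs correctly, but the chunks are context switched rather than executed simultaneously.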
Processes and Threads
Process: A process is an instance of a running
program that is isolated from other processes on
the same machine. In particular, it has its own
private section of the machine’s memory. The
process abstraction is a virtual computer. It makes
the program feel like it has the entire machine to
itself – like a fresh computer has been created, with
fresh memory, just to run that program.

Thread: A thread is a segment of a process: a process can have multiple threads, and these threads are contained within the process. A thread takes less time to create and terminate than a process, and unlike processes, threads are not isolated from one another. A thread is therefore called a lightweight process. Threads share memory.
Distribution of Processes and Threads in an
application.
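Because threads within a process share memory, a value written by one thread can be read by another. A minimal sketch of this (class, method, and field names are illustrative, not from the lecture):

```java
public class SharedMemoryDemo {
    // One field in the process's memory; every thread in the process sees it.
    static volatile int shared = 0;

    static int readAfterWrite() throws InterruptedException {
        Thread writer = new Thread(() -> shared = 42); // another thread writes
        writer.start();
        writer.join();   // join guarantees the write is visible to this read
        return shared;   // the main thread reads what the writer stored
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("Main thread sees: " + readAfterWrite());
    }
}
```

Two separate processes could not share a field this way, because each process has its own private memory.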
Synchronous and Asynchronous
Synchronous:
Imagine you are asked to write two letters, one to your mom and another to your best friend. You cannot write the two letters at the same time. In a synchronous programming model, tasks are executed one after another. Each task waits for any previous task to complete, then gets executed.

Asynchronous:
Imagine you are asked to make a sandwich and to wash your clothes in a washing machine. You could put your clothes in the washing machine and, without waiting for it to finish, go and make the sandwich. Here you performed these two tasks asynchronously.
In an asynchronous programming model, when one task is executing, you can switch to a different task without waiting for the previous one to complete.
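The sandwich-and-laundry scenario maps naturally onto Java's CompletableFuture. This is an illustrative sketch, not code from the lecture; the class and method names are made up.

```java
import java.util.concurrent.CompletableFuture;

public class AsyncDemo {
    public static String makeSandwich() {
        return "sandwich ready";
    }

    public static void main(String[] args) {
        // Start the "washing machine" task and do NOT wait for it...
        CompletableFuture<String> laundry =
            CompletableFuture.supplyAsync(() -> "laundry done");

        // ...switch to the other task while the first runs in the background.
        String sandwich = makeSandwich();
        System.out.println(sandwich);

        // Finally collect the background result once it is needed.
        System.out.println(laundry.join());
    }
}
```

A synchronous version would instead block on the laundry task first and only then start the sandwich.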
Synchronous in a single and multi-threaded environment.
Synchronous
Single Threaded:
Each task gets executed one after another. Each task waits for its previous task to finish.
Multi-Threaded:
Tasks get executed in different threads, but each waits for any other executing task on any other thread.
Asynchronous in a single and multi-threaded environment.
Asynchronous
Single Threaded:
Tasks start executing without waiting for a different task to finish. At a given time only a single task executes.
Multi-Threaded:
Tasks get executed in different threads without waiting for other tasks, and finish their executions independently.
Two Models for Concurrent Programming
Shared memory: In the shared memory model of
concurrency, concurrent modules interact by
reading and writing shared objects in memory.
Examples of the shared-memory model:
• A and B might be two processors (or processor
cores) in the same computer, sharing the same
physical memory.
• A and B might be two programs running on the
same computer, sharing a common file system
with files they can read and write.
• A and B might be two threads in the same Java
program, sharing the same Java objects.
Message passing: In the message-passing model,
concurrent modules interact by sending messages to
each other through a communication channel.
Modules send off messages, and incoming messages
to each module are queued up for handling.
Examples include:
• A and B might be two computers in a
network, communicating by network
connections.
• A and B might be a web browser and a web
server – A opens a connection to B and asks for a
web page, and B sends the web page data back
to A.
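In Java, the message-passing model can be sketched with a blocking queue acting as the communication channel. This is an illustrative sketch added here (class names and the message string are made up), not code from the lecture.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MessagePassingDemo {
    static String roundTrip() throws InterruptedException {
        // The queue is the channel; incoming messages are queued up.
        BlockingQueue<String> channel = new ArrayBlockingQueue<>(10);

        // Module A sends a message to the channel.
        Thread moduleA = new Thread(() -> {
            try {
                channel.put("GET /index.html");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        moduleA.start();

        // Module B (here, the calling thread) takes the message off the queue.
        String message = channel.take();
        moduleA.join();
        return message;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("B received: " + roundTrip());
    }
}
```

Note that the modules never touch each other's variables; they interact only through the channel, which is the defining property of message passing.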
Problems:
Concurrency introduces a new set of possible errors:

Race conditions:
The possibility of incorrect results in the presence of unlucky timing is so important in concurrent programming that it has a name: race conditions. The most common type of race condition is check-then-act.

Example 1: Thread-Unsafe Sequence.
Let’s consider the Sequence.java class in Example 1; it is susceptible to lost updates. The counter increment instruction may look like a single action because of its compact syntax, but it is not atomic, meaning that it does not execute as a single indivisible operation. Instead it is shorthand for a sequence of three discrete operations: fetch the current value, add one to it, and write the new value back. This is an example of a read-modify-write (or check-then-act) operation, in which the resulting state is derived from a previous state.
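The code of Example 1 is not reproduced in this copy of the slides. A minimal sketch consistent with the description above (the class name UnsafeSequence is an assumption) might look like:

```java
public class UnsafeSequence {
    private int value;

    // "value++" looks atomic, but compiles to three steps:
    // fetch value, add one, write back. Two threads can both fetch
    // the same value, and one of the two updates is then lost.
    public int next() {
        return value++;
    }
}
```

With a single thread this behaves correctly; the bug only appears when two threads interleave inside next().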

Your warning bell should ring when you see a check-then-act operation; it is prone to race conditions.
Consider the case of two threads (A and B) as shown in
Image 1. Due to bad timing, the value may miss
updates.

Image 1: Race condition in Sequence.java (Example 1)

We can solve this issue by making sure that only one thread at a time can perform the operation. The problem is solved by making the method synchronized.
Example 2: Thread-Safe Sequence.
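The code of Example 2 is likewise not reproduced here. A minimal sketch of the synchronized fix (the class name SafeSequence is an assumption) might look like:

```java
public class SafeSequence {
    private int value;

    // synchronized ensures only one thread at a time performs the
    // read-modify-write, so no update is lost.
    public synchronized int next() {
        return value++;
    }
}
```

Each thread must now acquire the object's lock before entering next(), which serializes the three discrete steps into an effectively atomic operation.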
Deadlock:
A deadlock occurs when a
thread enters a waiting state
because a requested resource
is held by another waiting
thread, which in turn is waiting
for another resource held by
another waiting thread. If a
thread is unable to change its
state indefinitely because the
resources requested by it are
being used by another waiting
thread, then the system is said
to be in a deadlock.
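The circular wait described above, and one standard way to avoid it, can be sketched in Java. This is an illustrative example (the class and lock names are made up), not code from the lecture; the deadlock-prone interleaving itself is described in comments rather than executed, since running it could hang forever.

```java
import java.util.concurrent.locks.ReentrantLock;

public class DeadlockSketch {
    static final ReentrantLock resource1 = new ReentrantLock();
    static final ReentrantLock resource2 = new ReentrantLock();

    // Deadlock-prone pattern: thread A locks resource1 then waits for
    // resource2, while thread B locks resource2 then waits for resource1.
    // Each holds what the other needs -> circular wait -> deadlock.

    // One standard fix: every thread acquires the locks in the SAME
    // order, which makes a circular wait impossible.
    static void useBothInOrder(String name) {
        resource1.lock();
        try {
            resource2.lock();
            try {
                System.out.println(name + " used both resources");
            } finally {
                resource2.unlock();
            }
        } finally {
            resource1.unlock();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread a = new Thread(() -> useBothInOrder("A"));
        Thread b = new Thread(() -> useBothInOrder("B"));
        a.start();
        b.start();
        a.join();
        b.join();   // both threads finish; no deadlock
    }
}
```

Consistent lock ordering is only one of several remedies; others include lock timeouts and acquiring all needed resources at once.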
Summing up!
Concurrency and Parallelism -> The way tasks are executed.

Synchronous and Asynchronous -> Programming model.

Single Threaded and Multi-Threaded -> The environment in which tasks are executed.