
Learning About Threads & Concurrency
Mosab_2802
Objectives

1. Identify the components of a thread.
2. Illustrate the difference between threads and processes.
3. Explain the major benefits and challenges of designing multithreaded processes.
4. Illustrate different approaches to implicit threading, including thread pools, fork-join, and Grand Central Dispatch.
5. Explain how the Windows and Linux operating systems represent threads.
1. Identify the components of a thread
The general components of a thread include:

● A thread ID uniquely identifying the thread

● A register set representing the status of the processor

● A user stack, employed when the thread is running in user mode, and a kernel stack, employed when the thread is running in kernel mode

● A private storage area used by various run-time libraries and dynamic link libraries (DLLs)

The register set, stacks, and private storage area are known as the context of the thread.
2. Illustrate the difference between threads and processes
● We can think of a thread as the part of a process that actually performs an execution of the program: a sequence of instructions read from the program. There is only one thread in a "traditional" process, but there can be multiple threads, that is, multiple execution sequences, within a single process. The threads can share many parts of the context of the process, such as the program code (the text), variables and other data, open files, and signals and other messages.

● On the other hand, each thread is a separate execution sequence, so certain parts of its context must not be shared. Each thread needs its own program counter, CPU register values, thread ID, and run-time stack for supporting its own function calls (see the sketch below).
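As a concrete illustration (a minimal sketch, not taken from the slides; the variable and function names are made up for the example), the following C/pthreads program creates two threads inside one process. Both threads see the same global variable, while each receives its own argument and runs on its own stack.

#include <pthread.h>
#include <stdio.h>

int shared_counter = 0;                         /* shared: lives in the process's data segment */
pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

void *worker(void *arg) {
    int increment = *(int *)arg;                /* private: lives on this thread's own stack */
    pthread_mutex_lock(&lock);
    shared_counter += increment;                /* both threads update the same variable */
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    int a = 1, b = 2;                           /* each thread gets its own argument */
    pthread_create(&t1, NULL, worker, &a);
    pthread_create(&t2, NULL, worker, &b);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("shared_counter = %d\n", shared_counter);   /* prints 3 */
    return 0;
}

Because the global counter is shared by both threads, the update is protected by a mutex; the local variables need no protection, since each thread has a private copy on its own stack.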
3. Explain the major benefits and challenges of designing multithreaded processes
Benefits:

● Responsiveness: Multithreading an interactive application may allow a program to continue running even if part of it is blocked or performing a lengthy operation, thereby increasing responsiveness to the user.

● Resource sharing: Threads share the memory and the resources of the process to which they belong by default. The benefit of sharing code and data is that it allows an application to have several different threads of activity within the same address space.

● Economy: Allocating memory and resources for process creation is costly. Because threads share the resources of the process to which they belong, it is more economical to create and context-switch threads.

● Scalability: The benefits of multithreading can be even greater in a multiprocessor architecture, where threads may be running in parallel on different processing cores.
Challenges:
1. Identifying tasks: This involves examining applications to find areas that can be divided into separate, concurrent tasks. Ideally, tasks are independent of one another and thus can run in parallel on individual cores.

2. Balance: While identifying tasks that can run in parallel, programmers must also ensure that the tasks perform equal work of equal value. In some instances, a certain task may not contribute as much value to the overall process as other tasks, and using a separate execution core to run that task may not be worth the cost.

3. Data splitting: Just as applications are divided into separate tasks, the data accessed and manipulated by the tasks must be divided to run on separate cores. On a multiprocessor, multiple threads can work on a problem in parallel, truly simultaneously, whereas a single-threaded process can only run on one CPU at a time, no matter how many CPUs are available in the computer.

4. Data dependency: The data accessed by the tasks must be examined for dependencies between two or more tasks. When one task depends on data from another, programmers must ensure that the execution of the tasks is synchronized to accommodate the data dependency.

5. Testing and debugging: When a program is running in parallel on multiple cores, many different execution paths are possible. Testing and debugging such concurrent programs is inherently more difficult than testing and debugging single-threaded applications.
4. Illustrate different approaches to implicit threading, including thread pools, fork-join, and Grand Central Dispatch
● Thread pools

The main thread of a busy Internet server does not have time to 'personally'
perform the service for each client, because it needs to return immediately to
the job of accepting the incoming request from the next client.

One way to deal with that problem is for the main thread to spawn a child, a
service thread, for each client. The service thread then handles the client
request and terminates.

However, it can take excessive time for the creation and termination of the
service threads, and if too many client requests come in too fast, there may be
too many service threads taxing system resources.


The idea of a thread pool is for the server to create a fixed number of service threads at process startup and to allow them to 'stay alive' as long as the server operates. Service threads that have nothing to do are kept suspended. When a client needs service and a service thread is available, the main thread assigns one to the client. Otherwise, the main thread puts the client in a queue, where it waits for a service thread to become available.


When a service thread finishes with a client, it does not terminate. It just 'goes back to the pool' and
waits to be assigned to another client.


Generally it takes less time for a server process to use an existing thread to service a request than to
create a brand new service thread. Also, no matter how much the server is flooded with client requests,
the number of service threads never exceeds the fixed size of the pool.
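A minimal sketch of this idea in C with pthreads is shown below (illustrative only, not taken from the slides; the constants, queue layout, and function names are assumptions made for the example). A fixed number of worker threads is created at startup, idle workers sleep on a condition variable, and each worker returns to the pool after serving a request instead of terminating.

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

#define NUM_WORKERS 4
#define QUEUE_CAP   16

typedef void (*task_fn)(int);

static task_fn task_queue[QUEUE_CAP];
static int     task_arg[QUEUE_CAP];
static int     head = 0, tail = 0, count = 0;

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  not_empty = PTHREAD_COND_INITIALIZER;

static void *worker(void *unused) {
    (void)unused;
    for (;;) {
        pthread_mutex_lock(&lock);
        while (count == 0)                     /* suspended while there is nothing to do */
            pthread_cond_wait(&not_empty, &lock);
        task_fn fn  = task_queue[head];
        int     arg = task_arg[head];
        head = (head + 1) % QUEUE_CAP;
        count--;
        pthread_mutex_unlock(&lock);
        fn(arg);                               /* serve one request, then go back to the pool */
    }
    return NULL;
}

/* Called by the main (accepting) thread: queue the request and return at once. */
static int submit(task_fn fn, int arg) {
    pthread_mutex_lock(&lock);
    if (count == QUEUE_CAP) {                  /* queue full: caller must retry later */
        pthread_mutex_unlock(&lock);
        return -1;
    }
    task_queue[tail] = fn;
    task_arg[tail]   = arg;
    tail = (tail + 1) % QUEUE_CAP;
    count++;
    pthread_cond_signal(&not_empty);
    pthread_mutex_unlock(&lock);
    return 0;
}

static void handle_client(int client_id) {
    printf("serving client %d\n", client_id);
}

int main(void) {
    pthread_t workers[NUM_WORKERS];
    for (int i = 0; i < NUM_WORKERS; i++)      /* pool created once, at startup */
        pthread_create(&workers[i], NULL, worker, NULL);
    for (int c = 1; c <= 8; c++)
        submit(handle_client, c);
    sleep(1);                                  /* crude for the sketch: give workers time to drain the queue */
    return 0;
}

Note that no thread is ever created or destroyed per request, and the number of service threads never exceeds NUM_WORKERS regardless of how many requests arrive.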
Fork-Join

Synchronous threading occurs when the parent thread creates one or more children and then must wait for all of its children to terminate before it resumes; this is the so-called fork-join strategy. Here, the threads created by the parent perform work concurrently, but the parent cannot continue until this work has been completed.
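The fork-join pattern maps directly onto pthread_create and pthread_join. The sketch below (illustrative names and sizes, not from the slides) forks four worker threads that each sum one slice of an array; the parent joins all of them before combining the partial results.

#include <pthread.h>
#include <stdio.h>

#define N_THREADS 4
#define N_ITEMS   16

static int data[N_ITEMS];

struct slice { int start, end, sum; };

static void *sum_slice(void *arg) {
    struct slice *s = arg;
    s->sum = 0;
    for (int i = s->start; i < s->end; i++)
        s->sum += data[i];
    return NULL;
}

int main(void) {
    pthread_t tid[N_THREADS];
    struct slice part[N_THREADS];

    for (int i = 0; i < N_ITEMS; i++)
        data[i] = i + 1;

    /* fork: spawn the children, which work concurrently */
    for (int t = 0; t < N_THREADS; t++) {
        part[t].start = t * (N_ITEMS / N_THREADS);
        part[t].end   = part[t].start + N_ITEMS / N_THREADS;
        pthread_create(&tid[t], NULL, sum_slice, &part[t]);
    }

    /* join: the parent cannot continue until every child has finished */
    int total = 0;
    for (int t = 0; t < N_THREADS; t++) {
        pthread_join(tid[t], NULL);
        total += part[t].sum;
    }
    printf("total = %d\n", total);   /* 1 + 2 + ... + 16 = 136 */
    return 0;
}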
Grand Central Dispatch (GCD)

GCD is a combination of extensions to the C language, an API, and a run-time library that allows application developers to identify sections of code to run in parallel. Like OpenMP, GCD manages most of the details of threading.

GCD schedules blocks for run-time execution by placing them on a dispatch queue. When it removes
a block from a queue, it assigns the block to an available thread from the thread pool it manages.
GCD identifies two types of dispatch queues: serial and concurrent.


● Blocks placed on a serial queue are removed in FIFO order. Once a block has been removed from the queue, it must complete execution before another block is removed.

● Blocks placed on a concurrent queue are also removed in FIFO order, but several blocks may be removed at a time, thus allowing multiple blocks to execute in parallel.
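A small example using the GCD dispatch API is sketched below (it builds on macOS with clang; the queue label and loop bounds are illustrative, and a dispatch group is used only so the example can wait for all blocks before exiting). It dispatches blocks to a serial queue and to a global concurrent queue.

#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void) {
    /* serial queue: blocks run one at a time, in FIFO order */
    dispatch_queue_t serial =
        dispatch_queue_create("com.example.serial", DISPATCH_QUEUE_SERIAL);

    /* concurrent queue: blocks are dequeued in FIFO order but may run in parallel */
    dispatch_queue_t concurrent =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    dispatch_group_t group = dispatch_group_create();

    for (int i = 0; i < 4; i++) {
        dispatch_group_async(group, serial, ^{
            printf("serial block %d\n", i);       /* these print strictly in order */
        });
        dispatch_group_async(group, concurrent, ^{
            printf("concurrent block %d\n", i);   /* these may interleave */
        });
    }

    /* wait until every dispatched block has completed */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    return 0;
}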
5. Explain how the Windows and Linux operating systems represent threads

In Windows, applications run as separate processes, each of which may have multiple threads. Per-thread resources include an ID number, a register set, a user stack and a kernel stack (used by the OS when executing on behalf of the process, for example when executing a system call for the process), and private storage used by library code. There are three primary data structures for holding the context of a thread: the ETHREAD (executive thread block), the KTHREAD (kernel thread block), and the TEB (thread environment block). The first two reside in kernel memory, and the TEB resides in user space.
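For illustration, a minimal Win32 sketch follows (not from the slides; the function and variable names are made up for the example). CreateThread starts a new thread, for which the kernel maintains the ETHREAD/KTHREAD structures while the thread's TEB lives in user space; the main thread then waits on the thread handle before exiting.

#include <windows.h>
#include <stdio.h>

DWORD WINAPI worker(LPVOID param) {
    int id = *(int *)param;
    printf("worker %d, thread id = %lu\n", id, GetCurrentThreadId());
    return 0;
}

int main(void) {
    int arg = 1;
    HANDLE h = CreateThread(NULL, 0, worker, &arg, 0, NULL);
    if (h == NULL) {
        fprintf(stderr, "CreateThread failed: %lu\n", GetLastError());
        return 1;
    }
    WaitForSingleObject(h, INFINITE);   /* wait for the thread to finish */
    CloseHandle(h);
    return 0;
}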
Linux has a traditional fork() system call that creates an exact duplicate of the parent.

Linux also has a clone() system call whose flag parameters determine which resources will be shared between the parent and the child (the clone). If a large amount of context is shared between parent and clone, then the clone is essentially what we have previously called a new thread inside the parent process. On the other hand, if nothing is shared, then the clone is essentially the child of a traditional fork() operation.
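The sketch below (illustrative and Linux-specific, not from the slides) contrasts the two cases: after fork() the child's write to a global variable is invisible to the parent, while a clone() child created with CLONE_VM | CLONE_FS | CLONE_FILES | CLONE_SIGHAND shares the parent's address space, so its write is visible, and it behaves like a thread inside the parent.

#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

static int shared = 0;

static int clone_child(void *arg) {
    (void)arg;
    shared = 42;                 /* visible to the parent: the address space is shared */
    return 0;
}

int main(void) {
    /* fork(): the child gets its own copy of the address space */
    pid_t pid = fork();
    if (pid == 0) {
        shared = 7;              /* modifies only the child's copy */
        _exit(0);
    }
    wait(NULL);
    printf("after fork, parent's shared = %d\n", shared);    /* still 0 */

    /* clone(): the child shares the parent's address space (thread-like) */
    const size_t STACK_SIZE = 64 * 1024;
    char *stack = malloc(STACK_SIZE);            /* clone child needs its own stack */
    clone(clone_child, stack + STACK_SIZE,       /* stack grows downward */
          CLONE_VM | CLONE_FS | CLONE_FILES | CLONE_SIGHAND | SIGCHLD, NULL);
    wait(NULL);
    printf("after clone, parent's shared = %d\n", shared);   /* now 42 */
    free(stack);
    return 0;
}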
