Unit3 ppt4
Uploaded by Sooraj Adhikari

Concurrency and Synchronization

Concurrency in C#

 Concurrency in C# refers to the execution of multiple tasks or operations at the same time within a single program, using features such as threads and tasks.
 Concurrency can be achieved through techniques such as multithreading, asynchronous programming, and parallel programming.
 In .NET, concurrency is commonly achieved using:
Threads: Multiple threads can run concurrently within a single application.
Task Parallel Library (TPL): Provides higher-level abstractions over threading with
Task and Parallel classes.
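The two approaches named above can be sketched side by side; the class and message strings below are illustrative, not from the slides:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class ConcurrencyDemo
{
    static void Main()
    {
        // Raw thread: we create, start, and join it ourselves.
        var thread = new Thread(() => Console.WriteLine("Hello from a Thread"));
        thread.Start();
        thread.Join();

        // TPL task: the runtime schedules it on a thread-pool thread for us.
        Task task = Task.Run(() => Console.WriteLine("Hello from a Task"));
        task.Wait();
    }
}
```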
Thread Synchronization

 While concurrency allows tasks to run at the same time, synchronization is needed when multiple threads or tasks access shared resources.
 Synchronization ensures that only one thread accesses a critical
section of code at a time, preventing concurrent threads from
interfering with each other.
 The lock keyword is the most commonly used synchronization
mechanism in C#. It ensures that only one thread can access a block
of code at a time.
 The Monitor class is similar to lock, but offers more control with its Monitor.Enter, Monitor.Exit, and Monitor.Wait methods.
Lock in C#

 The lock keyword in C# is used to ensure that a block of code is executed by only one thread at a time, preventing race conditions and ensuring thread safety when accessing shared resources.
 It provides a simple mechanism for mutual exclusion (also known
as a critical section) by locking a given object and preventing other
threads from entering the block of code until the lock is released.
 It acquires the lock on the given object, executes the specified block, and then releases the lock, even if the block throws an exception.
Example of Lock
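The slide's original code is not preserved in this text; the following is a minimal sketch of lock protecting a shared counter (all names are illustrative):

```csharp
using System;
using System.Threading.Tasks;

class LockDemo
{
    // A dedicated, private object to lock on (never lock on `this` or a string).
    private static readonly object _sync = new object();
    private static int _counter = 0;

    static void Main()
    {
        // Ten tasks each increment the shared counter 1000 times.
        var tasks = new Task[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            tasks[i] = Task.Run(() =>
            {
                for (int j = 0; j < 1000; j++)
                {
                    lock (_sync)       // only one thread at a time enters here
                    {
                        _counter++;    // the critical section
                    }
                }
            });
        }
        Task.WaitAll(tasks);
        Console.WriteLine(_counter);   // always 10000 because of the lock
    }
}
```

Without the lock statement, the final count would usually come out below 10000, because `_counter++` is a read-modify-write that two threads can interleave.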
Monitor in Thread Synchronization

 The Monitor class in C# plays a crucial role by managing access to shared resources. It ensures that only one thread can execute a particular block of code at a time.
 Only one thread can hold a lock on an object at any given time. This
ensures that the shared resource is accessed by only one thread,
providing mutual exclusion.
 Threads can signal each other when a resource is available or when
certain conditions are met, using Monitor.Wait(), Monitor.Pulse(), and
Monitor.PulseAll().
 This allows threads to wait for certain conditions to be met before
proceeding, and for other threads to notify waiting threads when
those conditions are satisfied.
 The following is the list of important methods in the Monitor class.
1. Enter(): When we invoke the Enter method of the Monitor class, it acquires an exclusive lock on the
specified object. This also marks the beginning of a critical section or the beginning of a shared
resource.
2. Exit(): When the Monitor class’s Exit method is invoked, it releases the lock on the specified object.
This marks the end of a critical section or the end of the shared resource protected by the locked
object.
3. Pulse(): When the Monitor class's Pulse method is invoked, it sends a signal to a thread in the waiting queue about a change in the locked object's state.
4. Wait(): When the Monitor class’s Wait method is invoked, it releases the lock on an object and
blocks the current thread until it reacquires the lock.
5. PulseAll(): When the Monitor class’s PulseAll method is invoked, it sends signals to all waiting
threads about a change in the locked object’s state.
6. TryEnter(): When we invoke the Monitor class's TryEnter method, it attempts to acquire an exclusive lock on the specified object without blocking indefinitely, returning a Boolean that indicates whether the lock was acquired.
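The methods above can be combined in a classic wait/pulse handshake; this is a minimal producer-consumer sketch (names are illustrative, not from the slides):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class MonitorDemo
{
    private static readonly object _sync = new object();
    private static readonly Queue<int> _queue = new Queue<int>();

    static void Main()
    {
        var consumer = new Thread(() =>
        {
            Monitor.Enter(_sync);          // acquire the lock (what `lock` does)
            try
            {
                while (_queue.Count == 0)
                    Monitor.Wait(_sync);   // release the lock and block until pulsed
                Console.WriteLine("Consumed " + _queue.Dequeue());
            }
            finally
            {
                Monitor.Exit(_sync);       // always release in a finally block
            }
        });
        consumer.Start();

        Thread.Sleep(100);                 // give the consumer time to start waiting
        lock (_sync)
        {
            _queue.Enqueue(42);
            Monitor.Pulse(_sync);          // signal one waiting thread
        }
        consumer.Join();
    }
}
```

Note the `while` (not `if`) around Monitor.Wait: a woken thread must re-check its condition after reacquiring the lock.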
Common Synchronization Problems
1. Race Condition: Occurs when multiple threads access shared data
simultaneously and the outcome depends on the order of thread
execution.
2. Deadlock: When two or more threads wait for each other to release a
resource, causing the program to halt indefinitely.
3. Livelock: Similar to deadlock, but the threads keep changing their
state in response to each other without making progress.
4. Thread Starvation: Occurs when a thread is perpetually denied
access to a resource because other threads are being given priority.
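The first of these problems, a race condition, is easy to reproduce with an unsynchronized counter; this sketch (names illustrative) is the unprotected version of the lock example:

```csharp
using System;
using System.Threading.Tasks;

class RaceConditionDemo
{
    private static int _counter = 0;

    static void Main()
    {
        var tasks = new Task[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            tasks[i] = Task.Run(() =>
            {
                for (int j = 0; j < 100_000; j++)
                    _counter++;   // NOT atomic: read, increment, write back
            });
        }
        Task.WaitAll(tasks);
        // Usually prints less than 1000000, because increments are lost
        // whenever two threads read the same stale value at the same time.
        Console.WriteLine(_counter);
    }
}
```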
Mutex in Thread Synchronization

 Mutex stands for Mutual Exclusion.


 Used to synchronize access to a shared resource among multiple threads.
 Ensures only one thread can access the resource at a time.
 Prevents race conditions in concurrent programming.
Why Use a Mutex?

 Race Condition Prevention: A Mutex prevents multiple threads from accessing shared resources simultaneously.
 Thread Safety: Ensures that only one thread is in the critical section,
preventing inconsistent or corrupted data.
 Cross-Process Synchronization: Unlike lock, a Mutex can synchronize
threads across different processes.
Mutex vs Lock

 Lock is used within a single process.
 Mutex can be used across multiple processes.
 Mutex is heavier and slower than Lock.
 Both provide mutual exclusion, but Mutex is useful for cross-process
synchronization.
Example of Mutex
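The slide's original code is not preserved in this text; the following is a minimal sketch of a Mutex guarding a critical section (names are illustrative):

```csharp
using System;
using System.Threading;

class MutexDemo
{
    // An unnamed Mutex synchronizes threads within one process.
    // Passing a name to the constructor would make it visible across processes.
    private static readonly Mutex _mutex = new Mutex();

    static void Main()
    {
        for (int i = 1; i <= 3; i++)
        {
            int id = i;
            new Thread(() => UseResource(id)).Start();
        }
        // Foreground threads keep the process alive until they finish.
    }

    static void UseResource(int id)
    {
        _mutex.WaitOne();                 // block until the Mutex is free
        try
        {
            Console.WriteLine($"Thread {id} entered the critical section");
            Thread.Sleep(100);            // simulate work on the shared resource
            Console.WriteLine($"Thread {id} leaving the critical section");
        }
        finally
        {
            _mutex.ReleaseMutex();        // must be released by the owning thread
        }
    }
}
```

Because of the Mutex, each thread's "entered"/"leaving" pair prints without another thread's lines in between.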
Mutex Methods in C#

 WaitOne(): Acquires the Mutex; blocks if another thread holds it.
 ReleaseMutex(): Releases the Mutex, allowing other threads to acquire it.
 Dispose(): Releases all resources used by the Mutex object.
ReaderWriterLock in Thread Synchronization
 ReaderWriterLock is a synchronization primitive that allows multiple
threads to read from a shared resource concurrently while ensuring
exclusive access for writing.
 Multiple readers can acquire the lock simultaneously.
 Only one writer can hold the lock, and it excludes both readers and
other writers.
Why Use ReaderWriterLock?

 Concurrency Optimization: It improves performance by allowing multiple readers at the same time.
 Exclusive Write Access: Ensures data consistency by allowing only
one thread to write at any given moment.
 Useful Scenario: When a resource is read often but written rarely
(e.g., a cache system).
How ReaderWriterLock Works

 AcquireReaderLock: Grants shared read access. Multiple readers can acquire the lock concurrently.
 AcquireWriterLock: Grants exclusive write access. Only one writer can
hold the lock at a time, and it blocks both readers and writers.
 ReleaseReaderLock: Releases the lock after reading.
 ReleaseWriterLock: Releases the lock after writing.
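The four methods above can be sketched as a pair of read/write helpers; this is a minimal, illustrative example (the field names are not from the slides):

```csharp
using System;
using System.Threading;

class ReaderWriterDemo
{
    private static readonly ReaderWriterLock _rwLock = new ReaderWriterLock();
    private static int _value = 0;

    static void Write(int newValue)
    {
        // Exclusive: blocks all readers and other writers.
        _rwLock.AcquireWriterLock(Timeout.Infinite);
        try { _value = newValue; }
        finally { _rwLock.ReleaseWriterLock(); }
    }

    static int Read()
    {
        // Shared: many threads may hold a reader lock at once.
        _rwLock.AcquireReaderLock(Timeout.Infinite);
        try { return _value; }
        finally { _rwLock.ReleaseReaderLock(); }
    }

    static void Main()
    {
        Write(7);
        Console.WriteLine(Read());   // prints 7
    }
}
```

Note that newer code generally prefers ReaderWriterLockSlim, which has lower overhead and a similar EnterReadLock/EnterWriteLock API.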
Thread Pooling in C#

 Thread pooling in C# is a technique that manages a collection of reusable threads for performing multiple tasks concurrently without the overhead of creating new threads each time.
 The Thread pool in C# is nothing but a collection of threads that can be reused
to perform a number of tasks in the background.
 The pool of threads is managed by the .NET runtime, which efficiently
schedules and runs tasks on available threads within the pool.
 ThreadPool in C# manages the creation and reuse of threads
internally, without requiring developers to manually create, start, or
destroy threads.
Why Do We Need Thread Pooling?
1. Performance Optimization: Creating and destroying threads is
expensive in terms of resources and time. The thread pool allows you
to reuse existing threads, reducing the need to create new threads.
2. Efficient Resource Usage: Instead of letting many threads occupy
resources when they’re idle, the thread pool only creates as many
threads as necessary and adjusts the number based on demand.
3. Simplified Task Management: The thread pool manages threads
on your behalf. You can easily schedule tasks without worrying about
starting and stopping threads explicitly.
Thread Lifecycle Before Thread Pooling (diagram)
Thread Lifecycle After Thread Pooling (diagram)
How Thread Pooling Works

 Task Queueing: When a task is scheduled (e.g., using ThreadPool.QueueUserWorkItem() or the Task library), the task is placed in a queue.
 The QueueUserWorkItem method in C#’s ThreadPool takes a WaitCallback
delegate as a parameter, which represents the method that will be executed by
a thread from the thread pool. You can pass the method directly to
QueueUserWorkItem, and it will automatically create the WaitCallback delegate
for you.
 Thread Assignment: The thread pool assigns the task to an available thread.
If no threads are available and the workload increases, the pool can create
additional threads up to a specified maximum.
 Automatic Scaling: The thread pool dynamically scales the number of threads
based on the load. The pool can add more threads when there’s high demand
and reduce the count during lower usage to optimize system resources.
Example of Thread Pooling
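The slide's original code is not preserved in this text; the following is a minimal sketch of queueing work items onto the thread pool (names are illustrative):

```csharp
using System;
using System.Threading;

class ThreadPoolDemo
{
    static void Main()
    {
        // Pool threads are background threads, so we must wait for them explicitly.
        using var done = new CountdownEvent(3);

        for (int i = 1; i <= 3; i++)
        {
            // QueueUserWorkItem takes a WaitCallback; the lambda is converted for us,
            // and the second argument is passed to it as `state`.
            ThreadPool.QueueUserWorkItem(state =>
            {
                Console.WriteLine($"Task {state} running on pool thread " +
                                  Environment.CurrentManagedThreadId);
                done.Signal();
            }, i);
        }

        done.Wait();   // block until all three work items have signalled
    }
}
```

Notice that no thread is ever created, started, or destroyed by hand: the pool assigns each queued item to an available worker, as described above.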
