Concurrent programming
Concurrency
• Concurrency is the ability to run several programs, or several parts of a
program, in parallel. If a time-consuming task can be performed
asynchronously or in parallel, this improves the performance of the
program by increasing its throughput and interactivity.
• Historically, executing only a single program at a time was an inefficient
use of expensive and scarce computer resources.
• In programming terms, concurrent programming is a technique in
which two or more processes start, run in an interleaved
fashion through context switching, and complete in an overlapping
time period by managing access to shared resources, e.g. on a single
CPU core.
What is a Thread?
Thread
• A thread is a facility that allows multiple activities within a single process.
• A thread is a series of executed statements.
• A thread is a nested sequence of method calls.
• A thread is often referred to as a lightweight process.
• Each thread has its own program counter, stack, and local variables (see the sketch below).
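As a minimal sketch of these ideas in Java (the class name ThreadDemo is illustrative), one extra activity can be started within the same process by creating and starting a Thread; each thread then executes its own series of statements with its own stack and local variables:

```java
public class ThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // The worker thread runs its own sequence of statements, with its own
        // program counter, stack, and local variables, inside the same process.
        Thread worker = new Thread(() -> {
            for (int i = 0; i < 3; i++) {
                System.out.println("worker: step " + i);
            }
        });

        worker.start();   // start a second activity within this process
        worker.join();    // wait for the worker thread to complete

        System.out.println("main: done");
    }
}
```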
Why do we use Threads?
• Threads help to perform background or asynchronous processing (as in the sketch below).
• They increase the responsiveness of GUI applications.
• They take advantage of multiprocessor systems.
• They simplify program logic when there are multiple independent entities.
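As an illustrative sketch of background processing (using java.util.concurrent; the task and names are hypothetical), a slow job can be submitted to a worker thread so the main thread stays free, e.g. to keep a GUI responsive:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundTaskDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();

        // Hand a time-consuming task off to a background thread.
        Future<String> result = pool.submit(() -> {
            Thread.sleep(1000);        // simulate slow work (e.g. I/O or a report)
            return "report generated";
        });

        // Meanwhile the main thread stays responsive and can do other work.
        System.out.println("main: still responsive while the task runs");

        // Block only when the result is actually needed.
        System.out.println("background result: " + result.get());
        pool.shutdown();
    }
}
```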
Benefits of Threads
• Utilizing multiple processors: programs with multiple active threads can
execute simultaneously on multiple processors. When properly designed,
multi-threaded programs can improve throughput by using available
processor resources more effectively.
• Simplicity of modelling: a program that processes one type of task
sequentially is simpler to write, less error-prone, and easier to test
than one managing multiple different types of tasks at once. Assigning
each type of task to its own thread lets every thread keep this simple
sequential structure.
Concurrent and Parallel Programming: Key
Differences
• Concurrency: Concurrent programming focuses on managing task
dependencies and communication between tasks, regardless of
whether the tasks are executed simultaneously or not. It is primarily
concerned with the correct and efficient coordination of multiple
tasks. Concurrency aims to give the illusion of tasks running in
parallel, even on a single processing unit, by rapidly switching
between them. This is achieved through interleaving, ensuring a
smooth and responsive execution of programs.
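A hedged illustration of this interleaving in Java (the thread names A and B are arbitrary): two threads sharing even one core still appear to make progress at the same time because the scheduler context-switches between them:

```java
public class InterleavingDemo {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 5; i++) {
                // Output from A and B is typically interleaved, because the
                // scheduler rapidly switches between the two threads.
                System.out.println(Thread.currentThread().getName() + ": " + i);
            }
        };

        Thread a = new Thread(task, "A");
        Thread b = new Thread(task, "B");
        a.start();
        b.start();
        a.join();
        b.join();
    }
}
```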
• Parallelism: Parallel programming, on the other hand, focuses on
actual parallel execution of tasks on multiple processing units at the
same time. It is chiefly concerned with distributing tasks across these
units for faster completion. Parallelism requires hardware support in
the form of multiple processing units, such as multi-core CPUs or
GPUs.
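As one possible sketch of parallel execution (using Java's parallel streams, which distribute work over the common fork/join pool; the workload is illustrative), the same computation can be split across the available cores:

```java
import java.util.stream.LongStream;

public class ParallelSumDemo {
    public static void main(String[] args) {
        // Sequential: one thread walks the entire range.
        long seq = LongStream.rangeClosed(1, 50_000_000L).sum();

        // Parallel: the range is partitioned, summed on multiple cores,
        // and the partial results are combined.
        long par = LongStream.rangeClosed(1, 50_000_000L).parallel().sum();

        System.out.println("sequential = " + seq + ", parallel = " + par);
        System.out.println("cores available: " + Runtime.getRuntime().availableProcessors());
    }
}
```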
Resource sharing
• In concurrent programming, tasks (threads or processes) often share
common resources like memory or I/O devices. This necessitates
careful synchronisation and coordination to prevent race conditions
and data inconsistency. Parallel programming, by contrast, often
uses either private resources allocated to each processing unit or
explicitly shared resources through a well-defined communication
mechanism.
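A hedged sketch of why shared resources need coordination (the counter fields are illustrative): two threads incrementing an unsynchronised counter can lose updates, while a counter guarded by a lock does not:

```java
public class SharedCounterDemo {
    private static int unsafeCount = 0;              // shared, unprotected
    private static int safeCount = 0;                // shared, guarded by a lock
    private static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                unsafeCount++;            // read-modify-write race condition
                synchronized (lock) {
                    safeCount++;          // mutual exclusion: no lost updates
                }
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("unsafe counter (often less than 200000): " + unsafeCount);
        System.out.println("safe counter (always 200000): " + safeCount);
    }
}
```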
Synchronisation techniques
• Concurrent programs employ various synchronisation techniques,
such as locks, semaphores, and monitors, to manage access to shared
resources. These techniques ensure that tasks coordinate and
communicate properly to avoid issues like deadlocks. Parallel
programs, while also using synchronisation techniques, tend to rely
more on partitioning tasks across the processing units in a way that
minimizes the need for synchronisation, or by using data
structures that are specifically designed for parallel execution.
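As an illustrative sketch of two of these techniques using java.util.concurrent (the permit count and field names are arbitrary), a ReentrantLock provides mutual exclusion on a shared counter while a Semaphore bounds how many threads may enter a section at once:

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.locks.ReentrantLock;

public class SyncToolsDemo {
    private static final ReentrantLock lock = new ReentrantLock();
    private static final Semaphore permits = new Semaphore(2); // at most 2 threads inside
    private static int shared = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            try {
                permits.acquire();            // semaphore: limit concurrent entries
                try {
                    lock.lock();              // lock: mutual exclusion on the counter
                    try {
                        shared++;
                    } finally {
                        lock.unlock();        // always release so others can proceed
                    }
                } finally {
                    permits.release();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };

        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(work);
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        System.out.println("shared = " + shared); // always 4
    }
}
```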