Concurrency vs. Parallelism

Concurrency and parallelism are two distinct terms with distinct meanings that are often confused or used interchangeably. Both refer to the execution of multiple tasks and to how the computer handles multi-tasking.

Parallelism is true multi-tasking, where two or more tasks run at the same time, in parallel. This is usually easy to grasp: it’s like patting your head and rubbing your stomach at the same time. Computers with multiple physical CPU cores can achieve parallelism, as each core can work on a separate task simultaneously. Parallelism is often confused with concurrency, and people frequently say concurrency when they actually mean this kind of true multi-tasking.
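To make this concrete, here is a minimal sketch of parallelism in Python using the standard library’s `concurrent.futures.ProcessPoolExecutor`. The `count_primes` function and the limits chosen are just illustrative; the point is that each worker process can be scheduled onto its own physical core, so the two counts are computed at the same time rather than one after the other.

```python
import concurrent.futures
import math

def count_primes(limit):
    """Count primes below limit with a naive, CPU-bound trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, math.isqrt(n) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Each worker is a separate OS process, so the operating system can
    # place them on different physical cores: true parallelism.
    with concurrent.futures.ProcessPoolExecutor(max_workers=2) as pool:
        results = list(pool.map(count_primes, [10_000, 10_000]))
    print(results)  # [1229, 1229]
```

Processes, rather than threads, are used here because in CPython the global interpreter lock prevents two threads from executing Python bytecode simultaneously; separate processes sidestep that.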

Concurrency is not true multi-tasking, but it is still multi-tasking. Concurrency is the process of doing multiple things by switching between them, known as context switching. An example would be writing this article, stopping to check my email, and then going back to writing the article. This process of splitting up tasks is called time slicing: two or more tasks are divided into segments, and one segment is executed at a time. Switching between the tasks is the context switch. Logical cores (hardware threads) within the CPU can run concurrent tasks.
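The article-and-email example above can be sketched with Python’s `asyncio`, which runs tasks concurrently on a single thread. The task names and step counts are just illustrative; each `await` is the point where one task yields its time slice and the event loop context switches to the other.

```python
import asyncio

log = []

async def task(name, steps):
    for i in range(steps):
        log.append(f"{name} segment {i}")
        # Yield control back to the event loop: a context switch,
        # letting the other task run its next segment.
        await asyncio.sleep(0)

async def main():
    # One thread, two tasks: their segments interleave (time slicing)
    # rather than running at the same instant.
    await asyncio.gather(task("article", 2), task("email", 2))

asyncio.run(main())
print(log)
```

With the standard event loop the segments alternate: article, email, article, email. Nothing ever runs simultaneously; it only looks that way because the switches are fast.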

Your computer can be doing parallelism, concurrency or both without you realising. CPUs context switch and process tasks so quickly that even concurrent tasks appear to execute simultaneously. A single CPU core can only do one thing at any instant, but it can handle multiple tasks concurrently by time slicing between them. The majority of brand new computers, phones, tablets, etc., have multiple cores with fast clock speeds, and can therefore do so much at once that you won’t be able to tell which bits are being time-sliced and which are being done truly simultaneously.
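You can ask Python how many logical cores your machine exposes. Note that `os.cpu_count()` reports logical cores (hardware threads), which on CPUs with simultaneous multithreading such as Hyper-Threading may be double the physical core count:

```python
import os

# os.cpu_count() returns the number of logical cores visible to the OS,
# i.e. the number of tasks the CPU can genuinely run in parallel.
logical = os.cpu_count()
print(f"logical cores available: {logical}")
```

Anything beyond that number of simultaneously runnable tasks is handled by time slicing.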

However, when creating or designing programs and software, it can be useful to understand the difference so that you can apply each appropriately in your code. For example, running every task concurrently may not be efficient, while running everything in parallel could consume a lot of memory and CPU power. Striking the balance, and deciding when and when not to use multiple threads within a program, can be key to the design and execution of your code.
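One common rule of thumb for striking that balance: cap the workers for CPU-bound work near the core count, but allow more threads for I/O-bound work, since those threads spend most of their time waiting rather than computing. A minimal sketch, with the sleep standing in for real I/O such as a network request:

```python
import concurrent.futures
import time

def io_task(n):
    # Simulated I/O wait: while this thread sleeps, the others run.
    time.sleep(0.05)
    return n * 2

# Eight threads for eight I/O-bound tasks is fine even on a machine with
# fewer cores, because the threads overlap their waiting. For CPU-bound
# work you would instead cap max_workers near os.cpu_count() to avoid
# needless context switching and memory overhead.
start = time.perf_counter()
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(io_task, range(8)))
elapsed = time.perf_counter() - start

print(results)  # [0, 2, 4, 6, 8, 10, 12, 14]
print(f"took {elapsed:.2f}s")  # roughly 0.05s, not 8 * 0.05s
```

Run sequentially, the eight sleeps would take about 0.4 seconds; overlapped in the pool they take roughly 0.05 seconds, which is the payoff of choosing the right kind of multi-tasking for the workload.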
