Concurrency and Parallelism
While concurrency and parallelism may sound similar (and in fact are sometimes used interchangeably, both in everyday speech and in computing contexts), the two terms are not the same. In this post, I will explain where each term comes from, how they differ, and why they are so often used incorrectly.
By definition, a computer is a machine which computes, so it's no surprise that the CPU, which stands for central processing unit (and which we normally refer to as a computer chip), is the most important element of the machine. The part which handles computational tasks inside a CPU is called the ALU (Arithmetic/Logic Unit), which essentially computes everything in binary logic by switching some of its gates on and others off. This is a simplified explanation of how a CPU works, but the part we're interested in here is that at any particular moment it can only handle one computation, so it computes and transfers data back and forth to memory (either the chip's cache or the RAM) continuously. That means that by design, CPU computation is sequential, and there's no way we can expect to run two computations on the same CPU at the same time.
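To make this concrete, here is a toy sketch (in Python, with names of my own invention) of a single execution unit stepping through binary operations. At any instant, exactly one computation is in flight; the instructions are necessarily processed one after another.

```python
# Toy model of a single ALU: it can execute exactly one
# binary operation per step, so a program is necessarily
# processed sequentially, one instruction at a time.
def alu_step(op, a, b):
    """Perform a single binary computation."""
    if op == "ADD":
        return a + b
    if op == "AND":
        return a & b
    if op == "OR":
        return a | b
    raise ValueError(f"unknown op: {op}")

def run(program):
    """Fetch each instruction, compute it, and store the result
    back to 'memory' -- strictly in order, one at a time."""
    memory = []
    for op, a, b in program:  # sequential by construction
        memory.append(alu_step(op, a, b))
    return memory

results = run([("ADD", 0b0101, 0b0011),
               ("AND", 0b0101, 0b0011),
               ("OR",  0b0101, 0b0011)])
print(results)  # [8, 1, 7]
```

This is of course a caricature of real hardware, but it captures the point above: a single computational unit gives you one result per step, no matter how you arrange the program.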