Concurrency versus parallelism
Concurrency and parallelism are closely related concepts, and different authors define them in different ways. The most widely accepted definition speaks of concurrency when you have more than one task on a single processor with a single core and the operating system's task scheduler switches quickly from one task to another, so that all the tasks appear to run simultaneously. The same definition speaks of parallelism when the tasks actually run at the same time, on different computers, different processors, or different cores of the same processor.
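The distinction can be seen with plain Java threads. The following minimal sketch (the class name TwoTasks and the printed messages are ours, not from the text) submits two tasks to a thread pool; whether they are interleaved on one core or executed at the same time on different cores is decided by the operating system's scheduler and the available hardware.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class TwoTasks {

    public static void main(String[] args) throws InterruptedException {
        // On a single-core machine the two tasks below can only be interleaved
        // by the scheduler (concurrency); on a multicore machine they may truly
        // run at the same time (parallelism).
        System.out.println("Available cores: " + Runtime.getRuntime().availableProcessors());

        ExecutorService executor = Executors.newFixedThreadPool(2);
        executor.submit(() -> System.out.println("Task 1 running on " + Thread.currentThread().getName()));
        executor.submit(() -> System.out.println("Task 2 running on " + Thread.currentThread().getName()));

        executor.shutdown();
        executor.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```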
Another definition speaks of concurrency when you have more than one task (different tasks) running simultaneously on your system, and of parallelism when you have different instances of the same task running simultaneously over different parts of a dataset.
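Parallelism in this second sense, the same operation applied to different parts of a dataset, is the idea behind Java's parallel streams. Here is a minimal sketch (the dataset is an arbitrary array of our choosing) that splits a sum across the data.

```java
import java.util.stream.IntStream;

public class DataParallelSum {

    public static void main(String[] args) {
        // One task (summing) applied to different parts of the same dataset.
        int[] data = IntStream.rangeClosed(1, 1_000_000).toArray();

        // The parallel stream partitions the array, sums the chunks on the
        // common fork/join pool, and combines the partial results.
        long sum = IntStream.of(data).parallel().asLongStream().sum();

        System.out.println("Sum: " + sum);
    }
}
```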
The last definition that we include speaks of parallelism when you have more than one task running simultaneously on your system, and uses concurrency to refer to the techniques and mechanisms programmers use to synchronize the tasks and their access to shared resources.
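Under this last definition, concurrency is about coordinating tasks that touch the same data. A minimal sketch follows, assuming a shared counter as the contended resource (the class and field names are ours); a `synchronized` block is one of the mechanisms the definition refers to, and it prevents the two threads' increments from being lost.

```java
public class SharedCounter {

    private int counter = 0;

    // synchronized serializes access to the shared counter so that
    // concurrent increments are not lost.
    public synchronized void increment() {
        counter++;
    }

    public synchronized int get() {
        return counter;
    }

    public static void main(String[] args) throws InterruptedException {
        SharedCounter shared = new SharedCounter();

        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                shared.increment();
            }
        };

        Thread first = new Thread(task);
        Thread second = new Thread(task);
        first.start();
        second.start();
        first.join();
        second.join();

        // Without synchronization the printed value would often be less than 200000.
        System.out.println("Counter: " + shared.get());
    }
}
```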
As you can see, the two concepts are very similar, and this similarity has only increased with the development of multicore processors.