Multiple asynchronous processes or threads may be used for several purposes. On a multi-processor, they split the computation so that more than one processor may work in parallel. Even on a single processor, time sharing ensures that when a task takes a long time to complete, other tasks may still proceed. Finally, the program structure may benefit from such a decomposition. Different processes may work in a pipeline, each performing a small part of the work and sending its results to the next stage; a sketch of such a pipeline is given below. When asynchronous events from several sources are expected, a separate process may be dedicated to reading events from each source, instead of a single event loop that repeatedly polls every possible source and dispatches the events to different routines.
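As a minimal sketch of the pipeline decomposition, the following Java program (the class name, the stage bodies, and the use of an ArrayBlockingQueue are illustrative assumptions, not taken from the original text) runs two stages in separate threads, the first sending its results to the second:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical two-stage pipeline: one thread produces partial results,
// a second thread receives them and performs the next part of the work.
public class PipelineSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(16);

        Thread stage1 = new Thread(() -> {
            try {
                for (int i = 0; i < 10; i++) {
                    queue.put(i * i);   // first stage: compute and pass the result on
                }
                queue.put(-1);          // sentinel marking the end of the stream
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread stage2 = new Thread(() -> {
            try {
                for (int item = queue.take(); item != -1; item = queue.take()) {
                    System.out.println("stage 2 received " + item);  // second stage
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        stage1.start();
        stage2.start();
        stage1.join();
        stage2.join();
    }
}
```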
Such asynchronous processes may reside on multiple networked computers and communicate through messages. However, sending messages between processes, especially over the network, incurs a relatively large overhead. Multiple processes or threads on a single computer can communicate through shared memory. The message passing overhead is avoided, but proper synchronization through locking must be ensured. Indeed, each data structure accessible from more than one process must have an associated lock. Before accessing such a structure, a process must acquire the associated lock and release it only after the access, once the data structure is back in a consistent state.
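The locking discipline can be sketched as follows in Java (the class name and the choice of a ReentrantLock-guarded deque are assumptions made for illustration): the shared structure has its own lock, which is acquired before every access and released only once the structure is consistent again.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical shared structure: a queue guarded by its own lock.
public class SharedQueue {
    private final Deque<String> items = new ArrayDeque<>();
    private final ReentrantLock lock = new ReentrantLock();

    public void add(String item) {
        lock.lock();                  // acquire before touching the shared data
        try {
            items.addLast(item);      // mutation happens while holding the lock
        } finally {
            lock.unlock();            // release only once the structure is consistent
        }
    }

    public String removeFirstOrNull() {
        lock.lock();
        try {
            return items.pollFirst();
        } finally {
            lock.unlock();
        }
    }
}
```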
Processes may share a portion or all of their address space, with the exception of the execution stack, and may share other resources such as file pointers. The more that is shared, the lower the overhead of creating a new process. In particular, lightweight processes that share everything except the execution stack are often called threads within a process. The process is then the address space and associated resources, such as file pointers, while each thread is a separate execution stack.
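The following small Java example (the class name and the counter workload are illustrative assumptions) shows this division: both threads see the same heap object because they share the address space, while the loop variable lives on each thread's own execution stack.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical illustration: the AtomicInteger is shared by both threads,
// whereas the local variable `i` is private to each thread's stack.
public class SharedHeapSeparateStacks {
    public static void main(String[] args) throws InterruptedException {
        AtomicInteger shared = new AtomicInteger(0);   // visible to every thread

        Runnable work = () -> {
            for (int i = 0; i < 1000; i++) {           // `i` lives on the thread's own stack
                shared.incrementAndGet();              // the heap object is shared
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("shared counter = " + shared.get());  // prints 2000
    }
}
```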
It has been claimed that threads belong to the operating system and should not be a programming language issue. In practice, however, relatively few operating systems offer threads, and those that do generally have different application programming interfaces. Furthermore, most available libraries are not thread safe, in the sense that data structures that may be accessed by multiple threads are not protected by locks.
Therefore, there are significant advantages to having threads defined in the standard libraries associated with a language. Portability is gained, and it encourages library developers to ensure that their code is thread safe. Few languages, apart from Modula-3 and Ada, support threads in this manner.