[1] To quote some graffiti seen on a Cambridge building wall: Time is a device that was invented to keep everything from happening at once.
[2] An even worse failure for this system could occur if the two assignment statements attempt to change the balance simultaneously, in which case the actual data appearing in memory might end up being a random combination of the information being written by the two threads. Most computers have interlocks on the primitive memory-write operations, which protect against such simultaneous access. Even this seemingly simple kind of protection, however, raises implementation challenges in the design of multiprocessing computers, where elaborate cache-coherence protocols are required to ensure that the various processors will maintain a consistent view of memory contents, despite the fact that data may be replicated (cached) among the different processors to increase the speed of memory access.
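The lost-update hazard described above can be made concrete by enumerating the interleavings of two unsynchronized read-modify-write sequences on a shared balance. The following sketch is illustrative only (the function and variable names are not from the text); each "thread" reads the balance and then writes back the balance it read plus its own delta, and the set of final balances shows the nondeterminism:

```javascript
// Illustrative sketch: enumerate the interleavings of two
// read-modify-write sequences on a shared balance.
// Thread 1 performs steps r1 (read) then w1 (write);
// thread 2 performs r2 then w2. Each read must precede its write.
function outcomes(initial, delta1, delta2) {
    const orders = [
        ["r1", "w1", "r2", "w2"],
        ["r1", "r2", "w1", "w2"],
        ["r1", "r2", "w2", "w1"],
        ["r2", "r1", "w1", "w2"],
        ["r2", "r1", "w2", "w1"],
        ["r2", "w2", "r1", "w1"]
    ];
    const results = new Set();
    for (const order of orders) {
        let balance = initial;
        let v1 = 0;   // value read by thread 1
        let v2 = 0;   // value read by thread 2
        for (const step of order) {
            if (step === "r1") { v1 = balance; }
            else if (step === "w1") { balance = v1 + delta1; }
            else if (step === "r2") { v2 = balance; }
            else { balance = v2 + delta2; }
        }
        results.add(balance);
    }
    return results;
}
```

With an initial balance of \$100, Peter withdrawing \$10, and Paul withdrawing \$25, the possible final balances are \$65 (both withdrawals take effect), and \$75 or \$90 (one write overwrites the other, losing an update).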
[3] As of 2019, browsers differ in their support for SharedArrayBuffer objects.
[4] The factorial program in section 3.1.3 illustrates this for a single sequential thread.
[5] The columns show the contents of Peter's wallet, the joint account (in Bank1), Paul's wallet, and Paul's private account (in Bank2), before and after each withdrawal (W) and deposit (D). Peter withdraws \$10 from Bank1; Paul deposits \$5 in Bank2, then withdraws \$25 from Bank1.
[6] A more formal way to express this idea is to say that concurrent programs are inherently nondeterministic. That is, they are described not by single-valued functions, but by functions whose results are sets of possible values.
3.4.1 The Nature of Time in Concurrent Systems