Computer Sciences Department, University of Wisconsin-Madison

Weak Ordering - A New Definition and Some Implications

Sarita V. Adve and Mark D. Hill
1989

A model for correct program behavior commonly, and often implicitly, assumed by programmers is that of sequential consistency, which guarantees that all memory accesses appear to execute atomically and in program order. An alternative programmer's model, weak ordering, offers greater performance potential, especially for highly parallel shared-memory systems. Weak ordering was first defined by Dubois, Scheurich and Briggs in terms of a set of constraints on hardware that are not necessary for its intuitive goal. We define a system to be weakly ordered with respect to a set of software constraints if it appears sequentially consistent to software obeying those constraints. We argue that this definition more directly reflects the intuition behind weak ordering, facilitates a formal statement of the programmer's view of the system, and does not impose unnecessary constraints on hardware. For software that does not allow data races, we show that the new definition admits implementations that violate the old definition and have a higher performance potential. We give an example of one such implementation for cache-based systems with general interconnects to illustrate the power of the new definition. The insight gained from implementing weak ordering can also be applied to sequential consistency. We give an algorithm and an implementation for cache-based systems with general interconnects that have a higher performance potential than others we are aware of. We also investigate a markedly new approach, allowed by our definition, to implementing weak ordering, and possibly sequential consistency.
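The contract sketched above, in which software promises to be free of data races and the system in return promises to appear sequentially consistent, is the idea that later language memory models (such as those of Java and C/C++) adopt as the "DRF-SC" guarantee. As a minimal illustration, not taken from the report, the classic two-thread litmus test below separates the two models; the file name, variable names, and the use of C11 seq_cst atomics are our own assumptions for this sketch.

/* litmus.c -- illustrative sketch only; compile with: cc -std=c11 -pthread litmus.c */
#include <pthread.h>
#include <stdatomic.h>
#include <stdio.h>

atomic_int x, y;   /* shared flags, zero-initialized at static storage */
int r1, r2;        /* each thread's observation of the other thread's flag */

/* Each thread sets its own flag, then reads the other thread's flag.
   In any sequentially consistent interleaving at least one store precedes
   both loads, so the outcome r1 == 0 && r2 == 0 is impossible.  On weakly
   ordered hardware, plain stores may be reordered past the following loads
   and yield (0, 0); tagging the accesses as synchronization (C11 atomics
   default to seq_cst) forbids that and keeps the program data-race-free. */
void *thread1(void *arg) {
    (void)arg;
    atomic_store(&x, 1);
    r1 = atomic_load(&y);
    return NULL;
}

void *thread2(void *arg) {
    (void)arg;
    atomic_store(&y, 1);
    r2 = atomic_load(&x);
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, thread1, NULL);
    pthread_create(&b, NULL, thread2, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("r1=%d r2=%d\n", r1, r2);   /* with seq_cst accesses, never prints r1=0 r2=0 */
    return 0;
}

Because the only shared accesses are tagged as synchronization, the program obeys the software constraints, and a weakly ordered implementation remains free to buffer, pipeline, and reorder everything else underneath while still appearing sequentially consistent; that latitude is the performance potential the abstract refers to.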


