Perhaps the most influential result in the theory of caches is the following theorem due to Sleator and Tarjan: with O(1) resource augmentation, the basic LRU eviction policy is guaranteed to be O(1)-competitive with the optimal offline policy. Sleator and Tarjan’s result applies to LRU on a fully associative cache, but it does not tell us how to think about caches with low associativity, i.e., caches in which each page has only d positions where it can reside. This means that many modern caches cannot directly apply the result. It is widely believed that, to implement a cache with low associativity, one should still use LRU, but restricted to the d positions that are eligible for eviction. However, this low-associativity version of LRU has never been analyzed. We show that low-associativity implementations of LRU often fail to be constant-competitive. On the other hand, we give randomized eviction algorithms that are constant-competitive, and even a d-associative algorithm that, for any d = ω(1) and with 1 + o(1) resource augmentation, is (1 + o(1))-competitive with the fully-associative LRU algorithm. Together, our algorithms suggest a new way of thinking about the design of low-associativity caches, in which one intentionally designs randomized mechanisms that allow parts of the cache which are “overheating” to naturally cool down.
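To make the low-associativity setting concrete, the following is a minimal Python sketch (all names are hypothetical, not from the paper) of the folklore policy described above: a set-associative cache in which each page maps to a single set of d positions, and LRU is run within that set only.

```python
from collections import OrderedDict

class SetAssociativeLRU:
    """Sketch of a d-associative cache using per-set LRU eviction.

    A page may reside only in the set it hashes to; on a miss with
    that set full, the least-recently-used page *within that set*
    is evicted, regardless of recency elsewhere in the cache.
    """

    def __init__(self, num_sets, d):
        self.d = d
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def access(self, page):
        """Return True on a hit, False on a miss (after inserting page)."""
        s = self.sets[hash(page) % len(self.sets)]
        if page in s:
            s.move_to_end(page)        # refresh recency within the set
            return True
        if len(s) >= self.d:
            s.popitem(last=False)      # evict this set's LRU page only
        s[page] = None
        return False
```

The point of the paper's lower bound is that an adversary can concentrate requests on pages that all hash to one set, "overheating" it while the rest of the cache sits idle; per-set LRU has no mechanism for shedding that load.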