# LRU Cache Visualization & Animation

## What is it?

An LRU (Least Recently Used) Cache evicts the least recently used item when it reaches capacity. It supports O(1) `get` and `put` operations, implemented with a HashMap + Doubly Linked List.

## How it works

- **HashMap** maps each key to its corresponding node in the doubly linked list → O(1) lookup
- **Doubly Linked List** maintains access order; most recently used at head, least recently used at tail

**get(key):**

- If not in map → return -1
- Move the node to the head of the list (mark as recently used)
- Return its value

**put(key, value):**

- If key exists → update value, move to head
- If key is new:
  - If at capacity → remove the tail node (LRU item) and its map entry
  - Create new node, add at head, add to map

## When to use

- Caching systems (CPU cache, web cache, database buffer pool)
- Any "evict the least recently used" policy
- LeetCode 146, the canonical cache design problem

## Key Points

- HashMap for O(1) lookup; doubly linked list for O(1) insertions/deletions at known positions
- Dummy head and tail nodes simplify edge cases
- Combined, both structures give O(1) for all operations
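The get/put steps above can be sketched in Python. This is a minimal illustration of the HashMap + doubly-linked-list design with dummy head/tail sentinels; names like `_remove` and `_add_to_head` are illustrative, not from the original.

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key=0, value=0):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}            # key -> Node, O(1) lookup
        self.head = Node()       # dummy head sentinel (MRU side)
        self.tail = Node()       # dummy tail sentinel (LRU side)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node: Node) -> None:
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_to_head(self, node: Node) -> None:
        """Insert a node right after the dummy head (mark as MRU)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._add_to_head(node)  # mark as recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:      # existing key: update value, move to head
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_to_head(node)
            return
        if len(self.map) == self.capacity:
            lru = self.tail.prev  # real LRU node sits just before the tail sentinel
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_to_head(node)
```

For example, with `LRUCache(3)`: after `put(1,1)`, `put(2,2)`, `put(3,3)`, a `get(1)` moves key 1 to the head, so a subsequent `put(4,4)` evicts key 2 and `get(2)` returns -1.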

Category: algorithms

Difficulty: beginner

Time Complexity: O(1) get/put

Space Complexity: O(capacity)


Diagram: capacity = 3. The MRU node sits next to HEAD (a sentinel); the LRU node sits next to TAIL (a sentinel). When the cache is full, the node adjacent to TAIL is evicted. HashMap gives O(1) lookup; the DLL gives O(1) move-to-front and eviction.
