PYTHON
Implementing an LRU Cache with `collections.OrderedDict`
Build an efficient Least Recently Used (LRU) cache in Python using `collections.OrderedDict` to manage a fixed-size store of key-value pairs, useful for web application performance optimization.
from collections import OrderedDict
from typing import Optional

class LRUCache:
    def __init__(self, capacity: int):
        self.cache = OrderedDict()
        self.capacity = capacity

    def get(self, key: str) -> Optional[str]:
        if key not in self.cache:
            return None
        # Move the accessed key to the end to mark it as recently used
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: str, value: str) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Remove the first item (the least recently used)
            self.cache.popitem(last=False)
# Example Usage:
lru_cache = LRUCache(capacity=3)
print(f"Cache state: {list(lru_cache.cache.items())}")
lru_cache.put('item1', 'data1')
lru_cache.put('item2', 'data2')
lru_cache.put('item3', 'data3')
print(f"Cache state after 3 puts: {list(lru_cache.cache.items())}")
print(f"Get item2: {lru_cache.get('item2')}") # Access item2
print(f"Cache state after get item2: {list(lru_cache.cache.items())}")
lru_cache.put('item4', 'data4') # Add new item, item1 (LRU) should be removed
print(f"Cache state after put item4: {list(lru_cache.cache.items())}")
print(f"Get item1: {lru_cache.get('item1')}")  # item1 was evicted, so this returns None
How it works: An LRU (Least Recently Used) cache is a caching strategy that discards the least recently used items first once the cache reaches its capacity. This snippet implements an LRU cache using Python's `collections.OrderedDict`, which remembers the order in which keys were inserted. That ordering is the key to the implementation: whenever an item is accessed (`get`) or updated (`put`), it is moved to the end of the `OrderedDict` to mark it as most recently used. If an insertion pushes the cache past its `capacity`, the item at the front (the least recently used) is removed with `popitem(last=False)`.
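To see the two `OrderedDict` operations that power the cache in isolation, here is a minimal sketch (the keys and values are purely illustrative) showing how `move_to_end` reorders entries and how `popitem(last=False)` evicts from the front:

```python
from collections import OrderedDict

# Illustrative entries; any hashable keys behave the same way.
d = OrderedDict([('a', 1), ('b', 2), ('c', 3)])

d.move_to_end('a')               # 'a' becomes the most recently used entry
print(list(d))                   # ['b', 'c', 'a']

evicted = d.popitem(last=False)  # pops from the front: the oldest entry
print(evicted)                   # ('b', 2)
print(list(d))                   # ['c', 'a']
```

Both operations run in O(1), which is what makes `OrderedDict` a natural fit for an LRU cache: every `get` and `put` stays constant-time regardless of cache size.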