An LRU (Least Recently Used) cache is a cache eviction algorithm that discards the least recently used items first when the cache is full. This Java implementation uses a HashMap for fast access and a Doubly Linked List to maintain the order of elements.
Program Structure
- CacheNode Class: Represents each node in the cache, holding keys and values.
- LRUCache Class: Implements the LRU cache with methods to get and put items in the cache.
Java Code
import java.util.HashMap;

class CacheNode {
    int key;
    int value;
    CacheNode prev;
    CacheNode next;

    public CacheNode(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

public class LRUCache {
    private HashMap<Integer, CacheNode> map;
    private int capacity;
    // Dummy sentinels: head.next is the most recently used node,
    // tail.prev is the least recently used node.
    private CacheNode head, tail;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        map = new HashMap<>();
        head = new CacheNode(0, 0);
        tail = new CacheNode(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (map.containsKey(key)) {
            CacheNode node = map.get(key);
            remove(node);  // unlink the node from its current position...
            insert(node);  // ...and relink it at the front as most recently used
            return node.value;
        }
        return -1; // key not present
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) {
            remove(map.get(key)); // drop the old node so the value is replaced
        }
        if (map.size() == capacity) {
            remove(tail.prev); // evict the least recently used node
        }
        insert(new CacheNode(key, value));
    }

    // Unlinks a node from the list and drops it from the map.
    private void remove(CacheNode node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
        map.remove(node.key);
    }

    // Links a node directly after the head and records it in the map.
    private void insert(CacheNode newNode) {
        map.put(newNode.key, newNode);
        CacheNode headNext = head.next;
        head.next = newNode;
        newNode.prev = head;
        newNode.next = headNext;
        headNext.prev = newNode;
    }
}
Explanation of How the Program Works
- Initialization: The cache starts with a dummy head and tail to make edge case handling easier.
- Get Operation: Retrieves the value of the key if the key exists in the cache. It then moves the node to the front of the list to mark it as most recently used.
- Put Operation: Inserts a new key-value pair. If the key exists, it updates the value and moves the node to the front. If the cache is full, it removes the least recently used item before inserting the new one.
- Remove and Insert Helper Methods: Manage the linked list connections and the hashmap entries.
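The get and put flow described above can be traced with a short driver program. The class definitions from the listing are repeated here in condensed form so the snippet compiles on its own; the `LRUCacheDemo` class name is just for this demo.

```java
import java.util.HashMap;

// Condensed copy of the classes from the listing above, so this demo is self-contained.
class CacheNode {
    int key, value;
    CacheNode prev, next;
    CacheNode(int key, int value) { this.key = key; this.value = value; }
}

class LRUCache {
    private final HashMap<Integer, CacheNode> map = new HashMap<>();
    private final int capacity;
    private final CacheNode head = new CacheNode(0, 0), tail = new CacheNode(0, 0);

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (!map.containsKey(key)) return -1;
        CacheNode node = map.get(key);
        remove(node);
        insert(node); // move to the front: most recently used
        return node.value;
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) remove(map.get(key));
        if (map.size() == capacity) remove(tail.prev); // evict the LRU node
        insert(new CacheNode(key, value));
    }

    private void remove(CacheNode n) {
        n.prev.next = n.next;
        n.next.prev = n.prev;
        map.remove(n.key);
    }

    private void insert(CacheNode n) {
        map.put(n.key, n);
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }
}

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 100);
        cache.put(2, 200);
        System.out.println(cache.get(1)); // 100 -- key 1 becomes most recently used
        cache.put(3, 300);                // cache is full, so key 2 (the LRU entry) is evicted
        System.out.println(cache.get(2)); // -1  -- key 2 is gone
        System.out.println(cache.get(3)); // 300
    }
}
```

Note that after `get(1)`, key 1 is the most recently used entry, so the subsequent `put(3, 300)` evicts key 2 rather than key 1.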
Key Components:
- CacheNode Class: Represents each item in the cache as a node with pointers to the next and previous nodes, which helps maintain the order of usage.
- LRUCache Class: Manages the cache size and operations. It uses a HashMap to store nodes for quick access and a doubly linked list to efficiently move nodes around as they are used.
This implementation is particularly efficient for scenarios where frequent access to cached data is needed, and it’s important to quickly remove the least recently used items without traversing the entire cache. The use of a doubly linked list allows for constant time insertions and deletions.
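As an aside, the JDK standard library can do much of this bookkeeping for you: a LinkedHashMap constructed in access-order mode keeps its entries in least-to-most recently used order, and overriding removeEldestEntry turns it into a bounded cache. A minimal sketch (the class name LinkedLRUCache is just for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on the JDK's LinkedHashMap instead of a hand-rolled linked list.
class LinkedLRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedLRUCache(int capacity) {
        super(capacity, 0.75f, true); // accessOrder = true: get() moves entries to the back
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry on overflow
    }
}
```

The hand-written version in this article makes the mechanics explicit, which is useful for interviews and for understanding how LinkedHashMap works internally.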
Conclusion
This LRU cache implementation efficiently keeps track of the most and least recently used items with O(1) time complexity for get and put operations, thanks to the combination of HashMap and Doubly Linked List.