What is LRU Cache?
Cache replacement algorithms decide which data to evict when the cache is full. Least Recently Used (LRU) is one such algorithm. As the name suggests, when the cache memory is full, LRU picks the data that was least recently used and removes it to make space for the new data. The priority of data in the cache changes with usage: if some data is fetched or updated, it becomes the most recently used and is assigned the highest priority, while the priority of data decreases the longer it remains unused, operation after operation.
Table of Contents
- What is LRU Cache?
- Operations on LRU Cache:
- Working of LRU Cache:
- Ways to Implement LRU Cache:
- LRU cache implementation using Queue and Hashing:
- LRU cache implementation using Doubly Linked List and Hashing:
- LRU cache implementation using Deque & Hashmap:
- LRU cache implementation using Stack & Hashmap:
- LRU cache using Counter Implementation:
- LRU cache implementation using Lazy Updates:
- Complexity Analysis of LRU Cache:
- Advantages of LRU cache:
- Disadvantages of LRU cache:
- Real-World Application of LRU Cache:
The LRU algorithm is a standard problem and may have variations as per need. For example, in operating systems LRU plays a crucial role as a page replacement algorithm, used to minimize page faults.
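Before the detailed implementations below, the eviction behavior described above can be sketched in a few lines. This is only an illustrative sketch using Python's `collections.OrderedDict`, which tracks insertion order and so can stand in for the recency ordering; the class name `LRUCache` and the `-1` miss sentinel are choices made here for the example, not fixed by the article.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch built on OrderedDict.

    The OrderedDict keeps keys ordered from least to most recently used:
    the front holds the eviction candidate, the back the freshest entry.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key):
        if key not in self.cache:
            return -1  # miss sentinel chosen for this sketch
        # Reading a key makes it the most recently used.
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key, value):
        if key in self.cache:
            # Updating existing data also refreshes its priority.
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Evict the least recently used entry (front of the order).
            self.cache.popitem(last=False)
```

For instance, with capacity 2, after `put(1, 1)`, `put(2, 2)`, and `get(1)`, key 2 is the least recently used, so a subsequent `put(3, 3)` evicts key 2 rather than key 1.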