Advantages of LRU cache

  • Fast Access: It takes O(1) time to access the data from the LRU cache.
  • Fast Update: It takes O(1) time to update a key-value pair in the LRU cache.
  • Fast removal of least recently used data: It takes O(1) time to delete the entry that has been used least recently.
  • Less thrashing: LRU is less susceptible to thrashing than FIFO because it considers the usage history of pages. It can detect which pages are being used frequently and prioritize them for memory allocation, reducing the number of page faults and disk I/O operations.
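
These costs are easy to observe with Python's built-in functools.lru_cache decorator, which memoizes function results in an LRU cache. This is only a quick illustration of the behavior; the sections below build the data structure by hand:

```python
from functools import lru_cache

@lru_cache(maxsize=128)   # keep at most the 128 most recently used results
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))           # fast: each subproblem is computed once, then served from cache
print(fib.cache_info())   # hit/miss counts and current cache size
```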

Complete Tutorial on LRU Cache with Implementations

What is LRU Cache?

Cache replacement algorithms decide which entry to evict when the cache runs out of space. Least Recently Used (LRU) is one such algorithm. As the name suggests, when the cache memory is full, LRU picks the data that was least recently used and removes it to make space for the new data. The priority of data in the cache changes according to how the data is used: if some data is fetched or updated recently, it is reassigned the highest priority, while the priority of data decreases the longer it remains unused, operation after operation.

Operations on LRU Cache:

  • LRUCache(capacity c): Initialize the LRU cache with a positive capacity c.
  • get(key): Return the value of the key if it is present in the cache, otherwise return -1. This also updates the priority of the data in the LRU cache.
  • put(key, value): Update the value if the key already exists; otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity of the LRU cache, evict the least recently used key.
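
The interface can be pinned down with a short Python sketch. It uses collections.OrderedDict, which keeps keys in insertion order, purely as a compact stand-in; the implementations discussed below build the structure from a queue or doubly linked list instead:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()   # keys ordered from least to most recently used

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # a hit makes the key most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used key
```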

Working of LRU Cache:

Let's suppose we have an LRU cache of capacity 3, and we would like to perform the following operations:...
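
The original operation sequence is elided above, so the walkthrough below uses an invented sequence on a capacity-3 cache, reusing the LRUCache sketch from the Operations section:

```python
cache = LRUCache(3)
cache.put(1, 10)        # cache (LRU -> MRU): {1}
cache.put(2, 20)        # {1, 2}
cache.put(3, 30)        # {1, 2, 3}
print(cache.get(1))     # 10; the hit moves key 1 to the MRU end -> {2, 3, 1}
cache.put(4, 40)        # full: key 2 is least recently used and is evicted -> {3, 1, 4}
print(cache.get(2))     # -1; key 2 is no longer in the cache
```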

Ways to Implement LRU Cache:

An LRU cache can be implemented in a variety of ways, and each programmer may choose a different approach. The most commonly used approaches are covered in the sections below.

LRU cache implementation using Queue and Hashing:

We use two data structures to implement an LRU cache:

  • A queue, implemented using a doubly linked list. The maximum size of the queue equals the total number of available frames (the cache size). The most recently used pages are kept near the front end and the least recently used pages near the rear end.
  • A hash with the page number as key and the address of the corresponding queue node as value.

When a page is referenced, it may already be in memory. If it is, we detach its node from the list and bring it to the front of the queue. If it is not, we bring it into memory: in simple words, we add a new node to the front of the queue and store the node's address in the hash. If the queue is full, i.e. all the frames are full, we remove the node at the rear of the queue before adding the new node to the front.
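
Below is a runnable sketch of this queue-plus-hash design, assuming integer keys and values; the Node class and the _detach/_push_front helper names are illustrative, not fixed by the tutorial:

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}                 # key -> address of the corresponding queue node
        # Sentinel head/tail nodes: head side = most recently used, tail side = least.
        self.head = Node(None, None)
        self.tail = Node(None, None)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _detach(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._detach(node)            # the referenced page moves to the front
        self._push_front(node)
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._detach(node)
            self._push_front(node)
            return
        if len(self.map) == self.capacity:
            lru = self.tail.prev      # node at the rear is least recently used
            self._detach(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self._push_front(node)
        self.map[key] = node
```

Sentinel head and tail nodes are used so that detaching and inserting nodes never needs special cases for an empty or full list.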

LRU cache implementation using Doubly Linked List and Hashing:

The idea is very basic: keep inserting elements at the head. If an element is not present in the list, add it at the head; if it is already present, move it to the head, shifting the remaining elements of the list back by one position. Note that the priority of a node depends on its distance from the head: the closer a node is to the head, the higher its priority. So when the cache is full and we need to remove an element, we remove the element adjacent to the tail of the doubly linked list.
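
The move-to-head rule can be seen with a plain Python list standing in for the linked list. This is only a toy sketch: insert and remove on a list are O(n), which is exactly why the real design pairs a doubly linked list (O(1) detach and re-attach) with a hash:

```python
def refer(cache, capacity, x):
    """Move x to the head (index 0); evict next to the tail when full."""
    if x in cache:
        cache.remove(x)      # present: pull it out so it can move to the head
    elif len(cache) == capacity:
        cache.pop()          # full: drop the element adjacent to the tail
    cache.insert(0, x)       # head position = highest priority

cache, capacity = [], 3
for page in [1, 2, 3, 1, 4, 5]:
    refer(cache, capacity, page)
    print(cache)             # printed from head ... to tail
# final state: [5, 4, 1]
```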

LRU cache implementation using Deque & Hashmap:

The deque data structure allows insertion and deletion at both the front and the end. This property makes an LRU implementation possible: the front element represents the highest-priority (most recently used) item, while the end element is the lowest-priority, i.e. least recently used, item.

Working:

Get operation: check whether the key exists in the cache's hash map and handle the cases below on the deque:

  • If the key is found, the item counts as recently used, so it is moved to the front of the deque, and the value associated with the key is returned as the result of the get operation.
  • If the key is not found, return -1 to indicate that no such key is present.

Put operation: first check whether the key already exists in the cache's hash map and handle the cases below on the deque:

  • If the key exists, the value associated with it is updated, and the item is moved to the front of the deque since it is now the most recently used.
  • If the key does not exist and the cache is full, the least recently used item must be evicted to make room: the item at the end of the deque and the corresponding hash map entry are removed. The new key-value pair is then inserted into the hash map and at the front of the deque to mark it as the most recently used item.
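
A sketch of this approach using collections.deque and a plain dict. One caveat worth flagging: deque.remove scans the deque, so moving an inner element to the front is O(n) here, unlike the O(1) node detach in the doubly-linked-list design:

```python
from collections import deque

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}         # key -> value
        self.dq = deque()     # front = most recently used, end = least recently used

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        self.dq.remove(key)      # O(n): a deque cannot unlink an inner element in O(1)
        self.dq.appendleft(key)  # move to the front as most recently used
        return self.map[key]

    def put(self, key: int, value: int) -> None:
        if key in self.map:
            self.dq.remove(key)          # will be re-inserted at the front
        elif len(self.map) == self.capacity:
            lru = self.dq.pop()          # evict from the end of the deque
            del self.map[lru]
        self.dq.appendleft(key)
        self.map[key] = value
```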

LRU cache implementation using Stack & Hashmap:

...

LRU cache using Counter Implementation:

...

LRU cache implementation using Lazy Updates:

...

Complexity Analysis of LRU Cache:

...

Advantages of LRU cache:

...

Disadvantages of LRU cache:

...

Real-World Application of LRU Cache:

...