LRU Cache Implementation using TreeMap
Below is the algorithm for implementing an LRU cache with a TreeMap:
- Initialize a TreeMap (entries) that maps each access count to its key; the smallest count identifies the least recently used entry.
- Initialize a HashMap (map) to store the mapping of keys to their corresponding values and access counts.
- Initialize variables:
- size: the maximum capacity of the cache.
- count: a running counter used to timestamp each access.
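The state described above can be sketched as the following class skeleton (a minimal sketch; the class and field names are illustrative, matching the names used in the steps):

```java
import java.util.HashMap;
import java.util.TreeMap;

// Illustrative skeleton of the cache state; names follow the algorithm above.
class LRUCacheFields {
    // entries: access count -> key (smallest count = least recently used)
    TreeMap<Integer, Integer> entries = new TreeMap<>();
    // map: key -> {value, access count}
    HashMap<Integer, int[]> map = new HashMap<>();
    final int size;   // maximum capacity of the cache
    int count = 0;    // running access counter

    LRUCacheFields(int size) {
        this.size = size;
    }
}
```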
get() method: If the key is present in map,
- Get the current access count (c) and value associated with the key.
- Remove the entry with count c from entries.
- Increment count.
- Add an entry for the key in entries with the new access count.
- Update the key's access count in map.
- Return the value associated with the key.
If the key is not present, return -1.
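The get() bookkeeping above can be sketched in isolation as follows (a sketch under assumed names; GetStepDemo and the seeded keys are illustrative, not part of the original article):

```java
import java.util.HashMap;
import java.util.TreeMap;

// Sketch of the get() steps alone; all names are illustrative.
class GetStepDemo {
    // entries: access count -> key (smallest count = least recently used)
    static TreeMap<Integer, Integer> entries = new TreeMap<>();
    // map: key -> {value, access count}
    static HashMap<Integer, int[]> map = new HashMap<>();
    static int count = 2;

    static int get(int key) {
        int[] pair = map.get(key);
        if (pair == null) return -1;   // key absent
        entries.remove(pair[1]);       // drop the old access count
        count++;                       // advance the global counter
        entries.put(count, key);       // re-insert the key as most recent
        pair[1] = count;               // record the new access count
        return pair[0];                // return the value
    }

    public static void main(String[] args) {
        // Pre-populated state: key 7 accessed at count 1, key 8 at count 2
        entries.put(1, 7); map.put(7, new int[] {70, 1});
        entries.put(2, 8); map.put(8, new int[] {80, 2});

        System.out.println(get(7));    // 70: value associated with key 7
        // After get(7), key 8 holds the smallest count, so it is now the LRU
        System.out.println(entries.firstEntry().getValue());  // 8
    }
}
```

Note how an access never changes the value, only the key's position in entries, which is what makes the smallest TreeMap key always point at the LRU entry.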
put() method: If the key is already present in map,
- Get the current access count (c) and remove the entry with count c from entries.
- Increment count.
- Add an entry for the key in entries with the new access count.
- Update the key's value and access count in map.
- Return.
If the key is not present,
- If the cache is full (map.size() == size),
- Get the least recently used key from the first entry of entries.
- Remove that entry from both entries and map.
- Increment count.
- Add the new entry to entries with the new access count.
- Put the key with its value and access count into map.
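Putting the get() and put() steps together, a complete sketch might look like this (class and variable names are illustrative, following the algorithm above):

```java
import java.util.HashMap;
import java.util.TreeMap;

// Sketch of an LRU cache backed by a TreeMap; names are illustrative.
class LRUCacheSketch {
    // entries: access count -> key (smallest count = least recently used)
    private final TreeMap<Integer, Integer> entries = new TreeMap<>();
    // map: key -> {value, access count}
    private final HashMap<Integer, int[]> map = new HashMap<>();
    private final int size;   // maximum capacity
    private int count = 0;    // running access counter

    LRUCacheSketch(int size) {
        this.size = size;
    }

    int get(int key) {
        int[] pair = map.get(key);
        if (pair == null) return -1;   // key absent
        entries.remove(pair[1]);       // drop the old access count
        count++;
        entries.put(count, key);       // re-insert the key as most recent
        pair[1] = count;               // record the new access count
        return pair[0];
    }

    void put(int key, int value) {
        int[] pair = map.get(key);
        if (pair != null) {            // key already present: refresh it
            entries.remove(pair[1]);
            count++;
            entries.put(count, key);
            pair[0] = value;
            pair[1] = count;
            return;
        }
        if (map.size() == size) {      // cache full: evict the LRU entry
            int lruKey = entries.pollFirstEntry().getValue();
            map.remove(lruKey);
        }
        count++;
        entries.put(count, key);
        map.put(key, new int[] {value, count});
    }

    public static void main(String[] args) {
        LRUCacheSketch cache = new LRUCacheSketch(2);
        cache.put(1, 10);
        cache.put(2, 20);
        cache.get(1);                      // key 1 becomes most recently used
        cache.put(3, 30);                  // evicts key 2, the LRU entry
        System.out.println(cache.get(2));  // -1: key 2 was evicted
        System.out.println(cache.get(1));  // 10: key 1 survived
    }
}
```

Both get() and put() cost O(log n) because each access performs a constant number of TreeMap insertions and removals; TreeMap.pollFirstEntry() retrieves and removes the smallest access count, i.e. the LRU entry, in one call.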
Implementation of LRU Cache Using TreeMap in Java
In Java, an LRU (Least Recently Used) cache is a data structure that stores a limited number of items and evicts the least recently used item when the limit is reached. It is commonly used to improve performance by keeping frequently accessed data readily available.
In this article, we will learn the implementation of LRU cache using TreeMap in Java.