Implement LRU Cache Decorator in Python

In this section, we are going to implement a Least Recently Used (LRU) cache decorator in Python. An LRU cache works on the principle that, when full, it removes the least recently used data to make room for new data, so it keeps items in order from most recently used to least recently used. An LRU cache generally has two operations, put() and get(), and both work in O(1) time. In addition, we use a decorator to modify the behavior of a function. A decorator is a function that takes a function and returns a function: it wraps the given function to extend its behavior without modifying the wrapped function itself.
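To illustrate the wrapping behavior before we get to the cache, here is a minimal sketch of a decorator (the names `announce` and `add` are just for this example):

```python
# a minimal decorator: takes a function, returns a wrapped function
def announce(function):
    def wrapper(*args, **kwargs):
        print(f"calling {function.__name__}")   # extra behavior
        return function(*args, **kwargs)        # original behavior
    return wrapper

@announce
def add(a, b):
    return a + b

print(add(2, 3))  # prints "calling add", then 5
```

Note that `add` itself is untouched; the decorator only wraps it.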

Approach to the problem

Since put() and get() must work in O(1) time, we can only use data structures that help us achieve O(1) time complexity: a doubly-linked list and a hash map. Python does not have a built-in doubly-linked list; instead, it has the deque class, which provides a double-ended queue with O(1) appends and pops at both ends. But how can get() work in O(1) time when accessing an arbitrary element of a deque takes O(n)? We pair the deque with a hash map and link each item in the map to its entry in the deque, so lookups take O(1) time. Summing up the whole discussion: a dictionary will be used to store the items and their locations in the cache, and a deque will be used to store the items in order from most recent to least recent.
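For example, a deque gives constant-time operations at exactly the two ends the cache needs (a small illustration, not part of the cache code):

```python
from collections import deque

dq = deque([3, 2, 1])   # most recently used item kept at the left
dq.appendleft(4)        # O(1): record a new most-recently-used item
evicted = dq.pop()      # O(1): remove the least-recently-used item
print(list(dq))         # [4, 3, 2]
print(evicted)          # 1
```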

Accessing and Deleting/Evicting an item

To access an element, we follow these steps:

  • First, look up the item in the HashMap.
  • If the item is present in the HashMap, its location in the cache is already stored. This case is called a “cache hit”. We use the HashMap to find the corresponding deque node and move that node to the head of the deque, since it is now the most recently used item.
  • If the item is not present in the HashMap, this case is called a “cache miss”, and we need to load the item into the cache. Two cases arise:
  1. If the cache is at capacity, we remove the rightmost element (the least recently used) and then add the new element at the head of the deque.
  2. Otherwise, we create a new node for the item, insert it at the head of the deque, and add it to the HashMap.
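The steps above can also be sketched with an OrderedDict, which keeps insertion order and supports O(1) reordering (an alternative sketch of get()/put(); the class name `SimpleLRU` is ours, and this is not the decorator implementation below):

```python
from collections import OrderedDict

class SimpleLRU:
    """Sketch of the lookup / move-to-head / evict steps."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # most recently used kept first

    def get(self, key):
        if key in self.items:                        # cache hit
            self.items.move_to_end(key, last=False)  # move to the head
            return self.items[key]
        return None                                  # cache miss

    def put(self, key, value):
        if key not in self.items and len(self.items) == self.capacity:
            self.items.popitem(last=True)            # evict least recently used
        self.items[key] = value
        self.items.move_to_end(key, last=False)      # new item becomes the head

cache = SimpleLRU(2)
cache.put('a', 1)
cache.put('b', 2)
cache.get('a')            # 'a' is now the most recently used
cache.put('c', 3)         # cache full: evicts 'b'
print(list(cache.items))  # ['c', 'a']
```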

Python program to implement LRU Cache Decorator

from collections import deque

# LRU cache implementation
class LRUCache:
    def __init__(self, size=5):
        self.size = size
        self.queue = deque()    # items, most recently used at the left
        self.hashmap = dict()   # item -> index in the queue

    def rehash(self):
        # rebuild the item -> index mapping after the queue changes
        if len(self.queue) > 1:
            for key, val in enumerate(self.queue):
                self.hashmap[val] = key

    def access(self, val):
        # cache hit: move the item to the head of the queue
        self.queue.remove(val)
        self.queue.appendleft(val)
        self.rehash()

    def delete(self):
        # evict the least recently used item (rightmost)
        i = self.queue.pop()
        del self.hashmap[i]

    def normal_insert(self, val):
        # add a new item at the head of the queue
        self.queue.appendleft(val)
        self.rehash()

    def insert(self, val):
        if val in self.hashmap:                    # cache hit
            self.access(val)
        else:                                      # cache miss
            if len(self.hashmap) == self.size:     # cache full: evict first
                self.delete()
            self.normal_insert(val)

    def print(self):
        lru_elements = [2 * i for i in self.queue]
        print(lru_elements)

# definition of the LRU decorator
def LRUDecorator(size):
    lru = LRUCache(size)
    def decorator(function):
        def functionality(num):
            lru.insert(num)     # update the cache with this argument
            lru.print()         # show the cache from most to least recent
            return function(num)
        return functionality
    return decorator

@LRUDecorator(4)
def mul(n):
    return 2 * n

print('\nFunction: multiply')
for i in range(1, 6):
    mul(i)
for i in reversed(range(1, 5)):
    mul(i)


Function: multiply
[2]
[4, 2]
[6, 4, 2]
[8, 6, 4, 2]
[10, 8, 6, 4]
[8, 10, 6, 4]
[6, 8, 10, 4]
[4, 6, 8, 10]
[2, 4, 6, 8]
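For real-world use, note that Python's standard library already ships a memoizing LRU cache decorator, functools.lru_cache; the hand-rolled version above is for learning purposes. A quick sketch of the built-in (the function name `double` is ours):

```python
from functools import lru_cache

@lru_cache(maxsize=4)   # keep results for the 4 most recent distinct arguments
def double(n):
    return 2 * n

for i in [1, 2, 3, 4, 5, 4]:
    double(i)

info = double.cache_info()
print(info)  # hits=1 (the repeated call double(4)), misses=5, currsize=4
```

Unlike our decorator, `lru_cache` also returns the cached result instead of recomputing it on a hit.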
