LRU cache implementation in Java


In this post, we will see an LRU cache implementation in Java.


Design a least recently used (LRU) cache in Java. It should have the following properties.

Bounded size: the cache should have a bounded size to keep memory usage within limits.

Fast access: since this is a cache, we should be able to fetch or update entries quickly.

Evict the least recently used entry: the cache should evict the least recently used entry once capacity is reached.


Since we need lookups in O(1) time, a HashMap is the obvious choice, but we also need to keep track of the least recently used entry.

We also need a data structure that can remove or add a node in O(1) time when we already hold a reference to it. A doubly linked list serves this purpose: given a node, it supports removal and insertion in O(1) time.
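To illustrate the O(1) claim, here is a minimal sketch of a doubly linked list node and the two operations the cache relies on. The names (Node, DllDemo, unlink, insertAfter) are illustrative, not from the original listing:

```java
// Minimal doubly linked list node; fields are package-private for brevity.
class Node {
    int key;
    Node prev, next;
    Node(int key) { this.key = key; }
}

class DllDemo {
    // Removing a known node only rewires its two neighbours: no traversal,
    // so the cost is O(1) regardless of list length.
    static void unlink(Node n) {
        if (n.prev != null) n.prev.next = n.next;
        if (n.next != null) n.next.prev = n.prev;
    }

    // Inserting right after a known node is likewise O(1).
    static void insertAfter(Node node, Node newNode) {
        newNode.next = node.next;
        newNode.prev = node;
        if (node.next != null) node.next.prev = newNode;
        node.next = newNode;
    }
}
```

This is exactly why a singly linked list would not do: without a prev pointer, unlinking a node requires walking the list to find its predecessor.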

The HashMap makes the get operation O(1), and the doubly linked list makes removal/addition O(1).

LRU cache


Here is a simple program for LRU cache implementation in Java.
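A minimal sketch of such a program, combining a HashMap for O(1) lookup with a doubly linked list (with sentinel head/tail nodes) for recency order. Class and member names here (LRUCache, get, put, moveToFront) are illustrative assumptions, not taken from the original listing:

```java
import java.util.HashMap;
import java.util.Map;

class LRUCache<K, V> {
    // Doubly linked list node holding one cache entry.
    private static class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    // Sentinel nodes avoid null checks when linking/unlinking at either end.
    private final Node<K, V> head = new Node<>(null, null);
    private final Node<K, V> tail = new Node<>(null, null);

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    // O(1): HashMap lookup plus constant-time list relinking.
    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToFront(node);              // mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {             // update an existing entry
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) {   // evict least recently used (tail side)
            Node<K, V> lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        addToFront(node);
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        addToFront(node);
    }

    private void addToFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
}
```

For example, with capacity 2, after put(1), put(2), get(1), a put(3) evicts key 2, because key 1 was touched more recently.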

When you run such a program, you can observe that the least recently used entry is evicted once the cache reaches capacity.


That’s all about LRU cache implementation in Java.

