Java Coding Exercise: Implement LRU Cache from Scratch with LinkedHashMap

Building an efficient in-memory cache is a common task in software systems, especially when performance and memory usage are critical. One popular cache strategy is the Least Recently Used (LRU) eviction policy, where the least recently accessed item is removed when the cache is full. In Java, an LRU cache can be implemented elegantly with LinkedHashMap. In this article, we’ll explore how to create a fully functional LRU cache in Java by extending LinkedHashMap and overriding a single method to handle eviction.

1. What is an LRU Cache and Why Use It?

An LRU (Least Recently Used) cache keeps track of the order in which keys are accessed. When the cache reaches its size limit, it evicts the entry that hasn’t been used for the longest time. LRU caches are especially useful in web applications, image processing, database query results, and anywhere stale data should be discarded in favor of fresher access patterns.

The goals of an LRU cache are to:

  • Provide fast O(1) access time
  • Control memory usage with size limits
  • Automatically evict old data

Let’s look at how Java’s LinkedHashMap offers the right data structure to hit all these marks.

2. Why LinkedHashMap is Ideal for LRU

LinkedHashMap is a hash table with predictable iteration order. It maintains a doubly-linked list running through its entries, either in insertion order or access order. When constructed in access-order mode, the map moves an entry to the end of the list whenever it is read with get() or updated with put().

The magic feature is the removeEldestEntry() method that we can override to define custom eviction logic.

LinkedHashMap(int initialCapacity, float loadFactor, boolean accessOrder)

Setting accessOrder to true enables ordering based on key access, as the short demo below illustrates. After that, we’ll subclass LinkedHashMap to implement our LRU cache.
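
To see access ordering in action before building the cache, here is a minimal standalone snippet (the class name AccessOrderDemo is just for illustration):

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order follows access recency
        Map<Integer, String> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put(1, "A");
        map.put(2, "B");
        map.put(3, "C");

        map.get(1); // touching key 1 moves it to the end

        System.out.println(map); // {2=B, 3=C, 1=A}
    }
}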

3. Writing the LRUCache Class

Let’s see the core implementation of the LRU cache. We’ll use a subclass of LinkedHashMap and override the removeEldestEntry method to limit the number of elements.

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public LRUCache(int capacity) {
        super(capacity, 0.75f, true); // accessOrder = true
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

Here’s what happens:

  • super(capacity, 0.75f, true): Initializes a LinkedHashMap with access-order mode.
  • removeEldestEntry: Called by put() automatically. If it returns true, the eldest entry (first in order) is evicted.

This gives us an LRU cache in ~10 lines of Java. Let’s test how it works.

4. Testing LRUCache in a Main Application

public class Main {
    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(3);

        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        System.out.println(cache); // {1=A, 2=B, 3=C}

        cache.get(1); // Access 1
        cache.put(4, "D"); // Should evict 2 because it’s the LRU

        System.out.println(cache); // {3=C, 1=A, 4=D}

        cache.put(5, "E"); // Evicts 3
        System.out.println(cache); // {1=A, 4=D, 5=E}
    }
}

Each time we add a new entry that makes the size exceed 3, the LRU item is evicted. Accessing a key moves it to the end. This behavior is automatically handled by LinkedHashMap in access-order mode.

5. Tips, Performance, and Real-World Usage

Performance: LinkedHashMap gives expected O(1) time for get and put (a hash lookup plus constant-time linked-list updates). The eviction check runs inside put(), so there is no separate cleanup pass.

Thread Safety: LRUCache is not thread-safe, and in access-order mode even get() mutates the internal linked list, so reads need synchronization too. Wrap it with Collections.synchronizedMap(), or build on ConcurrentHashMap with your own eviction logic if contention is high.
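
A minimal sketch of the synchronized-wrapper approach (the capacity of 100 is arbitrary):

import java.util.Collections;
import java.util.Map;

public class SafeCacheDemo {
    public static void main(String[] args) {
        // synchronizedMap locks every method, including get(), which matters
        // here because get() reorders entries in access-order mode.
        Map<Integer, String> safeCache =
                Collections.synchronizedMap(new LRUCache<Integer, String>(100));

        safeCache.put(1, "A");
        System.out.println(safeCache.get(1)); // A
    }
}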

Customization: Want to log evictions? Override removeEldestEntry() and log the eldest entry before returning true, as sketched below.
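
A sketch of such an override inside LRUCache, assuming plain console output (swap in your logger of choice):

@Override
protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
    boolean evict = size() > capacity;
    if (evict) {
        // The eldest entry is about to be removed by LinkedHashMap.
        System.out.println("Evicting: " + eldest.getKey() + "=" + eldest.getValue());
    }
    return evict;
}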

Alternatives: Libraries like Caffeine or Guava’s CacheBuilder offer more powerful caching mechanisms, but for simple use cases, this hand-written LRU cache is fast and lightweight.
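
For comparison, a size-bounded cache in Caffeine looks roughly like this (requires the com.github.ben-manes.caffeine:caffeine dependency; note that Caffeine’s eviction policy is frequency-aware rather than strict LRU):

import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

public class CaffeineDemo {
    public static void main(String[] args) {
        Cache<Integer, String> cache = Caffeine.newBuilder()
                .maximumSize(3) // evict once more than 3 entries are cached
                .build();

        cache.put(1, "A");
        System.out.println(cache.getIfPresent(1)); // A
    }
}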

If you’re building an image loader, caching product data, or even results from expensive DB queries, this pattern can drastically reduce latency.

Conclusion

We’ve built a concise and elegant LRU cache by extending LinkedHashMap, overriding just one method to enforce eviction. This kind of exercise is great for interviews and practical enough for real systems. Java’s built-in data structure capabilities let us implement performant logic with minimal complexity. Try extending this further with TTLs or custom eviction metrics.
