LRU Cache implementation

This is a well-known algorithm that is often given to students as an exercise, but it’s an interesting one.

If someone asks you to implement an LRU (Least Recently Used) cache, how would you do it?
Since it’s a cache, you need to guarantee O(1) reads and writes. For fast access, hash tables are very often the right data structure, so we can consider one, but we also need to keep track of the access order, and hash tables cannot do that.
An LRU cache also behaves like a FIFO (First In First Out) structure, since the oldest entry is evicted first, so a queue looks well suited too, but then we lose the O(1) access time.

A good approach is to use both:

  • A hash table for the O(1) lookup time
  • A queue to keep the order

The only problem is that queues are very efficient for enqueue and dequeue but very slow for random access, and since each hit has to reorder the queue, every cache access would cost O(n) to find and move the element.

A good strategy is to keep this approach but use a doubly linked list instead of a queue, because:

  • It’s very easy to implement a queue with it
  • Random access is still slow, but moving a given node to the head is easy, and that’s all we need

Let’s focus on the implementation. First we need to define a node that holds the key/value pair:

class Node {
    int key;        // the key is kept in the node so we can remove the entry from the hash table on eviction
    int data;       // the cached value
    Node previous;  // previous node in the doubly linked list
    Node next;      // next node in the doubly linked list
}

The LRUCache definition, with the imports it needs:

import java.util.HashMap;
import java.util.Map;

class LRUCache {

    private int capacity;            // maximum number of entries
    private Map<Integer, Node> data; // key -> node, for the O(1) lookup
    private Node head;               // most recently used node
    private Node end;                // least recently used node

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.data = new HashMap<>();
    }
}

Then we write an add method that appends the node at the head:

    private void add(Node node) {

        // Reset position
        node.next = null;
        node.previous = null;

        // First element
        if (null == this.head) {
            this.head = node;
            this.end = node;
            return;
        }

        // Existing element
        this.head.previous = node;
        node.next = this.head;
        this.head = node;
    }

Same thing for the remove method:

    private void remove(Node node) {

        // Nothing to do
        if (this.head == null || null == node) {
            return;
        }

        // The only one item
        if (this.head == this.end && this.head == node) {
            this.head = null;
            this.end = null;
            return;
        }

        // Remove from head
        if (node == this.head) {
            this.head.next.previous = null;
            this.head = this.head.next;
            return;
        }

        // Remove from end
        if (node == this.end) {
            this.end.previous.next = null;
            this.end = this.end.previous;
            return;
        }

        // Remove in the middle
        node.previous.next = node.next;
        node.next.previous = node.previous;

    }

Next come two helpers: moveFirst moves a node to the head (on a hit), and removeLast cleans up the oldest item:

    private void moveFirst(Node node) {
        this.remove(node);
        this.add(node);
    }

    private void removeLast() {
        this.remove(this.end);
    }

The linked list is only partially implemented here, but it covers everything we need.
The LRU get method simply retrieves the node for the key and moves it to the head of the list:

    public int get(int key) {

        // Existing key
        if (this.data.containsKey(key)) {

            // Move to first place
            Node node = this.data.get(key);
            this.moveFirst(node);

            // Return the value
            return node.data;
        }

        // Not found
        return -1;
    }

Last, the set method: writing a key also counts as a hit, so the accessed element moves to the head.
Like the get method, the set method deals with both the hash table and the linked list:

    public void set(int key, int value) {

        // Existing slot
        if (this.data.containsKey(key)) {
            Node node = this.data.get(key);
            this.moveFirst(node);
            node.data = value;
            return;
        }

        // Out of capacity, cleaning the oldest slot
        if (this.data.size() >= this.capacity) {
            int id = this.end.key;
            this.removeLast();
            this.data.remove(id);
        }

        // New slot
        Node node = new Node();
        node.key = key;
        node.data = value;
        this.add(node);
        this.data.put(key, node);
    }
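
To see the whole thing in action, here is a short usage sketch (the capacity of 2 and the values are just illustrative, not part of the original implementation):

    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.set(1, 100);
        cache.set(2, 200);
        cache.get(1);                       // hit: key 1 becomes the most recently used
        cache.set(3, 300);                  // cache is full: key 2, the least recently used, is evicted
        System.out.println(cache.get(2));   // -1, key 2 was evicted
        System.out.println(cache.get(1));   // 100
        System.out.println(cache.get(3));   // 300
    }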

Of course, you probably shouldn’t implement your own cache or linked list in production; there are plenty of very efficient implementations available.
In Java you can build an LRU cache in a few lines by reusing the built-in data structures, but here we just want to understand what’s going on and write our own for learning purposes.
Notice also that this implementation is not thread-safe.
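
For reference, here is a minimal sketch of that built-in approach, based on LinkedHashMap with access ordering; the class name SimpleLRUCache and the generic signature are just illustrative choices, not part of the original implementation:

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of an LRU cache built on LinkedHashMap
class SimpleLRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    SimpleLRUCache(int capacity) {
        // accessOrder = true: iteration order follows last access instead of insertion
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the capacity is exceeded
        return size() > this.capacity;
    }
}

With accessOrder set to true, every get and put moves the entry to the most recently used position, and removeEldestEntry takes care of the eviction.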
