Introduction
If you’re a software developer with your sights set on a career at a big tech company, there’s a key skill you’ll need to master: building a Least Recently Used (LRU) cache. It’s a common challenge in technical interviews on platforms like LeetCode, and it’s not hard to see why: an LRU cache plays a pivotal role in managing memory efficiently in constrained environments. In this article, we’ll walk through how to build an LRU cache in Java using the versatile LinkedHashMap class.
Let’s dive in and unpack this essential technique.
What is a Cache?
A cache is a small, fast layer of memory that temporarily stores data. Its main job is to make data access quicker by avoiding slower operations, such as reading from disk or fetching data over a network.
Caching plays a big role in making systems faster. It’s used in many different areas, from making websites load quickly to helping apps on your phone run smoothly. By caching the right data, we can make sure computers do their jobs more efficiently.
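To make this concrete, here is a minimal sketch of the classic cache-aside pattern: check a fast in-memory map first and fall back to the slow source only on a miss. ProductLookup and loadFromDatabase are hypothetical names used purely for illustration.

import java.util.HashMap;
import java.util.Map;

public class ProductLookup {
    // In-memory cache of previously fetched results (hypothetical example).
    private final Map<Integer, String> cache = new HashMap<>();

    public String findProduct(int id) {
        // Fast path: return the cached value if we already have it.
        String cached = cache.get(id);
        if (cached != null) {
            return cached;
        }
        // Slow path: fetch from the expensive source, then cache the result.
        String fresh = loadFromDatabase(id);
        cache.put(id, fresh);
        return fresh;
    }

    // Stand-in for a slow read (database, network, disk).
    private String loadFromDatabase(int id) {
        return "product-" + id;
    }
}

Note that this simple map grows without bound, which is exactly the problem an eviction policy like LRU solves.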
What is an LRU Cache?
LRU stands for ‘Least Recently Used,’ and it’s a smart way of managing cache memory. Imagine a cache as a small box where you can only fit a limited number of items. When it gets full and you need to put something new in, LRU helps by removing the item that hasn’t been used for the longest time. This method works great in situations where you’re less likely to need old data again.
Let’s take an example: think about a website that recommends products to you. To suggest items you might like, it needs to quickly look at what you’ve recently viewed or bought. But with so many products and users, it’s impossible to store all this information in the main memory – it’s just too much.
Here’s where an LRU cache comes in handy. It keeps track of the products you’ve looked at or bought most recently. So, as you move around on the website, it can quickly use this cache to suggest other items you might like, making the whole experience faster and more personal.
How to Build an LRU Cache in Practice
An LRU cache can be implemented using a doubly linked list combined with a HashMap: the map gives O(1) lookup by key, while the linked list tracks how recently each entry was used. In Java, however, this implementation can be abstracted away by the LinkedHashMap class. Internally, LinkedHashMap maintains a doubly linked list to preserve the order of elements on top of a hash table that stores keys and values, thereby enabling quick access to elements.
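Before reaching for LinkedHashMap, it helps to see what that combination looks like by hand. Below is a minimal sketch of the manual approach, assuming a simplified API with just get and put; ManualLRUCache and its helper methods are illustrative names, and the code favors brevity over edge-case handling (it doesn’t validate the capacity, for example).

import java.util.HashMap;
import java.util.Map;

// A HashMap for O(1) lookup plus a doubly linked list to track recency.
// The head sentinel marks the most recently used side, the tail the least.
public class ManualLRUCache<K, V> {
    private class Node {
        K key; V value;
        Node prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node> map = new HashMap<>();
    private final Node head = new Node(null, null); // sentinel: most recent
    private final Node tail = new Node(null, null); // sentinel: least recent

    public ManualLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node node = map.get(key);
        if (node == null) return null;
        moveToFront(node); // reading counts as a "use"
        return node.value;
    }

    public void put(K key, V value) {
        Node node = map.get(key);
        if (node != null) {
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) {
            Node eldest = tail.prev; // least recently used entry
            unlink(eldest);
            map.remove(eldest.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        linkAfterHead(fresh);
    }

    private void moveToFront(Node node) { unlink(node); linkAfterHead(node); }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkAfterHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}

As you can see, the bookkeeping is easy to get wrong. This is exactly the machinery LinkedHashMap gives us for free.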
Java Implementation Using LinkedHashMap
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LRUCache(int capacity) {
        // accessOrder = true keeps entries ordered by most recent access.
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once we exceed capacity.
        return size() > this.capacity;
    }
}
Breaking Down the LRU Cache Code
The LRUCache class takes advantage of a built-in data structure in Java, the LinkedHashMap. This structure is pretty handy because it can keep track of the order of elements based on either when they were added (insertion order) or when they were last accessed (access order). For our LRU Cache, maintaining the access order is crucial. This means every time an item is read or written, it gets moved to the end of the list.
When we set up our LRUCache class, we start by calling the constructor of LinkedHashMap with three important parameters: initial capacity, load factor, and access order.
- Initial Capacity: the first argument sets the initial size of the underlying hash table. In our class we pass the same value we use as the cache limit: whenever a new entry pushes the size past that limit, the least recently used item gets evicted.
- Load Factor: a threshold that tells the HashMap how full its internal table can get before it resizes. We pass 0.75f, the standard default; it only affects how the table grows internally and has no impact on the eviction logic.
- Access Order: a simple true/false setting. We set it to true because we want entries ordered by how recently they’ve been accessed, not by when they were inserted.
There’s also a crucial piece where we override the removeEldestEntry method from LinkedHashMap. This method is called by put and putAll after a new entry is inserted, and it decides whether the eldest entry (the least recently used one) should be removed. Our logic is straightforward: if the map’s size exceeds the configured capacity, the oldest entry gets removed. This is the heart of the LRU eviction process.
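To see the accessOrder flag in action on its own, here is a small demonstration using LinkedHashMap directly (AccessOrderDemo is just an illustrative name): the same sequence of operations produces different iteration orders depending on the flag.

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order reflects recency of access.
        Map<String, Integer> accessOrdered = new LinkedHashMap<>(16, 0.75f, true);
        accessOrdered.put("a", 1);
        accessOrdered.put("b", 2);
        accessOrdered.put("c", 3);
        accessOrdered.get("a"); // touching "a" moves it to the end
        System.out.println(accessOrdered); // {b=2, c=3, a=1}

        // accessOrder = false (insertion order): get() does not reorder.
        Map<String, Integer> insertionOrdered = new LinkedHashMap<>(16, 0.75f, false);
        insertionOrdered.put("a", 1);
        insertionOrdered.put("b", 2);
        insertionOrdered.put("c", 3);
        insertionOrdered.get("a");
        System.out.println(insertionOrdered); // {a=1, b=2, c=3}
    }
}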
Using the LRUCache Class: A Practical Example
public static void main(String[] args) {
    LRUCache<Integer, Integer> cache = new LRUCache<>(2);

    cache.put(1, 1);
    cache.put(2, 2);
    System.out.println(cache); // output: {1=1, 2=2}

    cache.get(1);              // returns 1 and marks key 1 as recently used
    System.out.println(cache); // output: {2=2, 1=1}

    cache.put(3, 3);           // evicts 2 (least recently used) and adds 3
    System.out.println(cache); // output: {1=1, 3=3}

    cache.get(2);              // returns null (2 was evicted)

    cache.put(4, 4);           // evicts 1 and adds 4
    System.out.println(cache); // output: {3=3, 4=4}

    cache.get(3);              // returns 3
    System.out.println(cache); // output: {4=4, 3=3}

    cache.get(4);              // returns 4
    System.out.println(cache); // output: {3=3, 4=4}
}
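One caveat worth noting before wrapping up: LinkedHashMap is not thread-safe, and with accessOrder set to true even get() modifies the internal linked list. If you need to share the cache across threads, a simple (if coarse-grained) option is to wrap it with Collections.synchronizedMap, as in this minimal sketch:

import java.util.Collections;
import java.util.Map;

public class SafeCacheExample {
    public static void main(String[] args) {
        // Every access must be synchronized: with accessOrder = true,
        // even get() mutates the map's internal links.
        Map<Integer, Integer> safeCache =
                Collections.synchronizedMap(new LRUCache<>(2));
        safeCache.put(1, 1);
        System.out.println(safeCache.get(1)); // 1
    }
}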
Conclusion
In this article, we’ve explored how an LRU Cache can dramatically speed up data access in applications where quick responses are critical. Using Java’s LinkedHashMap to implement this algorithm offers a great mix of simplicity and effectiveness. Hopefully, you now have a clearer understanding of the LRU Cache and feel ready to apply this technique to improve your own applications.
We want to hear from you! If you’ve implemented an LRU Cache in your project, or if you have any questions, please share with us in the comments. Your input is invaluable to the community and can help other developers find solutions and inspiration.
For those interested in a Portuguese version of this article, please click here to access it.