Table of Contents
1. Time-to-Live (TTL) – Expire After Time
2. Least Recently Used (LRU) – Use It or Lose It
3. Least Frequently Used (LFU) – Favor the Favorites
4. First In, First Out (FIFO) – The Oldest Out
5. Other Policies and Hybrids
Final Thoughts

5 Cache Eviction Strategies Every Developer Should Know

This post explores common cache eviction policies – LRU, LFU, FIFO, and more – and explains how systems decide which data to remove when caches fill up.
Imagine your fridge is overstuffed – to fit new groceries, you toss out the stale leftovers.
Software does something similar with caches.
Caching makes data access faster, but cache memory is limited.
When a cache runs out of space, something has to go.
Cache eviction policies decide which data to remove when the cache is full. The trick is choosing the right item to evict – ideally one the user won’t miss. A good policy keeps the most useful data in memory (maximizing your cache hit rate) and discards the rest.
In this post, we’ll demystify cache eviction, why it matters for performance, and introduce popular strategies (LRU, LFU, FIFO, etc.) in simple terms.
Let’s explore a few common strategies:
1. Time-to-Live (TTL) – Expire After Time
TTL assigns each cache entry an expiration timestamp.
Items are evicted after a fixed time period, regardless of access.
This ensures stale data doesn’t linger.
It’s perfect for data that naturally expires (session tokens, DNS records, etc.).
TTL is simple to implement but not very flexible – it might remove an item that’s still in active use just because its timer ran out.
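Here’s a minimal sketch of the idea in Python (the TTLCache class and its lazy-eviction approach are illustrative, not taken from any particular library):

```python
import time

class TTLCache:
    """Illustrative TTL cache: every entry expires a fixed time after insert."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        # Stamp each entry with its expiration time on insert.
        self.store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Evict lazily on read once the timer has run out,
            # even if the entry was still being actively used.
            del self.store[key]
            return None
        return value
```

A production cache would typically also sweep expired entries in the background; this sketch only evicts lazily when a key is read.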
2. Least Recently Used (LRU) – Use It or Lose It
LRU evicts the item that hasn’t been accessed for the longest time.
If you haven’t touched something in a while, LRU assumes you probably won’t need it again soon.
Thanks to temporal locality (recently used items are usually accessed again soon), LRU tends to work well. It’s one of the most widely used policies – web browsers, operating systems, and databases often rely on LRU or similar algorithms.
LRU isn’t perfect (it has to track usage order, and it can misjudge in unusual access patterns), but it’s a strong default in many scenarios.
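Python’s standard library even ships an LRU memoizer (functools.lru_cache), but the mechanism itself fits in a few lines. Here’s a minimal hand-rolled sketch using OrderedDict (the LRUCache name is ours, not a library API):

```python
from collections import OrderedDict

class LRUCache:
    """Illustrative LRU cache built on OrderedDict's insertion ordering."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        # A hit makes this key the most recently used: move it to the end.
        self.store.move_to_end(key)
        return self.store[key]

    def set(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            # Evict from the front: the least recently used entry.
            self.store.popitem(last=False)
```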
3. Least Frequently Used (LFU) – Favor the Favorites
LFU evicts whichever item has been used the fewest times.
In other words, it kicks out the least popular item to make space. This keeps high-traffic data in the cache.
LFU is great if a small subset of items accounts for most of the accesses (for example, a few trending products on an online store).
The drawback is the bookkeeping: you need to maintain a usage count for each item, which adds overhead.
Also, new items start with a count of zero, so they might get evicted before they’ve had a chance to become popular (some implementations add tweaks to avoid this cold-start issue).
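A minimal sketch of that bookkeeping (the LFUCache name is illustrative, and real implementations usually use frequency buckets or a heap instead of the linear scan shown here):

```python
class LFUCache:
    """Illustrative LFU cache: count hits per key, evict the coldest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.counts = {}  # key -> access count (the bookkeeping overhead)

    def get(self, key):
        if key not in self.values:
            return None
        self.counts[key] += 1
        return self.values[key]

    def set(self, key, value):
        if key not in self.values and len(self.values) >= self.capacity:
            # Evict the least frequently used key. Note the cold-start
            # issue: a brand-new entry starts with a count of 1, so it
            # is an immediate eviction candidate.
            coldest = min(self.counts, key=self.counts.get)
            del self.values[coldest]
            del self.counts[coldest]
        self.values[key] = value
        self.counts[key] = self.counts.get(key, 0) + 1
```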
4. First In, First Out (FIFO) – The Oldest Out
FIFO evicts the oldest entry in the cache (the one that was added first) when space is needed.
It’s basically a queue: the earliest item goes out first.
FIFO is very simple but it doesn’t consider usage at all – it might throw out data that is still hot simply because it’s been in the cache the longest.
Because of this, pure FIFO is rarely ideal for general caching (unless your access pattern is truly FIFO-friendly).
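A minimal sketch, assuming a plain dict for storage plus a deque to remember insertion order (the names are illustrative):

```python
from collections import deque

class FIFOCache:
    """Illustrative FIFO cache: evict in insertion order, ignore usage."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.order = deque()  # keys in insertion order
        self.store = {}

    def get(self, key):
        # Reads never affect eviction order - that's FIFO's blind spot.
        return self.store.get(key)

    def set(self, key, value):
        if key not in self.store:
            if len(self.store) >= self.capacity:
                # Evict whichever key has been in the cache longest.
                oldest = self.order.popleft()
                del self.store[oldest]
            self.order.append(key)
        self.store[key] = value
```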
5. Other Policies and Hybrids
- Most Recently Used (MRU): essentially the opposite of LRU – evict the item that was accessed most recently. Counterintuitive as it sounds, MRU can outperform LRU in certain cases (like cyclic access patterns where the most recent item won’t be needed again soon).
- Random Replacement: evict a random item. This requires no tracking and can work okay when access patterns are unpredictable, though it’s not usually optimal in practice.
- Segmented LRU (SLRU): a hybrid approach that splits the cache into “new” and “frequently used” segments. New items start in one segment, and if they are accessed again, they get promoted to a protected segment. This way, frequently used items are much less likely to be evicted than one-hit wonders (see the sketch after this list).
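Below is a minimal SLRU sketch under the assumptions just described: two LRU segments, promotion on a second access, and demotion from the protected segment back to probation when it fills up. The class name and segment sizes are illustrative.

```python
from collections import OrderedDict

class SLRUCache:
    """Illustrative Segmented LRU: a probationary and a protected segment."""

    def __init__(self, probation_size, protected_size):
        self.probation_size = probation_size
        self.protected_size = protected_size
        self.probation = OrderedDict()  # new, seen-once entries
        self.protected = OrderedDict()  # entries accessed at least twice

    def get(self, key):
        if key in self.protected:
            self.protected.move_to_end(key)
            return self.protected[key]
        if key in self.probation:
            # Second access: promote the entry into the protected segment.
            value = self.probation.pop(key)
            if len(self.protected) >= self.protected_size:
                # Demote the protected segment's LRU entry to probation.
                old_key, old_val = self.protected.popitem(last=False)
                self._put_probation(old_key, old_val)
            self.protected[key] = value
            return value
        return None

    def set(self, key, value):
        # New entries always start in probation. (For brevity, updating a
        # key that already sits in the protected segment isn't handled.)
        self._put_probation(key, value)

    def _put_probation(self, key, value):
        if key not in self.probation and len(self.probation) >= self.probation_size:
            # One-hit wonders fall out of the probationary segment first.
            self.probation.popitem(last=False)
        self.probation[key] = value
        self.probation.move_to_end(key)
```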
Final Thoughts
Modern cache systems (like Redis or Memcached) often let you choose an eviction policy or even combine them.
The right choice depends on your workload – there’s no one-size-fits-all. LRU is a safe bet for many applications, but if your usage pattern is unique (e.g. a few items dominate access, or data access is cyclic), considering LFU, MRU, or others can improve performance.
The key is to match the policy to how your data is accessed, so you maximize cache hits and minimize evicting something important.
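For example, Redis exposes this choice through its maxmemory-policy setting, which can be changed at runtime. A quick sketch using the redis-py client, assuming a Redis server running on localhost:

```python
import redis  # pip install redis

r = redis.Redis(host="localhost", port=6379)

# Cap memory and ask Redis to evict with (approximate) LRU across all keys.
r.config_set("maxmemory", "100mb")
r.config_set("maxmemory-policy", "allkeys-lru")
```

Other built-in Redis policies include allkeys-lfu, allkeys-random, volatile-ttl (TTL-based eviction among keys that have an expiry set), and noeviction.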