Caches are often used as a cheap way to reduce latency and improve application performance. We often reach for libraries like Guava because they are rock solid and packed with functionality, but every library that pulls in Guava can easily create jar hell and a distributed monolith if you are not careful about your shared libraries. If you are building a single service, using more libraries may be completely fine; in shared libraries, however, it can be really harmful. It always pays to be precise and lean about your dependencies.

Today I want to show that algorithms do not bite (much), and that it is not rocket science to create a simple LRU cache in Java using ZERO libraries. I recorded a video going through the code, explaining what I did and how the algorithm works. You may be thinking "I prefer to use what's already there", and sure, that's fine. At the same time, bloated applications and shared libraries are a real nightmare at scale, so sometimes it is better to copy code or just write a simple implementation yourself. Don't be afraid of algorithms; if you test enough, you will be fine. Let's get started!

The Video

The Code
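The video walks through the actual code; as a minimal sketch of the same idea (not the author's exact implementation, and with class and method names of my own choosing), an LRU cache can be built from a `HashMap` for O(1) lookup plus a doubly linked list that tracks recency, using nothing beyond `java.util`:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal LRU cache sketch: HashMap gives O(1) key lookup, the doubly
// linked list keeps entries ordered from most to least recently used.
public class LruCache<K, V> {
    private static final class Node<K, V> {
        K key; V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel: MRU side
    private final Node<K, V> tail = new Node<>(null, null); // sentinel: LRU side

    public LruCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToFront(node); // a hit makes this entry the most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {       // key already cached: update and refresh
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() >= capacity) {
            // evict the least recently used entry (just before the tail sentinel)
            Node<K, V> lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        linkFront(node);
    }

    private void moveToFront(Node<K, V> node) { unlink(node); linkFront(node); }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void linkFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}
```

Worth noting: the JDK itself ships `LinkedHashMap`, which can do this in a few lines via its access-order constructor and `removeEldestEntry`, so even the "zero libraries" route has a shortcut if you prefer less hand-rolled code.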


Diego Pacheco

Originally published on February 24, 2021.

Brazilian software architect, SWE (Java, Scala, Rust, Go), SOA & DevOps expert, author. Working with EKS/K8s. (Opinions my own)
