Caching is really the holy grail of responsiveness. Consider the difference between hitting your database for every request versus pulling that data directly out of memory. Here is a small comparison of performance-related properties, with pros and cons for each approach.

In-memory cache:
- Pro: near-instantaneous access to data
- Pro: concurrent contention is virtually nil
- Con: limited amount of memory

Database:
- Con: disk seek speed
- Con: concurrent contention as threads wait for disk access
- Con: IO speed
- Con: query execution time
- Con: connection setup/teardown
- Con: marshalling and unmarshalling
- Pro: large amount of disk space available
These points are all fairly obvious, but they highlight how much extra work happens when you don't cache. All of that work taxes the entire system far more than pulling data from memory, which also means a cached application is far more scalable.
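To make the contrast concrete, here is a minimal sketch of a read-through cache. The names (`loadFromDatabase`, the `dbCalls` counter) are hypothetical stand-ins: the expensive path simulates the query, connection, and marshalling work listed above, and the counter shows how many times the cache actually has to pay that cost.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class ReadThroughCache {
    static final Map<Integer, String> cache = new ConcurrentHashMap<>();
    static final AtomicInteger dbCalls = new AtomicInteger();

    // Hypothetical stand-in for the expensive path: connection, query,
    // IO, marshalling. We only count invocations here.
    static String loadFromDatabase(int id) {
        dbCalls.incrementAndGet();
        return "user-" + id;
    }

    static String get(int id) {
        // computeIfAbsent only hits the "database" on a cache miss
        return cache.computeIfAbsent(id, ReadThroughCache::loadFromDatabase);
    }

    public static void main(String[] args) {
        get(42); // miss: goes to the database
        get(42); // hit: served from memory
        get(42); // hit: served from memory
        System.out.println("db calls: " + dbCalls.get()); // prints "db calls: 1"
    }
}
```

Three lookups, one database call: everything after the first request is served at memory speed.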
Traffic was increasing rapidly, and response times were getting worse and worse. It was not the end of the world, but bad enough (over a second per request) that we had to do something before it deteriorated further. Now that we cache almost everything, response times are down to a few milliseconds, comparable to a Google search response.
Caching is actually pretty simple to implement. Whatever solution you use, the interface usually looks like java.util.Map: you put, get, and remove elements.
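As a sketch of that Map-like interface, here is a hypothetical `Cache` contract with a thread-safe in-memory implementation backed by ConcurrentHashMap. The interface and class names are illustrative, not any particular library's API.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical minimal cache contract mirroring java.util.Map semantics.
interface Cache<K, V> {
    void put(K key, V value);
    V get(K key);      // returns null on a miss
    V remove(K key);   // returns the evicted value, or null
}

// A thread-safe in-memory implementation backed by ConcurrentHashMap.
class InMemoryCache<K, V> implements Cache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    public void put(K key, V value) { store.put(key, value); }
    public V get(K key)             { return store.get(key); }
    public V remove(K key)          { return store.remove(key); }
}

public class CacheDemo {
    public static void main(String[] args) {
        Cache<String, String> cache = new InMemoryCache<>();
        cache.put("greeting", "hello");
        System.out.println(cache.get("greeting")); // prints "hello"
        cache.remove("greeting");
        System.out.println(cache.get("greeting")); // prints "null"
    }
}
```

Real cache libraries add eviction policies, expiry, and statistics on top, but the core put/get/remove surface stays this simple.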