Cache

Definition
A cache is a high-speed data storage layer that holds copies of frequently accessed data so that future requests are served faster and the load on primary systems is reduced. By keeping data closer to the processor or application, a cache shortens response times and improves the user experience.
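To make the mechanism concrete, here is a minimal Python sketch of that lookup-then-fallback pattern; fetch_from_origin is a hypothetical stand-in for the slower primary system:

```python
import time

def fetch_from_origin(key: str) -> str:
    # Hypothetical slow data source, standing in for a database or remote API.
    time.sleep(0.1)  # simulate network / disk latency
    return f"value-for-{key}"

cache: dict[str, str] = {}

def get(key: str) -> str:
    # Serve from the cache when possible; otherwise fall back to the
    # origin and keep a copy for next time.
    if key not in cache:
        cache[key] = fetch_from_origin(key)
    return cache[key]

get("page-1")  # slow: first access goes to the origin
get("page-1")  # fast: served from the in-memory copy
```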
For example, a web browser stores cached versions of web pages and images so that they load faster when the user revisits them.
Advanced
Caching exists at multiple layers, including CPU caches, in-memory data stores, and content delivery networks (CDNs). CPU caches hold small amounts of data very close to the processor for nanosecond retrieval. Application-level caches such as Redis and Memcached serve high volumes of queries in real time. Web caches in CDNs deliver content from servers near the user, reducing latency and bandwidth consumption.
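As a rough sketch of how these layers can stack, the snippet below checks a fast process-local dictionary before falling back to a shared application-level store; it assumes the redis-py client and a Redis server on localhost:

```python
import redis  # assumes the redis-py client and a Redis server on localhost

local: dict[str, bytes] = {}             # process-local tier, fastest
shared = redis.Redis(host="localhost")   # shared tier, survives restarts

def get(key: str) -> bytes | None:
    # Check the closest layer first, then fall through to slower ones.
    if key in local:
        return local[key]
    value = shared.get(key)
    if value is not None:
        local[key] = value  # promote into the faster layer for next time
    return value
```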
Advanced strategies include cache invalidation and eviction policies such as least recently used (LRU) eviction, time-to-live (TTL) settings, and write-through or write-back approaches. Proper cache management balances speed with consistency, ensuring that the data served is both fast to retrieve and accurate.
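One way to combine two of these policies is sketched below: an LRU-evicting cache in which each entry also carries a time-to-live. The class is illustrative, not a production implementation.

```python
import time
from collections import OrderedDict

class LRUCache:
    """Least-recently-used eviction combined with a per-entry time-to-live."""

    def __init__(self, capacity: int, ttl_seconds: float):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.entries: OrderedDict[str, tuple[float, object]] = OrderedDict()

    def get(self, key: str):
        if key not in self.entries:
            return None
        stored_at, value = self.entries[key]
        if time.monotonic() - stored_at > self.ttl:
            del self.entries[key]          # expired: invalidate on read
            return None
        self.entries.move_to_end(key)      # mark as most recently used
        return value

    def put(self, key: str, value) -> None:
        self.entries[key] = (time.monotonic(), value)
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the least recently used
```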
Example
An e-commerce platform uses Redis to cache product catalog queries. When customers browse the site, results are delivered from the cache, reducing database strain and speeding up page loads.
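A sketch of what that cache-aside lookup might look like with the redis-py client; query_database is a hypothetical stand-in for the platform's real catalog query, and the 300-second TTL is an arbitrary choice:

```python
import json
import redis  # assumes the redis-py client and a Redis server on localhost

r = redis.Redis(host="localhost")

def get_products(category: str) -> list:
    key = f"catalog:{category}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)            # cache hit: skip the database
    products = query_database(category)      # hypothetical slow catalog query
    r.setex(key, 300, json.dumps(products))  # cache the result for 5 minutes
    return products
```

On the next request for the same category within the TTL window, the query never reaches the database, which is where the reduction in load and page-load time comes from.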