Caching is an important component in a distributed system. It mainly addresses the performance of hot-data access in high-concurrency, big-data scenarios, providing fast, high-performance access to data.
This post shares an architectural approach to caching strategy that I find common and easy to understand. Comments are welcome, and if you have better ideas, please share them.
Fundamentally, a cache delivers high-performance, fast access to data in three ways:
(1) Write/read data on faster storage (devices);
(2) Cache data at the location closest to the application;
(3) Cache data at the location closest to the user.
Caching is used extensively in distributed systems. From a deployment perspective, it appears at the following levels:
(1) CDN cache;
(2) Reverse proxy cache;
(3) Distributed cache;
(4) Local application cache.
Commonly used middleware: Varnish, Nginx, Squid, Memcached, Redis, Ehcache, etc.;
Cached content: files, data, objects;
Caching media: CPU cache, memory (local or distributed), disk (local or distributed).
Cache design needs to solve the following problems:
(1) What to cache?
Which data needs to be cached: 1. hot data; 2. static resources.
(2) Where to cache?
CDN, reverse proxy, distributed cache server, or the local machine (memory, hard disk).
(3) How to cache?
1. Fixed time: for example, cache an item for a specified duration, such as 30 minutes after it is written;
2. Relative time: for example, evict data that has not been accessed in the last 10 minutes.