
What are the Redis cache strategies?


When Redis is used as a cache and its memory fills up, old data is automatically evicted to make room for new data. Memcached works this way by default, and most developers are familiar with the behavior. LRU (Least Recently Used) is the only eviction algorithm supported by Redis.


Eviction strategies

When the maximum memory limit (maxmemory) is reached, Redis decides how to behave based on the policy configured in maxmemory-policy.
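As a minimal sketch of how these two settings work together (assuming the redis-py client and a local Redis instance on the default port, neither of which is part of the original article), the equivalent redis.conf lines are shown in the comments:

```python
import redis

# Assumes a local Redis instance listening on the default port 6379.
r = redis.Redis(host="localhost", port=6379)

# Equivalent to the redis.conf lines:
#   maxmemory 100mb
#   maxmemory-policy allkeys-lru
r.config_set("maxmemory", "100mb")
r.config_set("maxmemory-policy", "allkeys-lru")

# Read the settings back to verify them.
print(r.config_get("maxmemory"))         # {'maxmemory': '104857600'}
print(r.config_get("maxmemory-policy"))  # {'maxmemory-policy': 'allkeys-lru'}
```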

As of Redis 3.0, the current version at the time of writing, the supported policies are:

noeviction: Do not evict anything. When the memory limit is reached, any command that needs more memory simply returns an error. Most write commands consume additional memory (with rare exceptions, such as DEL).

allkeys-lru: Applies to all keys; evict the least recently used (LRU) keys first.

volatile-lru: Applies only to keys with an expire set; evict the least recently used (LRU) keys first.

allkeys-random: Applies to all keys; evict keys at random.

volatile-random: Applies only to keys with an expire set; evict keys at random.

volatile-ttl: Applies only to keys with an expire set; evict keys with the shortest remaining time to live (TTL) first.

If no keys have an expire set, that precondition is not met, and the volatile-lru, volatile-random and volatile-ttl policies behave essentially like noeviction (nothing is evicted).
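As a small illustration of that precondition (hypothetical key names, again assuming redis-py), only keys that carry an expire are candidates for eviction under a volatile-* policy:

```python
import redis

r = redis.Redis()

r.config_set("maxmemory-policy", "volatile-lru")

# Candidate for eviction: the key carries an expire (TTL of one hour).
r.set("session:42", "cached value", ex=3600)

# Not a candidate: no expire is set, so under any volatile-* policy this key
# is never evicted, just as under noeviction.
r.set("config:site-name", "my-site")

print(r.ttl("session:42"))        # ~3600 seconds remaining
print(r.ttl("config:site-name"))  # -1, i.e. no expire set
```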

Choose an eviction policy that suits the characteristics of your system. The policy can also be changed dynamically at runtime with CONFIG SET, and cache hits and misses can be monitored with the INFO command for tuning.
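For example, a sketch of both steps with redis-py (the hit rate below is simply keyspace_hits / (keyspace_hits + keyspace_misses), taken from the stats section of INFO):

```python
import redis

r = redis.Redis()

# Change the eviction policy at runtime; equivalent to the command:
#   CONFIG SET maxmemory-policy allkeys-random
r.config_set("maxmemory-policy", "allkeys-random")

# INFO stats exposes keyspace_hits and keyspace_misses, from which
# a cache hit rate can be computed for tuning.
stats = r.info("stats")
hits = stats["keyspace_hits"]
misses = stats["keyspace_misses"]
total = hits + misses
hit_rate = hits / total if total else 0.0
print(f"hits={hits} misses={misses} hit rate={hit_rate:.2%}")
```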

Generally speaking:

If your data divides into hot data and cold data, that is, some keys are read and written far more often than others, the allkeys-lru policy is recommended. If you are unsure about the specific access pattern of your workload, allkeys-lru is also a good default choice.

If all keys are read and written in a cyclic fashion, or every key is accessed with roughly the same frequency, you can use the allkeys-random policy, since every key has about the same probability of being read or written.

If you want Redis to select keys for eviction based on their TTL, use the volatile-ttl policy.

The main use case for the volatile-lru and volatile-random policies is an instance that holds both cache keys and persistent keys. Generally speaking, though, such workloads are better served by two separate Redis instances.
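A rough sketch of that split (hypothetical ports and key names, assuming redis-py): one instance acts as a pure LRU cache while the other keeps persistent keys and never evicts:

```python
import redis

# Hypothetical layout: the instance on 6379 holds persistent data,
# the one on 6380 is used purely as a cache.
persistent = redis.Redis(port=6379)
cache = redis.Redis(port=6380)

# The cache instance may evict freely under memory pressure...
cache.config_set("maxmemory", "256mb")
cache.config_set("maxmemory-policy", "allkeys-lru")

# ...while the persistent instance rejects writes rather than lose data.
persistent.config_set("maxmemory-policy", "noeviction")

cache.set("page:/home", "<html>...</html>")
persistent.set("user:1001:email", "user@example.com")
```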

It is worth mentioning that setting an expire on a key consumes additional memory, so the allkeys-lru policy makes more efficient use of memory: under that policy you no longer need to set expiration times just to make keys evictable.
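To make the point concrete (hypothetical key name, assuming redis-py): under allkeys-lru a plain SET with no TTL is already evictable, so no per-key expiration needs to be stored:

```python
import redis

r = redis.Redis()
r.config_set("maxmemory-policy", "allkeys-lru")

# No ex= argument: the key is still a candidate for eviction under
# allkeys-lru, and Redis stores no expiration entry for it.
r.set("html:product:17", "<div>...</div>")
print(r.ttl("html:product:17"))  # -1, no expire set
```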
