
Compressing Large Data Sets in Redis With Gzip

Published: 2016-06-07 16:30:03



When publishing it, the post dropped the quote and my comments.

A long post analyzing different scenarios of compressing data stored in Redis using Gzip:

A year and a half ago, I was working with software that used Redis as a buffer to store large sets of text data. We had some bottlenecks there. One of them was related to Redis and the large amount of data we had there (large compared to the amount of RAM). Since then, I've wanted to check whether using Gzip would be a big improvement or just the next bottleneck (CPU). Unfortunately, I no longer have access to that software, which is why I've decided to create a simple test case just to check this matter.
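The setup the author describes can be sketched in a few lines of Python: gzip-compress each value before writing it and decompress on read. This is a minimal illustration, not the author's actual test case; a plain dict stands in for Redis so the snippet runs anywhere, and the key name and payload are made up.

```python
import gzip

def compress_value(text: str) -> bytes:
    """Gzip-compress a string before storing it in Redis."""
    return gzip.compress(text.encode("utf-8"))

def decompress_value(blob: bytes) -> str:
    """Decompress a value read back from Redis."""
    return gzip.decompress(blob).decode("utf-8")

# A dict stands in for Redis here; with the real client you would call
# r.set(key, blob) and r.get(key) on a redis.Redis instance instead.
store = {}

payload = "some large text document " * 1000
store["doc:1"] = compress_value(payload)

print(len(payload.encode("utf-8")), "->", len(store["doc:1"]))
assert decompress_value(store["doc:1"]) == payload
```

The trade-off the post investigates is exactly the one visible here: repetitive text shrinks dramatically, but every read and write now pays a CPU cost for the codec.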

If speed is what matters most, algorithms like Snappy and LZO are a better fit. If data density matters most, then Zopfli is probably the better choice.
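The same speed-versus-density spectrum exists inside gzip's own codec. Snappy, LZO, and Zopfli are third-party libraries, so as a stdlib stand-in this sketch compares zlib at its fastest and densest settings on a repetitive payload; the sample data is invented for illustration.

```python
import zlib

# Highly repetitive sample data, the kind of payload where compression pays off.
data = ('{"user": 12345, "event": "click"}\n' * 2000).encode("utf-8")

fast = zlib.compress(data, level=1)   # fastest setting, lower density
dense = zlib.compress(data, level=9)  # slowest setting, best density zlib offers

print(len(data), len(fast), len(dense))
assert zlib.decompress(fast) == data
assert zlib.decompress(dense) == data
```

Snappy and LZO sit well to the "fast" side of level 1, and Zopfli produces gzip-compatible output denser than level 9 at a much higher CPU cost.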

Original title and link: Compressing Large Data Sets in Redis With Gzip (NoSQL database, myNoSQL)
