
Compressing Large Data Sets in Redis With Gzip

Published: 2016-06-07 16:30:03


When it was published, the post dropped the quote and my comments.

A long post analyzing different scenarios of compressing data stored in Redis using Gzip:

A year and a half ago, I was working with software that used Redis as a buffer to store large sets of text data. We had some bottlenecks there. One of them was related to Redis and the large amount of data we kept in it (large compared to the amount of RAM). Since then, I've wanted to check whether using Gzip would be a big improvement or whether it would just become the next bottleneck (CPU). Unfortunately, I no longer have access to that software, which is why I've decided to create a simple test case just to check this matter.
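The post's own test case isn't reproduced here, but the basic pattern is easy to sketch. The following is a minimal illustration, not code from the post: it assumes the third-party redis-py client and a Redis server on localhost, and simply gzips each value before SET and gunzips it after GET, trading CPU time for memory.

import gzip

import redis  # assumes the third-party redis-py client (pip install redis)

# Assumes a Redis server on localhost:6379; adjust as needed.
r = redis.Redis(host="localhost", port=6379)

def set_compressed(key, text):
    # Gzip the value before writing it; Redis just stores the raw bytes.
    r.set(key, gzip.compress(text.encode("utf-8")))

def get_compressed(key):
    # Read the raw bytes back and decompress them.
    return gzip.decompress(r.get(key)).decode("utf-8")

set_compressed("doc:1", "a large block of text " * 1000)
print(len(get_compressed("doc:1")))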

If speed is what's important, I think algorithms like Snappy and LZO are a better fit. If data density is important, then Zopfli is probably a better fit.
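Snappy, LZO, and Zopfli all require third-party bindings in Python, but the speed-versus-density trade-off itself can be seen with the standard library alone by comparing gzip's fastest and densest settings. This is a rough sketch, not a benchmark from the post:

import gzip
import time

# A repetitive payload compresses well; real text will behave differently.
data = ("some text payload stored in Redis " * 5000).encode("utf-8")

for level in (1, 9):  # 1 favors speed, 9 favors density
    start = time.perf_counter()
    out = gzip.compress(data, compresslevel=level)
    elapsed = (time.perf_counter() - start) * 1000
    print("level %d: %d -> %d bytes in %.2f ms"
          % (level, len(data), len(out), elapsed))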

Original title and link: Compressing Large Data Sets in Redis With Gzip (NoSQL database©myNoSQL)
