php - json_encode runs out of memory on big data
PHP中文网 2017-05-16 13:09:11

When using json_encode to convert a large dataset to JSON, I found that memory gets exhausted. Is there any alternative to json_encode? The problem also seems to occur when looping over the large dataset. How can I solve this?

PHP中文网

Certified Senior PHP Lecturer

All replies (4)
世界只因有你
// Remove the script execution time limit
set_time_limit(0);

// Raise the maximum memory the script may use
ini_set("memory_limit", "2048M");
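
Raising the limits buys headroom, but peak memory still grows with the data. A complementary sketch (not part of this answer) is to emit the JSON array element by element, so only one row's JSON string exists in memory at a time; the function name and the in-memory demo stream here are illustrative, not an existing API:

```php
<?php
// Encode a large list as a JSON array without building the
// whole string in memory: write it out piece by piece.
function streamJsonArray($rows, $out)
{
    fwrite($out, '[');
    $first = true;
    foreach ($rows as $row) {
        if (!$first) {
            fwrite($out, ',');
        }
        // Only one row is ever passed to json_encode at a time.
        fwrite($out, json_encode($row));
        $first = false;
    }
    fwrite($out, ']');
}

// Demo: write to an in-memory stream instead of php://output.
$rows = [['id' => 1], ['id' => 2], ['id' => 3]];
$out  = fopen('php://memory', 'r+');
streamJsonArray($rows, $out);
rewind($out);
$json = stream_get_contents($out);
echo $json; // [{"id":1},{"id":2},{"id":3}]
```

In a web response you would pass a handle to php://output instead of the memory stream, so the rows flow straight to the client.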
巴扎黑

For loops, consider the yield keyword (generators) to cut memory consumption.
As for json_encode itself, the question is too vague to say more.
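
To make the yield suggestion concrete, here is a minimal sketch: a generator hands back one row per iteration instead of materializing the whole result set as an array. The generator and its synthetic rows are invented for illustration:

```php
<?php
// A generator produces rows one at a time; the full result
// set is never held in memory as one big array.
function numberRows($limit)
{
    for ($i = 1; $i <= $limit; $i++) {
        // Each row exists only for the duration of one loop pass.
        yield ['id' => $i, 'square' => $i * $i];
    }
}

$sum = 0;
foreach (numberRows(1000) as $row) {
    $sum += $row['square'];
}
echo $sum; // 333833500
```

The same pattern applies to reading a large file or database cursor: yield each record from inside the fetch loop rather than collecting everything first.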

黄舟

If the data is only saved and parsed back by PHP itself, you can use serialize(), which performs much better than json_encode.

My answer isn't comprehensive, so feel free to skip it; it only suits certain specific scenarios.
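
A minimal comparison sketch of the two round-trips, assuming (as this answer does) that the data never has to leave PHP; note that serialize() output is PHP-only, whereas JSON is portable:

```php
<?php
$data = ['user' => 'alice', 'scores' => [90, 85, 77]];

// serialize/unserialize: PHP-native format, preserves PHP
// array structure and types exactly.
$s = serialize($data);
$restored = unserialize($s);

// json_encode/json_decode: portable, but decode needs the
// second argument set to true to get arrays back, not objects.
$j = json_decode(json_encode($data), true);

var_dump($restored === $data); // bool(true)
var_dump($j === $data);        // bool(true)
```

Whether serialize is actually faster for your payload is worth measuring on real data; the safe claim is only that it skips the JSON text format entirely.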

PHPzhong

When I run into large-data problems, I always first ask whether the data can be split. For example, to cache a data list, I can cache only the IDs, then fetch each item by its ID (with every item cached individually). Of course, the specifics have to be analyzed case by case.
Also, serialize can be slow, and when you later need the data as JSON, reading it back and re-parsing it becomes a problem of its own.
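
The "cache the IDs, fetch items individually" idea above might be sketched like this; cacheGet/cacheSet are purely illustrative stand-ins for whatever cache (Redis, Memcached, APCu) is actually in use:

```php
<?php
// Toy in-memory cache standing in for Redis/Memcached/APCu.
$cache = [];
function cacheSet(&$cache, $key, $value) { $cache[$key] = $value; }
function cacheGet(&$cache, $key) { return $cache[$key] ?? null; }

// Cache the (small) ID list once...
cacheSet($cache, 'list:ids', [101, 102]);
// ...and each item under its own key.
cacheSet($cache, 'item:101', ['id' => 101, 'name' => 'foo']);
cacheSet($cache, 'item:102', ['id' => 102, 'name' => 'bar']);

// Rebuilding the list touches only the items actually needed,
// instead of serializing/encoding one huge cached blob.
$items = [];
foreach (cacheGet($cache, 'list:ids') as $id) {
    $items[] = cacheGet($cache, 'item:' . $id);
}
echo count($items); // 2
```

The win is that no single cache entry, and no single json_encode call, ever has to cover the entire dataset.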
