When using json_encode to convert a large data set into JSON, I found that memory gets exhausted. I haven't found a solution that can replace json_encode. The same problem also seems to occur when looping over the big data. How can this be solved?
For loops, you can consider the yield keyword (generators) to reduce memory consumption.
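A minimal sketch of that idea, assuming the big data lives in a file (the path below is just a placeholder): a generator with yield hands back one row at a time, so the whole set never sits in memory at once, and each row can be json_encode'd individually if needed.

```php
<?php
// Sketch only: a generator yields one row at a time instead of
// building the whole array in memory first.
function readLargeCsv(string $path): Generator
{
    $fh = fopen($path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        yield $row;  // only the current row is held in memory
    }
    fclose($fh);
}

// Usage: memory stays flat no matter how many rows the file has.
// The file path here is a hypothetical placeholder.
foreach (readLargeCsv('/tmp/big_data.csv') as $row) {
    // process one row, e.g. echo json_encode($row), "\n";
}
```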
The json_encode part of the question is described too vaguely.
If the data is only saved and parsed back by PHP itself, you can use serialize, which has much better performance than json_encode.
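A minimal sketch of that approach (the cache file path is just a placeholder): serialize stores PHP's native format and unserialize restores it directly, so JSON never enters the picture when only PHP reads the data.

```php
<?php
// Hypothetical cache file path used only for illustration.
$cacheFile = '/tmp/report_cache.bin';

$data = ['total' => 12345, 'rows' => [['id' => 1], ['id' => 2]]];

// Store using PHP's native serialization instead of JSON.
file_put_contents($cacheFile, serialize($data));

// Later, restore the original PHP structure directly.
$restored = unserialize(file_get_contents($cacheFile));
var_dump($restored['total']); // int(12345)
```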
My answer isn't comprehensive, so feel free to skip it; it only suits certain specific scenarios...
Usually, when I run into a large-data problem, my first thought is whether the data can be split. For example, if I want to cache a data list, I can cache only the IDs and then fetch each item's details by ID (with those details cached individually); a rough sketch follows below. Of course, the right approach has to be judged case by case.
Also, serializing that much data will be very slow, and whenever you need to work with the resulting JSON, reading and parsing it back is a problem as well.
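A rough sketch of the ID-first caching idea, assuming a Redis cache via the phpredis extension; the key names and the loadItemFromDb() helper are hypothetical placeholders. The point is that the ID list is one small entry, each item is cached separately, and json_encode only ever sees one item at a time instead of one giant blob.

```php
<?php
// Hypothetical loader; in practice this would query the database.
function loadItemFromDb(int $id): array
{
    return ['id' => $id, 'name' => "item-$id"];
}

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// 1. Cache just the ID list -- small and cheap to json_encode.
$ids = [101, 102, 103];                      // e.g. from a lightweight query
$redis->set('list:ids', json_encode($ids));

// 2. Cache each item under its own key, encoded one at a time,
//    so no single json_encode call ever sees the whole data set.
foreach ($ids as $id) {
    $redis->set("item:$id", json_encode(loadItemFromDb($id)));
}

// 3. Read side: decode the small ID list, then fetch items individually.
$ids = json_decode($redis->get('list:ids'), true);
foreach ($ids as $id) {
    $item = json_decode($redis->get("item:$id"), true);
    // ... use $item
}
```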