I currently use PHP to export a CSV. The flow is: first create the file, then query the data in batches, then write the rows one by one inside a foreach loop, and finally, once writing is complete, call the download method to download the file. My problem is that when there is a lot of data, the PHP script's execution ends while the file is still being written, so the download method is never reached and the file cannot be downloaded. Is there any way to solve this?
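For reference, a minimal sketch of the flow described above; queryBatch() and the column names are hypothetical placeholders, not actual project code:

```php
<?php
// Sketch of the described export flow. queryBatch() and the
// column names are hypothetical placeholders.
$file = fopen('/tmp/export.csv', 'w');           // 1. create the file
fputcsv($file, ['id', 'name', 'created_at']);    // header row

$page = 0;
while ($rows = queryBatch($page++, 1000)) {      // 2. query the data in batches
    foreach ($rows as $row) {
        fputcsv($file, $row);                    // 3. write rows one by one
    }
}
fclose($file);

// 4. the download only starts after all writing finishes -- with a large
//    dataset execution stops before this point is ever reached
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');
readfile('/tmp/export.csv');
```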
With this amount of data, a single PHP request will usually time out.
It's better to have the PHP script process only a small batch (say, 10 rows) per request and append them to the CSV file, so that each individual request returns quickly.
Then write a JS routine that calls this PHP script via AJAX in a loop until all the data has been processed, and then triggers the download in the page. A bonus of this approach is that the front end can display the processing progress in real time.
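A rough sketch of what that per-batch PHP endpoint could look like; the queryBatch() and countRows() helpers, the file path, and the batch size are assumptions for illustration, not code from the question:

```php
<?php
// Sketch of a per-batch endpoint: each request appends a small batch to the
// CSV and reports progress, so the front end can loop over it with AJAX and
// show progress. queryBatch() and countRows() are hypothetical helpers.
$batchSize = 10;
$offset    = (int)($_GET['offset'] ?? 0);
$csvPath   = sys_get_temp_dir() . '/export.csv';

$file = fopen($csvPath, $offset === 0 ? 'w' : 'a'); // overwrite on first call, append after
if ($offset === 0) {
    fputcsv($file, ['id', 'name', 'created_at']);   // write the header row once
}

$rows = queryBatch($offset, $batchSize);            // fetch the next batch
foreach ($rows as $row) {
    fputcsv($file, $row);
}
fclose($file);

header('Content-Type: application/json');
echo json_encode([
    'done'   => count($rows) < $batchSize,          // last batch reached
    'offset' => $offset + count($rows),
    'total'  => countRows(),                        // lets the page show progress
]);
```

The JS side would keep calling this URL with the returned offset until `done` is true, then point the browser at the finished file.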
Since CSV is essentially plain text, you can combine JS and PHP: JS issues multiple requests to fetch the data as strings in batches, and finally JS merges the pieces and generates the file itself (using a data: URL as the download link).
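A sketch of the PHP side of that idea, returning one batch of CSV text per request so the browser can concatenate the chunks; queryBatch() is again a hypothetical helper:

```php
<?php
// Sketch: return one batch of CSV rows as plain text; the browser collects
// the chunks and builds a data: URL (or Blob) for the final download.
// queryBatch() is a hypothetical helper.
$batchSize = 500;
$offset    = (int)($_GET['offset'] ?? 0);

$out = fopen('php://temp', 'r+');                   // build this chunk of CSV in memory
foreach (queryBatch($offset, $batchSize) as $row) {
    fputcsv($out, $row);
}
rewind($out);

header('Content-Type: text/csv; charset=utf-8');
echo stream_get_contents($out);                     // JS appends this chunk to its buffer
fclose($out);
```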
Keep the download button disabled while the file is not yet completely generated, and only enable it once generation has finished.
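One way to support that, sketched under the assumption that the generator writes a marker file when it finishes (the export.csv.done convention is made up for illustration):

```php
<?php
// Sketch of a status endpoint the page can poll: the button stays disabled
// until this reports the export as finished. The marker-file convention
// (export.csv.done) is an assumption, not part of the original setup.
$csvPath  = sys_get_temp_dir() . '/export.csv';
$donePath = $csvPath . '.done';

header('Content-Type: application/json');
echo json_encode([
    'ready' => file_exists($donePath),              // generator touches this when done
    'bytes' => file_exists($csvPath) ? filesize($csvPath) : 0,
]);
```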