Laravel Excel queue consuming too much RAM
P粉884667022 2024-03-21 19:11:50

I have set up a Laravel queue to read Excel files using the Laravel Excel package, and it works great for small files.

But for large files (about 100 MB, 400k records), the import takes too long and consumes nearly 40 GB of the server's RAM.

I have set up Supervisor to run the queue:work command. My server has 60 GB of memory. For small files everything works fine, but for large files it doesn't.
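For reference, a typical Supervisor entry for Laravel queue workers looks like the sketch below. The program name, paths, and worker count are placeholders; the `--memory` and `--timeout` options are real `queue:work` flags and are worth setting explicitly for long-running imports, since `--memory` restarts a worker once it crosses the given threshold instead of letting it grow unbounded.

```ini
; Hypothetical supervisord program entry -- adjust paths and numprocs.
[program:laravel-worker]
; --memory=512 restarts the worker after it exceeds ~512 MB;
; --timeout must be longer than the slowest expected job.
command=php /var/www/app/artisan queue:work --sleep=3 --tries=1 --memory=512 --timeout=3600
numprocs=4
autostart=true
autorestart=true
; give jobs time to finish before Supervisor kills the process
stopwaitsecs=3700
```

Note that `--memory` only bounds each worker process; if a single job (one huge import) needs 40 GB by itself, the job must be split up, which is what the answers below suggest.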

I also checked the query times using Telescope, but no query took a long time.


Replies (2)
P粉726234648

There is no single direct answer to your question; a lot depends on the result you are after. You will have to work out an approach that fits your case.

The first thing I would look at is chunking (partitioning) the large Excel file and processing the chunks through the queue. You could also take advantage of Laravel's job batching.

Another thing you could introduce is a separate service, so that these heavy imports run on another, more powerful machine.

But like I said, there is no single solution to a problem like this. You have to figure this out yourself.

P粉455093123

For anyone facing this kind of problem, I recommend using Spout. It works like a charm; I tried three PHP libraries for this, and in the end only Spout worked.

https://opensource.box.com/spout/

https://github.com/box/spout
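The reason Spout stays light is that it streams the spreadsheet row by row instead of materializing the whole file in memory. A minimal read loop with the box/spout v3 API (file path is a placeholder):

```php
<?php

require 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

// Streaming reader: rows are yielded one at a time, so memory use
// stays roughly flat regardless of file size.
$reader = ReaderEntityFactory::createXLSXReader();
$reader->open('/path/to/large.xlsx');

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $cells = $row->toArray(); // plain array of cell values
        // ... process or batch-insert $cells here ...
    }
}

$reader->close();
```

Combined with batched database inserts, this kind of loop can be wrapped in a queued job without the memory blow-up described in the question.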

php.cn: public-welfare online PHP training, helping PHP learners grow quickly!