
Example analysis of big data processing ideas in PHP

小云云
Release: 2023-03-21 06:58:02


I ran into a problem with large data processing for reports on a previous version of the site. I had solved it with synchronous handler code, but it ran very slowly, and I had to raise the maximum script execution time because the report takes 10 to 15 minutes to run. Is there a better way to handle large amounts of data on a PHP site? Ideally I'd like to run the job in the background and make it as fast as possible. The job involves processing thousands of financial records. I rebuilt the site with Laravel.

Most popular answer (from spin81):

People are telling you to use queues and the like, and that's a good idea, but the problem doesn't seem to be with PHP itself. Laravel and OOP are great, but a report generator like the one you describe shouldn't be giving you trouble. For a different perspective, I'd like to see the SQL queries you use to fetch this data. As others have said, if your tables only have thousands of rows, the report shouldn't take 10 to 15 minutes to complete. In fact, if you do everything right, you can probably process thousands of records and produce the same report in about a minute.

1. If you are running thousands of queries, see if you can get by with just a few instead. I once took a PHP function that issued 70,000 queries and reduced it to about a dozen, cutting its run time from minutes to a fraction of a second.
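
For illustration, here is a minimal sketch of that idea using plain PDO (the `orders` table, its columns, and `$pdo` are invented for the example, not taken from the original question): replace a query-per-record loop with a single query that uses an IN (...) clause.

```php
<?php
// Illustrative sketch only: $pdo is an existing PDO connection and
// `orders` is a made-up table. The slow pattern runs one query per ID.
function fetchOrdersSlow(PDO $pdo, array $orderIds): array
{
    $orders = [];
    foreach ($orderIds as $id) {
        $stmt = $pdo->prepare('SELECT * FROM orders WHERE id = ?');
        $stmt->execute([$id]);
        $orders[$id] = $stmt->fetch(PDO::FETCH_ASSOC); // N round trips for N IDs
    }
    return $orders;
}

// The faster pattern fetches every row in one query with an IN (...) clause.
function fetchOrdersFast(PDO $pdo, array $orderIds): array
{
    $placeholders = implode(',', array_fill(0, count($orderIds), '?'));
    $stmt = $pdo->prepare("SELECT * FROM orders WHERE id IN ($placeholders)");
    $stmt->execute(array_values($orderIds));

    $orders = [];
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        $orders[$row['id']] = $row; // index by ID for easy lookup
    }
    return $orders; // one round trip instead of thousands
}
```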

2. Run EXPLAIN on your queries to see whether you are missing any indexes. I once improved a query's efficiency by four orders of magnitude just by adding an index, and that is not an exaggeration. If you are using MySQL, learn to use EXPLAIN; this bit of "black magic" will stun you and your friends.
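
As a rough sketch of how you might check this from PHP (assuming a PDO connection and a made-up `payments` table), run EXPLAIN on the slow query and look at the `type`, `key`, and `rows` columns; `type = ALL` with no key means a full table scan and usually points to a missing index.

```php
<?php
// Hypothetical slow query against a made-up `payments` table.
$sql = "SELECT user_id, amount FROM payments WHERE created_at >= '2023-01-01'";

// EXPLAIN shows how MySQL plans to execute the query.
$plan = $pdo->query('EXPLAIN ' . $sql)->fetchAll(PDO::FETCH_ASSOC);

foreach ($plan as $row) {
    // type = ALL with key = NULL means a full table scan: an index is missing.
    printf("table=%s type=%s key=%s rows=%s\n",
        $row['table'], $row['type'], $row['key'], $row['rows']);
}

// Adding an index on the filtered column is usually the fix, e.g.:
//   ALTER TABLE payments ADD INDEX idx_payments_created_at (created_at);
```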

3. If you are running a SQL query and then adding up lots of numbers from the results in PHP, see if you can instead use aggregate functions such as SUM() and AVG() together with a GROUP BY clause. As a general rule, let the database do as much of the calculation as possible. One very useful tip: in MySQL at least, a boolean expression evaluates to 0 or 1, and if you get creative you can do some very surprising things with SUM() and its friends.
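
Here is a small sketch of both points, again with an invented `payments` table and an assumed `$pdo` connection: the grouping and arithmetic run inside MySQL, and SUM() over a boolean expression counts matching rows in the same pass.

```php
<?php
// Illustrative only: `payments` and its columns are invented names.
// Let MySQL do the grouping and arithmetic instead of looping in PHP.
$sql = "
    SELECT user_id,
           SUM(amount)              AS total_amount,
           AVG(amount)              AS average_amount,
           SUM(status = 'refunded') AS refunded_count -- boolean expression is 0 or 1 per row
      FROM payments
     GROUP BY user_id
";

foreach ($pdo->query($sql, PDO::FETCH_ASSOC) as $row) {
    printf(
        "user %d: total %.2f, avg %.2f, %d refunds\n",
        $row['user_id'],
        $row['total_amount'],
        $row['average_amount'],
        $row['refunded_count']
    );
}
```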

4. Okay, a final tip on the PHP side: check whether you are computing the same expensive numbers over and over. For example, suppose the cost of 1,000 bags of potatoes is expensive to calculate; you don't need to compute it 500 times. Store the result in an array (or something similar) the first time and reuse it, instead of repeating the same calculation. This technique is called memoization, and it often works wonders in reports like yours.
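
A minimal memoization sketch in PHP, with a made-up cost function standing in for the expensive calculation: results are cached in an array keyed by the input, so each distinct input is computed only once.

```php
<?php
// Hypothetical expensive calculation; imagine queries or heavy math inside.
function potatoBagCost(int $bags): float
{
    usleep(100000); // stand-in for the real, slow work
    return $bags * 1.37;
}

// Memoized wrapper: each distinct input is computed once, then served from a cache.
function memoizedPotatoBagCost(int $bags): float
{
    static $cache = [];
    if (!array_key_exists($bags, $cache)) {
        $cache[$bags] = potatoBagCost($bags);
    }
    return $cache[$bags];
}

// 500 calls for the same quantity trigger only one real calculation.
for ($i = 0; $i < 500; $i++) {
    $cost = memoizedPotatoBagCost(1000);
}
echo $cost, PHP_EOL;
```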