This article introduces a method for reading large CSV files into a database with PHP. Interested readers can refer to it; I hope it proves helpful.
How does PHP read large CSV files and import them into the database?
A CSV file with millions of records can easily reach several hundred MB, and reading it naively in one pass is likely to time out or freeze the script.
To import its data into the database successfully, processing the file in batches is essential.
The following function reads a specified number of rows from a CSV file, starting at a given offset:
/**
 * csv_get_lines — read a given range of lines from a CSV file
 *
 * @param string $csvfile path to the CSV file
 * @param int    $lines   number of lines to read
 * @param int    $offset  number of lines to skip before reading
 * @return array|false    the parsed rows, or false if the file cannot be opened
 */
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!$fp = fopen($csvfile, 'r')) {
        return false;
    }
    // Advance the file pointer past the first $offset lines.
    $i = 0;
    while ($i < $offset && fgets($fp) !== false) {
        $i++;
    }
    // Read up to $lines rows, parsing each as CSV; stop early at end of file.
    $data = array();
    $j = 0;
    while ($j++ < $lines && ($row = fgetcsv($fp)) !== false) {
        $data[] = $row;
    }
    fclose($fp);
    return $data;
}
Example call:
$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);
The function works by line positioning: it moves the file pointer to the desired row by skipping the specified number of lines from the beginning of the file.
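Note that each call to csv_get_lines() reopens the file and scans it line by line from the start, so the cost of a single call grows with $offset. For an occasional lookup this is fine; for a full import, the offset should simply advance by the batch size on each call, as in the sketch after the next paragraph.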
How the data is then written to the database is beyond the scope of this article.
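Still, for reference, here is a minimal sketch of a batch-import loop built on csv_get_lines(). It is illustrative only: the PDO connection details, the table name big_table, and its three columns are assumptions, and the batch size is arbitrary; adapt them to your own schema.

// Minimal batch-import sketch (illustrative; adapt to your own schema).
// Assumptions: a MySQL database reachable via PDO, and a hypothetical
// table `big_table` whose three columns match the CSV layout.
set_time_limit(0); // a multi-million-row import can exceed the default time limit

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $pdo->prepare('INSERT INTO big_table (col1, col2, col3) VALUES (?, ?, ?)');

$batchSize = 1000;
$offset = 0;
while (true) {
    $rows = csv_get_lines('path/bigfile.csv', $batchSize, $offset);
    if ($rows === false || count($rows) === 0) {
        break; // file could not be opened, or no rows are left
    }
    $pdo->beginTransaction(); // one transaction per batch keeps inserts fast
    foreach ($rows as $row) {
        $stmt->execute($row); // assumes each row has exactly three fields
    }
    $pdo->commit();
    $offset += $batchSize;
}

Wrapping each batch in a transaction avoids committing row by row, which is usually the main bottleneck when inserting large volumes of data.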
csv_get_lines() has been tested on files of up to 500 MB and runs smoothly. It has not been tested on larger files, so weigh that before using it, or improve it as needed.
This example of reading a large CSV file and importing it into the database with PHP is everything the editor has to share; I hope it can serve as a useful reference.