For a CSV file with millions of rows, the file size can reach hundreds of MB; reading it in a single pass is likely to time out or hang. To import the data from such a CSV file into a database successfully, processing it in batches is essential.
The following function reads a specified range of lines from a CSV file:
The code is as follows:
/**
 * csv_get_lines Reads a given number of lines from a CSV file
 * @param string $csvfile Path to the CSV file
 * @param int    $lines   Number of lines to read
 * @param int    $offset  Line number to start reading from (0-based)
 * @return array|false    Parsed rows, or false if the file cannot be opened
 **/
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!$fp = fopen($csvfile, 'r')) {
        return false;
    }
    // Skip the first $offset lines to position the file pointer.
    $i = 0;
    while ($i < $offset && false !== fgets($fp)) {
        $i++;
    }
    // Read and parse up to $lines rows, stopping at end of file.
    $data = array();
    $j = 0;
    while ($j++ < $lines && !feof($fp)) {
        if (false !== ($row = fgetcsv($fp))) {
            $data[] = $row;
        }
    }
    fclose($fp);
    return $data;
}
Calling method:
$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);
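Since the goal is to import the data into a database, the function above can drive a simple batch-import loop. The sketch below is illustrative rather than part of the original code: the batch size is arbitrary, and insert_batch() is a hypothetical helper standing in for whatever database insert you use:

$batchSize = 10000;
$offset = 0;
while (true) {
    // Fetch the next batch of rows from the CSV file.
    $rows = csv_get_lines('path/bigfile.csv', $batchSize, $offset);
    if ($rows === false || count($rows) === 0) {
        break; // unreadable file, or no rows left
    }
    insert_batch($rows); // hypothetical database insert helper
    $offset += $batchSize;
}

Note that each call reopens the file and skips forward from line 0, so the total work grows quadratically with file size; for a one-off import of a very large file, a single streaming pass is cheaper.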
The function uses a line-positioning approach: it moves the file pointer forward by reading and discarding the given number of starting lines.
The above function has been tested on files up to 500 MB and runs smoothly; it has not been tested on larger files, so evaluate or improve it as needed before use.
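For files well beyond 500 MB, one possible improvement, sketched here under the same hypothetical insert_batch() helper, is to keep a single open handle and stream batches with a generator (PHP 5.5+), so the file is read exactly once:

function csv_stream_batches($csvfile, $batchSize) {
    if (!$fp = fopen($csvfile, 'r')) {
        return; // nothing to yield if the file cannot be opened
    }
    $batch = array();
    while (false !== ($row = fgetcsv($fp))) {
        $batch[] = $row;
        if (count($batch) >= $batchSize) {
            yield $batch;
            $batch = array();
        }
    }
    if ($batch) {
        yield $batch; // final partial batch
    }
    fclose($fp);
}

foreach (csv_stream_batches('path/bigfile.csv', 10000) as $rows) {
    insert_batch($rows); // hypothetical database insert helper
}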