I'm new to PHP. I need to use PHPExcel to import 10,000 records (43 columns each) into a database. Before inserting each record, I first query the database to check whether it already exists: if it doesn't, I insert it and update a flag in the original table; otherwise I skip the record.
Right now, importing 2,000 rows takes over 20 seconds, and importing 10,000 rows fails with a 500 error; the uploaded file doesn't even show up in the upload directory.

```php
$result = move_uploaded_file($_FILES['inputExcel']['tmp_name'], $uploadfile);
if (!$result) {
    die('no file!');
}

$objReader = PHPExcel_IOFactory::createReader('CSV')
    ->setDelimiter(',')
    ->setInputEncoding('GBK')
    ->setEnclosure('"')
    ->setLineEnding("\r\n")
    ->setSheetIndex(0);
$objPHPExcel = $objReader->load($uploadfile);
$sheet = $objPHPExcel->getSheet(0);
$highestRowNum = $sheet->getHighestRow();
$highestColumn = $sheet->getHighestColumn();
$highestColumnNum = PHPExcel_Cell::columnIndexFromString($highestColumn);
echo $highestRowNum . '+' . $highestColumnNum;
```
At this point I can't get the row and column counts. I've already added set_time_limit(0) to the code and raised memory_limit to 128M. Any pointers would be appreciated.
Memory is most likely being exhausted. You can iterate with yield to limit how many rows are processed at a time; in effect it works like a coroutine. See the sketch below.
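A minimal sketch of the generator idea, assuming the upload has already been saved to `$uploadfile` and parsing with PHP's built-in `fgetcsv()` instead of PHPExcel:

```php
<?php
// Sketch only: stream CSV rows through a generator so the whole file
// never lives in memory at once. fgetcsv() parses one line per call.
function readCsvRows($path)
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("cannot open $path");
    }
    while (($row = fgetcsv($handle)) !== false) {
        yield $row; // hand one row to the caller, then pause here
    }
    fclose($handle);
}

foreach (readCsvRows($uploadfile) as $row) {
    // insert/deduplicate $row here; memory use stays flat
}
```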
A single import of 10,000 rows will easily time out and puts a heavy load on the server. Consider splitting it into batches of 1,000 and importing over multiple requests. This kind of bulk import is best done asynchronously so the user doesn't have to wait; coroutines are another option. See the batching sketch below.
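A hedged sketch of batching, reusing the `readCsvRows()` generator from the sketch above. `$pdo`, `my_table`, and `col1..col3` are placeholder names, and it assumes a unique index on the table so `INSERT IGNORE` can replace the per-row existence query:

```php
<?php
// Sketch: flush every 1,000 rows as one multi-row INSERT inside a
// single statement. $pdo is an already-connected PDO handle.
$batchSize = 1000;
$batch = [];

$flush = function (array $rows) use ($pdo) {
    if (empty($rows)) {
        return;
    }
    $tuples = implode(',', array_fill(0, count($rows), '(?,?,?)'));
    // INSERT IGNORE skips rows that hit the unique key, replacing the
    // SELECT-before-INSERT round trip described in the question.
    $stmt = $pdo->prepare("INSERT IGNORE INTO my_table (col1, col2, col3) VALUES $tuples");
    $stmt->execute(array_merge(...$rows));
};

foreach (readCsvRows($uploadfile) as $row) {
    $batch[] = array_slice($row, 0, 3); // example keeps 3 of the 43 columns
    if (count($batch) >= $batchSize) {
        $flush($batch);
        $batch = [];
    }
}
$flush($batch); // flush the remainder
```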
Push the uploaded file's name onto a queue, then write a script that reads file names from the queue and imports those files. Run the script from a crontab scheduled task, or write it as a long-running service.
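One way to wire this up, purely as an assumption since the answer doesn't name a queue backend: push the saved file path onto a Redis list in the upload handler, and let a cron-driven worker pop and import. `importCsvFile()` is a hypothetical stand-in for the actual import routine:

```php
<?php
// worker.php -- hypothetical cron worker; run e.g. every minute via:
//   * * * * * php /path/to/worker.php
// The upload handler would enqueue with:
//   $redis->lPush('csv_import_queue', $uploadfile);
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

// drain the queue: rPop returns false once the list is empty
while (($path = $redis->rPop('csv_import_queue')) !== false) {
    importCsvFile($path); // your import routine (e.g. the batched sketch above)
}
```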
Since the file is CSV, there's no need for PHPExcel at all; that library eats far too much memory. I've used this one before: https://packagist.org/package... (it seems to support only PHP 7). I won't write a demo here, look it up in the documentation yourself. It imports hundreds of thousands of rows without strain.
Here's another one: https://packagist.org/package... It also handles large files with no strain. I'd suggest first understanding what CSV actually is; you can even write a CSV import class yourself, which feels the most efficient and lightweight. A sketch of that follows.
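A sketch of such a hand-rolled class, assuming the same GBK-encoded upload as in the question; `fgetcsv()` does the parsing, and each cell is converted to UTF-8 to mirror the `setInputEncoding('GBK')` call:

```php
<?php
// Sketch only: a minimal hand-rolled CSV importer class.
class CsvImporter
{
    private $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function rows()
    {
        $h = fopen($this->path, 'r');
        if ($h === false) {
            throw new RuntimeException('cannot open ' . $this->path);
        }
        while (($row = fgetcsv($h)) !== false) {
            // convert each cell from GBK (the upload's encoding) to UTF-8
            yield array_map(function ($cell) {
                return mb_convert_encoding((string) $cell, 'UTF-8', 'GBK');
            }, $row);
        }
        fclose($h);
    }
}

foreach ((new CsvImporter($uploadfile))->rows() as $row) {
    // each $row holds the 43 columns, already UTF-8
}
```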