Reprinted: http://blog.csdn.net/huyanping/article/details/7068356
We often need to export data from a database to an Excel file. Open-source libraries such as PHPExcel make this easy to implement, but they handle large data sets poorly and quickly hit PHP's memory limit. The approach here uses fputcsv() to write CSV rows and streams the output directly to the browser as an Excel-compatible file.
// Output the Excel file headers; replace user.csv with whatever file name you want
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment;filename="user.csv"');
header('Cache-Control: max-age=0');
// Fetch data from the database. To save memory, do not read everything
// into memory at once; reading row by row from the statement handle is enough.
$sql = 'select * from tbl where ……';
$stmt = $db->query($sql);
// Open a PHP file handle; php://output writes directly to the browser
$fp = fopen('php://output', 'a');
// Output the Excel column header row
$head = array('Name', 'Gender', 'Age', 'Email', 'Phone', '...');
foreach ($head as $i => $v) {
    // Excel reads CSV as GBK, so the fields must be converted
    // from UTF-8 or the characters come out garbled
    $head[$i] = iconv('utf-8', 'gbk', $v);
}
// Write data to the file handle through fputcsv
fputcsv($fp, $head);
// Counter
$cnt = 0;
// Flush the output buffer every $limit rows; not too large, not too small
$limit = 100000;
// Fetch rows one at a time so memory use stays flat
while ($row = $stmt->fetch(Zend_Db::FETCH_NUM)) {
    $cnt++;
    if ($limit == $cnt) { // Flush the output buffer so it does not grow too large
        ob_flush();
        flush();
        $cnt = 0;
    }
    foreach ($row as $i => $v) {
        $row[$i] = iconv('utf-8', 'gbk', $v);
    }
    fputcsv($fp, $row);
}
fclose($fp);
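The $stmt handle above comes from Zend_Db, which is framework-specific. Assuming plain PDO instead, the whole export can be sketched as one self-contained script; the DSN, credentials, table and column names below are placeholders, not from the original article:

```php
<?php
// Streaming CSV export sketch using PDO instead of Zend_Db.
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment;filename="user.csv"');
header('Cache-Control: max-age=0');

$db   = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass');
$stmt = $db->query('SELECT name, gender, age, email, phone FROM tbl');

$fp = fopen('php://output', 'a');

// Convert every field of a row from UTF-8 to GBK so Excel reads it correctly
function to_gbk($row) {
    foreach ($row as $i => $v) {
        $row[$i] = iconv('utf-8', 'gbk', (string) $v);
    }
    return $row;
}

// Header row
fputcsv($fp, to_gbk(array('Name', 'Gender', 'Age', 'Email', 'Phone')));

$cnt   = 0;
$limit = 100000; // flush the output buffer every $limit rows
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    if (++$cnt === $limit) {
        ob_flush();
        flush();
        $cnt = 0;
    }
    fputcsv($fp, to_gbk($row));
}
fclose($fp);
```

Because rows are fetched and written one at a time, memory use stays constant regardless of how many rows the query returns.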