Streaming Large Files with PHP
When a file is larger than PHP's memory_limit allows, it must be streamed directly to the user's browser rather than loaded into memory in one piece. Streaming keeps memory usage constant regardless of file size and avoids "Allowed memory size exhausted" fatal errors.
An obvious first attempt is the file_get_contents() function. However, it reads the entire file into memory at once, which is exactly what we need to avoid for large files.
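To see the alternative in miniature, the sketch below reads a file in fixed-size chunks with fopen()/fread(), so only one chunk is ever held in memory at a time. The temporary file, its 5 MB size, and the 1 MB chunk size are illustrative choices, not values from the article.

```php
<?php
// Sketch: reading a file in fixed-size chunks instead of all at once.
// The file path and sizes here are illustrative.
$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, str_repeat('x', 5 * 1024 * 1024)); // 5 MB test file

$chunkSize = 1024 * 1024; // read 1 MB at a time
$total = 0;
$handle = fopen($path, 'rb');
while (!feof($handle)) {
    $chunk = fread($handle, $chunkSize);
    // In a real download you would echo the chunk here instead of buffering it.
    $total += strlen($chunk);
}
fclose($handle);
unlink($path);

echo $total; // total bytes read, chunk by chunk
```

At any moment the script holds at most one chunk (1 MB here) in memory, no matter how large the file is.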
To overcome this limitation, read the file in smaller chunks and send each chunk to the client as soon as it is read. Here's a recommended implementation:
```php
<?php
define('CHUNK_SIZE', 1024 * 1024); // size (in bytes) of each chunk

// Read a file and send its contents to the browser chunk by chunk.
function readfile_chunked($filename, $retbytes = true)
{
    $cnt    = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        if (ob_get_level() > 0) {
            ob_flush(); // only flush an output buffer if one is active
        }
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}

if (/* User is logged in */) {
    $filename = 'path/to/your/file';
    $mimetype = 'mime/type';
    header('Content-Type: ' . $mimetype);
    readfile_chunked($filename);
} else {
    echo 'Access denied.';
}
```
In this code, the readfile_chunked() function reads the file in 1 MB chunks and echoes each chunk to the user's browser. The calls to ob_flush() and flush() push each chunk to the client immediately, rather than waiting until the entire file has been read.
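Before calling readfile_chunked(), it is also common to send headers that tell the browser the file's size and suggest a download filename. The helper below is a hedged sketch, not part of the original code; the path and MIME type are placeholders.

```php
<?php
// Sketch: headers commonly sent before streaming a download.
// The path and MIME type are placeholders for your own values.
function download_headers(string $path, string $mime): array
{
    clearstatcache(true, $path); // make sure filesize() is not served from a stale cache
    return [
        'Content-Type: ' . $mime,
        'Content-Length: ' . filesize($path),
        'Content-Disposition: attachment; filename="' . basename($path) . '"',
    ];
}

// Demonstration with a small temporary file:
$path = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($path, str_repeat('a', 2048));
$headers = download_headers($path, 'application/octet-stream');

foreach ($headers as $h) {
    echo $h, "\n"; // in a real script, pass each line to header() before streaming
}
unlink($path);
```

Sending an accurate Content-Length lets the browser show a progress bar, and Content-Disposition prompts a save dialog instead of rendering the response inline.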
By implementing this approach, you can efficiently stream large files to users without encountering memory issues.