Streaming Large Files with PHP
Suppose you want to securely offer users a one-time download of a massive file without consuming excessive memory. How can you stream the file efficiently?
The conventional approach of reading the entire file with file_get_contents() is impractical: it loads the whole file into memory at once and will exhaust PHP's memory_limit for large files. The fix is to stream the data to the client in manageable chunks.
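As a rough illustration of the difference (the file path below is hypothetical), reading through a stream keeps memory usage near the chunk size instead of the file size:

$path = '/data/archive.bin'; // hypothetical large file

// Impractical: allocates the entire file in memory and
// typically exceeds memory_limit for large files.
// $data = file_get_contents($path);

// Streaming alternative: memory use stays near the chunk size.
$handle = fopen($path, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // read and emit 8 KB at a time
}
fclose($handle);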
One commonly cited solution is a custom readfile_chunked() function. It reads and echoes the file a fixed-size chunk at a time, so memory usage is bounded by the chunk size rather than by the file size. The following code demonstrates the approach:
// Define the chunk size in bytes (1 MB).
define('CHUNK_SIZE', 1024 * 1024);

// Read a file and output its content chunk by chunk.
// Returns the number of bytes delivered, or false on failure.
function readfile_chunked($filename, $retbytes = true) {
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        // Push each chunk to the client immediately.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // Return the number of bytes delivered.
    }
    return $status;
}

// Restrict access to logged-in users.
if ($logged_in) {
    $filename = 'path/to/your/file';
    $mimetype = 'mime/type';
    header('Content-Type: ' . $mimetype);
    readfile_chunked($filename);
} else {
    echo 'Access denied.';
}
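For a file offered as a download, you will typically also want to send a filename and length, and keep output compression from buffering the whole response. A sketch of the surrounding headers (the values and paths here are illustrative assumptions, not part of the original snippet):

$filename = 'path/to/your/file'; // illustrative path
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

// Output compression can buffer the entire response, defeating chunking.
ini_set('zlib.output_compression', 'Off');
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', '1');
}

readfile_chunked($filename);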
With this approach, memory usage stays roughly constant at the chunk size regardless of the file's size, and each chunk is flushed to the client as soon as it is read, so the download starts immediately.
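As an alternative worth noting: if your web server can serve the file itself once PHP has authorized the request, offloading is usually even more efficient than streaming through PHP. A minimal sketch, assuming Apache with mod_xsendfile or nginx with an internal location (the file paths are hypothetical):

// Apache with mod_xsendfile installed:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.bin"');
header('X-Sendfile: /var/files/file.bin');

// nginx, where /protected/ is marked "internal" in the server config:
// header('X-Accel-Redirect: /protected/file.bin');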