Reading large files one line at a time is essential when memory consumption is a concern, since loading a multi-gigabyte file into memory all at once is often impractical or impossible. Node.js offers several approaches to do this efficiently.
The readline core module, marked stable since Node.js v0.12, provides a reliable solution for this purpose. In Node.js v11.14 and later, readline interfaces are also async iterable, which enables the clean `for await...of` pattern shown here:
```javascript
const fs = require('fs');
const readline = require('readline');

async function processLineByLine() {
  const fileStream = fs.createReadStream('input.txt');

  const rl = readline.createInterface({
    input: fileStream,
    // Recognize all instances of CR LF ('\r\n') as a single line break.
    crlfDelay: Infinity
  });

  for await (const line of rl) {
    // Each line of input.txt is available here as `line`.
    console.log(`Line from file: ${line}`);
  }
}

processLineByLine();
```
This code creates a read stream from the specified file, wraps it in a readline interface, and then asynchronously iterates over that interface, printing each line to the console. Only a small buffer of the file is held in memory at any moment, which is what makes this approach suitable for very large files.
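To illustrate doing real work per line with this pattern, here is a minimal sketch that counts the lines containing a given term. The file name `app.log` and the search term `ERROR` are assumptions made up for the example:

```javascript
const fs = require('fs');
const readline = require('readline');

// Count how many lines in a (hypothetical) log file contain a given term,
// without ever loading the whole file into memory.
async function countMatchingLines(filePath, term) {
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity
  });

  let count = 0;
  for await (const line of rl) {
    if (line.includes(term)) {
      count++;
    }
  }
  return count;
}

countMatchingLines('app.log', 'ERROR')
  .then((count) => console.log(`Found ${count} matching lines`))
  .catch((err) => console.error(err));
```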
If you're using an older version of Node.js that lacks async iteration, you can use the same readline module through its event-based API instead. (A third-party module such as line-reader is another option; see the sketch at the end of this section.) Here's an example:
```javascript
const lineReader = require('readline').createInterface({
  input: require('fs').createReadStream('file.in')
});

lineReader.on('line', function (line) {
  // Fired once for each line as the file is read.
  console.log('Line from file:', line);
});

lineReader.on('close', function () {
  // Fired once the input stream has been fully consumed.
  console.log('Finished reading the file.');
});
```
This code uses the readline module's event emitter interface rather than a third-party module: the 'line' event fires each time a complete line has been read from the stream, and the 'close' event fires once the entire file has been consumed.
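For completeness, here is a minimal sketch of the third-party line-reader module mentioned above, installed with `npm install line-reader`. The file name is an assumption for the example, and this reflects the package's documented `eachLine` callback API:

```javascript
const lineReader = require('line-reader');

// eachLine invokes the callback once per line;
// `last` is true when the final line of the file is reached.
lineReader.eachLine('file.in', function (line, last) {
  console.log('Line from file:', line);
  if (last) {
    console.log('All lines have been read.');
  }
});
```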