This question explores how to efficiently read a large file one line at a time in Node.js. The Quora example referenced in the question demonstrates the technique for STDIN, but converting it to read from a file takes some care.
The attempt to read lines with fs.open() and process.stdin fails because fs.open() returns a file descriptor, not a stream, so it cannot be dropped in where process.stdin (a readable stream) is expected. To resolve this, we can use the readline core module, which has been stable since Node.js v0.12.
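If a file descriptor from fs.open() is already in hand, it can still be bridged to a stream: fs.createReadStream() accepts an fd option and then ignores its path argument. A minimal sketch, assuming a placeholder file name of 'input.txt':

const fs = require('fs');

// fs.open() yields a numeric file descriptor, not a stream.
fs.open('input.txt', 'r', (err, fd) => {
  if (err) throw err;
  // Wrap the descriptor in a readable stream; the path argument is
  // ignored when fd is supplied, and the stream closes fd when done.
  const stream = fs.createReadStream(null, { fd });
  stream.on('data', (chunk) => console.log(`read ${chunk.length} bytes`));
  stream.on('end', () => console.log('done'));
});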
The readline module provides a convenient way to process large files line-by-line asynchronously. Here's an example:
const fs = require('fs');
const readline = require('readline');

async function processLineByLine() {
  const fileStream = fs.createReadStream('input.txt');

  const rl = readline.createInterface({
    input: fileStream,
    crlfDelay: Infinity // Handle CR LF as a single line break
  });

  for await (const line of rl) {
    console.log(`Line from file: ${line}`);
  }
}

processLineByLine();
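Because each iteration of the loop finishes before the next line is delivered, this pattern also suits per-line aggregation. A small sketch along the same lines (countLines is an illustrative helper, not part of readline) that tallies lines and characters:

const fs = require('fs');
const readline = require('readline');

// Illustrative helper: count lines and characters in a file.
async function countLines(path) {
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity
  });

  let lines = 0;
  let chars = 0;
  for await (const line of rl) {
    lines += 1;
    chars += line.length; // `line` excludes the trailing line break
  }
  return { lines, chars };
}

countLines('input.txt').then((stats) => console.log(stats));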
Alternatively, the readline module can be driven through its event API, which is still asynchronous but uses callbacks instead of async iteration:
const fs = require('fs');

const lineReader = require('readline').createInterface({
  input: fs.createReadStream('file.in')
});

lineReader.on('line', (line) => {
  console.log('Line from file:', line);
});

lineReader.on('close', () => {
  console.log('all done');
});
In this example, lineReader emits a 'line' event for each line in the file and a 'close' event once the entire file has been read.
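One caveat with the event API: 'line' callbacks do not wait for one another, so asynchronous work started inside the handler can interleave across lines. One simple pattern, sketched here with a hypothetical processAll() batch handler, is to buffer lines and act once 'close' fires:

const fs = require('fs');
const readline = require('readline');

const lines = [];
const lineReader = readline.createInterface({
  input: fs.createReadStream('file.in')
});

lineReader.on('line', (line) => lines.push(line));

lineReader.on('close', () => {
  // Every line has been buffered; ordered async work is safe now.
  processAll(lines);
});

// Hypothetical batch handler, shown only for illustration.
function processAll(allLines) {
  console.log(`processing ${allLines.length} lines`);
}

Note that buffering trades streaming for an in-memory array, so it only suits files that fit comfortably in memory; for genuinely large inputs, the for await form shown earlier processes one line at a time.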
Note: The official Node.js documentation now includes an example similar to the first one above.