Streams are Node.js's superpower for handling large datasets efficiently: they process data chunk by chunk instead of loading everything into memory at once. Let's dive into streams and pipelines.
```javascript
const fs = require('fs');

const readStream = fs.createReadStream('big.file');
const writeStream = fs.createWriteStream('output.file');

// Forward each chunk to the destination by hand
readStream.on('data', (chunk) => {
  writeStream.write(chunk);
});

// Close the destination once the source is exhausted
readStream.on('end', () => {
  writeStream.end();
});
```
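One caveat: writeStream.write() returns false when its internal buffer is full, and the manual handlers above ignore that signal, so backpressure is lost. As a minimal sketch of the more idiomatic alternative (reusing the file names from the snippet above), pipe() wires the streams together and manages backpressure for you:

```javascript
const fs = require('fs');

// pipe() forwards chunks and pauses the source while the destination drains
fs.createReadStream('big.file')
  .pipe(fs.createWriteStream('output.file'))
  .on('finish', () => console.log('Copy complete'));
```

pipe() still leaves error handling to you, since errors on the source are not forwarded to the destination, and that is exactly the gap the pipeline API closes.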
The pipeline API simplifies stream composition and error handling: it connects the streams, forwards data through each stage, and destroys all of them if any stage fails, propagating the error to your handler.
```javascript
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function compressFile(input, output) {
  await pipeline(
    fs.createReadStream(input),
    zlib.createGzip(),
    fs.createWriteStream(output)
  );
  console.log('Compression complete');
}

compressFile('big.file', 'big.file.gz').catch(console.error);
```
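Decompression is symmetric. As a quick sketch (decompressFile and the output path are just illustrative names), zlib.createGunzip() drops into the same pipeline shape:

```javascript
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

// Hypothetical counterpart to compressFile: stream a .gz file back through gunzip
async function decompressFile(input, output) {
  await pipeline(
    fs.createReadStream(input),
    zlib.createGunzip(),
    fs.createWriteStream(output)
  );
  console.log('Decompression complete');
}

decompressFile('big.file.gz', 'big.file.restored').catch(console.error);
```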
Custom Transform streams slot into a pipeline just like the built-in ones. This one upper-cases whatever flows through it:

```javascript
const { Transform } = require('stream');
const { pipeline } = require('stream/promises');

// Transform that upper-cases each chunk of text passing through
const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

pipeline(
  process.stdin,
  upperCaseTransform,
  process.stdout
).catch(console.error);
```
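For simple transforms you don't even need a Transform subclass: current Node versions also accept async generator functions inside pipeline (check your runtime's docs for exact version support). A sketch of the same upper-casing step in that style:

```javascript
const { pipeline } = require('stream/promises');

// The async generator is the transform stage: read chunks in, yield transformed chunks out
pipeline(
  process.stdin,
  async function* (source) {
    for await (const chunk of source) {
      yield chunk.toString().toUpperCase();
    }
  },
  process.stdout
).catch(console.error);
```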
Streams shine with large datasets or real-time data processing. Master them for scalable Node.js applications.