
The Basics of Node.js Streams

Lisa Kudrow
Release: 2025-02-20 10:07:10

Node.js, being asynchronous and event-driven, excels at I/O-bound operations. Leveraging Node.js streams significantly simplifies these tasks by efficiently processing data in smaller chunks. Let's delve into the world of streams and see how they streamline I/O.

Key Concepts:

  • Node.js streams, asynchronous and event-driven, optimize I/O by handling data in manageable portions.
  • Streams are classified as Readable, Writable, or Duplex (both readable and writable). Readable streams fetch data from a source; writable streams send data to a destination.
  • The pipe() function is invaluable, facilitating seamless data transfer between source and destination without manual flow management.
  • Methods like readable.pause(), readable.resume(), and readable.unpipe() offer fine-grained control over data flow, enhancing stream functionality.

Understanding Streams:

Streams are analogous to Unix pipes, enabling effortless data transfer from source to destination. Essentially, a stream is an EventEmitter with specialized methods. The implemented methods determine whether a stream is Readable, Writable, or Duplex. Readable streams provide data input; writable streams handle data output.

You've likely encountered streams in Node.js already. In an HTTP server, the request is a readable stream, and the response is a writable stream. The fs module provides both readable and writable file stream capabilities.
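For example, here is a minimal sketch of an HTTP server (the port number is just an illustrative choice) in which the request is consumed as a readable stream and the response is written to as a writable stream:

const http = require('http');

const server = http.createServer((req, res) => {
  let body = '';
  req.setEncoding('utf8');
  // The request is a readable stream: collect the incoming data chunk by chunk.
  req.on('data', (chunk) => { body += chunk; });
  req.on('end', () => {
    // The response is a writable stream: write the reply and end it.
    res.end(`Received ${body.length} characters\n`);
  });
});

server.listen(8000);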

This article focuses on readable and writable streams; duplex streams are beyond its scope.

Readable Streams:

A readable stream reads data from a source (a file, an in-memory buffer, or another stream). Because readable streams are EventEmitters, they emit various events, and we use those events to interact with them.

Reading from Streams:

The most common approach is to listen for the data event and attach a callback. When data is available, the data event fires, executing the callback.

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
readableStream.on('data', (chunk) => { data += chunk; });
readableStream.on('end', () => { console.log(data); });

fs.createReadStream() creates a readable stream. The stream starts out paused; attaching a data event listener switches it into flowing mode, and data chunks are then passed to the callback. How often data fires depends on the stream implementation: an HTTP request might emit a data event every few KB received, while a file stream reads fixed-size chunks (64 KB by default).

The end event signals the end of data.

Alternatively, repeatedly call read() on the stream instance until all data is read:

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
let chunk;
readableStream.on('readable', () => {
  while ((chunk = readableStream.read()) !== null) {
    data += chunk;
  }
});
readableStream.on('end', () => { console.log(data); });

read() retrieves data from the internal buffer. It returns null when no data remains. The readable event indicates data availability.

Setting Encoding:

Data is typically a Buffer object. To work with strings instead, call setEncoding() on the stream:

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');
let data = '';
readableStream.setEncoding('utf8');
readableStream.on('data', (chunk) => { data += chunk; });
readableStream.on('end', () => { console.log(data); });

This interprets data as UTF-8, passing it as a string to the callback.

Piping:

Piping simplifies data transfer between source and destination:

const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');
readableStream.pipe(writableStream);

pipe() handles the data flow automatically; this example copies the contents of file1.txt into file2.txt.

Chaining:

Streams can be chained:

const fs = require('fs');
const zlib = require('zlib');
fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('output.txt'));

This decompresses input.txt.gz and writes the result to output.txt.

Additional Readable Stream Methods:

  • readable.pause(): Pauses a flowing stream (see the sketch after this list).
  • readable.resume(): Resumes a paused stream.
  • readable.unpipe(): Detaches destination streams that were attached with pipe().
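A minimal sketch of pausing and resuming a flowing stream (the file name and the one-second delay are illustrative):

const fs = require('fs');
const readableStream = fs.createReadStream('file.txt');

readableStream.on('data', (chunk) => {
  console.log(`Read ${chunk.length} bytes; pausing for a second`);
  readableStream.pause();
  setTimeout(() => readableStream.resume(), 1000);
});
readableStream.on('end', () => { console.log('Done reading'); });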

Writable Streams:

Writable streams send data to a destination. Like readable streams, they are EventEmitters.

Writing to Streams:

Use write() to send data:

const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');
readableStream.on('data', (chunk) => {
  writableStream.write(chunk);
});

write() returns a boolean. If it returns false, the stream's internal buffer is full for the moment; stop writing and wait for the drain event before writing more, as sketched below.
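A minimal sketch of honouring that signal when writing many chunks (the chunk count and contents are illustrative):

const fs = require('fs');
const writableStream = fs.createWriteStream('output.txt');

let i = 0;
function writeChunks() {
  while (i < 1000000) {
    const ok = writableStream.write(`line ${i}\n`);
    i++;
    if (!ok) {
      // The internal buffer is full: wait for 'drain' before writing more.
      writableStream.once('drain', writeChunks);
      return;
    }
  }
  writableStream.end();
}
writeChunks();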

End of Data:

Call end() to signal the end of data. The finish event is emitted after all data is flushed. You cannot write after calling end().
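For example (the file name is illustrative):

const fs = require('fs');
const writableStream = fs.createWriteStream('output.txt');

writableStream.write('Hello, ');
writableStream.write('world!\n');
writableStream.end(); // no further writes are allowed after this call

writableStream.on('finish', () => {
  console.log('All data has been flushed to output.txt');
});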

Important Writable Stream Events:

  • error: Emitted when an error occurs while writing or piping data (see the sketch after this list).
  • pipe: Emitted when a readable stream is piped into this writable stream.
  • unpipe: Emitted when unpipe() is called on the readable stream that was feeding it.
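A minimal sketch of listening for these events (the file names are illustrative):

const fs = require('fs');
const readableStream = fs.createReadStream('file1.txt');
const writableStream = fs.createWriteStream('file2.txt');

writableStream.on('error', (err) => {
  console.error('Write failed:', err.message);
});
writableStream.on('pipe', () => {
  console.log('A readable stream was piped into this writable stream');
});

readableStream.pipe(writableStream);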

Conclusion:

Streams are a powerful feature in Node.js, enhancing I/O efficiency. Understanding streams, piping, and chaining enables writing clean, performant code.

Node.js Streams FAQ:

  • What are Node.js streams? They are objects that allow for efficient, incremental processing of data, avoiding loading entire datasets into memory.

  • Main types of Node.js streams? Readable, Writable, Duplex, and Transform.

  • Creating a Readable stream? Use stream.Readable and implement the _read method (a sketch implementing both _read and _write follows this list).

  • Common use cases for Readable streams? Reading large files, processing data from HTTP requests, real-time data handling.

  • Creating a Writable stream? Use stream.Writable and implement the _write method.

  • Common uses of Writable streams? Saving data to files, sending data to services.

  • Duplex stream? Combines Readable and Writable functionality.

  • Transform streams? Modify data as it passes through (e.g., compression, encryption).

  • Piping data between streams? Use the .pipe() method.

  • Best practices for working with Node.js streams? Use them for large datasets, handle errors and backpressure, and consider util.promisify for promise-based operations.
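As a minimal sketch of the custom-stream APIs mentioned above (the counter and logging behaviour are purely illustrative), here is a Readable that implements _read and a Writable that implements _write:

const { Readable, Writable } = require('stream');

// A Readable that pushes the numbers 1..5, then signals the end with null.
class Counter extends Readable {
  constructor(options) {
    super(options);
    this.current = 1;
  }
  _read() {
    if (this.current > 5) {
      this.push(null); // no more data
    } else {
      this.push(`${this.current++}\n`);
    }
  }
}

// A Writable that logs each chunk it receives.
class Logger extends Writable {
  _write(chunk, encoding, callback) {
    console.log('received:', chunk.toString().trim());
    callback(); // signal that this chunk has been handled
  }
}

new Counter().pipe(new Logger());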

