Node.js is inherently asynchronous and event-driven, which makes it ideal for I/O-bound work. If your application performs I/O, you can take advantage of streams. Let's look at streams in detail and see how they simplify I/O operations.
What is a stream
A stream is like a Unix pipe: it lets you easily read data from a source and flow it on to a destination.
Simply put, a stream is nothing special: it is just an EventEmitter that implements some extra methods. Depending on how it is implemented, a stream can be readable (Readable), writable (Writable), or both at once (Duplex, simultaneously readable and writable).
Readable streams allow you to read data from a data source, while writable streams allow you to write data to a destination.
If you have used Node.js, you have probably encountered streams.
For example, in a Node.js HTTP server, request is a readable stream and response is a writable stream.
You may also have used the fs module, which can help you handle readable and writable streams.
Now let's cover some basics and look at the different types of streams. This article discusses readable streams and writable streams; Duplex streams are beyond its scope.
Readable Streams
We can use readable streams to read data from a data source. That source can be anything: a file on disk, a buffer in memory, or even another stream. Because streams are EventEmitters, they deliver data through various events, and we will use those events to make them work for us.
Reading data from the stream
The best way to read data from a stream is to listen for the data event and attach a callback. When a chunk of data arrives, the readable stream emits the data event and your callback fires. Take a look at the following snippet:
```javascript
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});
```
fs.createReadStream will give you a readable stream.
Initially, the stream is in a static (paused) state. As soon as you attach a listener for the data event, it switches into flowing mode, reading chunks of data and passing them to your callback.
The implementer of the stream decides how often the data event fires. For example, an HTTP request might emit a data event every few kilobytes read, while a file reader might decide to emit one per line read.
When there is no more data to read (the end of the file is reached), the stream emits the end event. In the example above, we listen for this event and print the accumulated data once the file has been fully read.
There is another way to read a stream: keep calling the read() method on the stream instance until the end of the file is reached.
```javascript
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';
var chunk;

readableStream.on('readable', function() {
  while ((chunk = readableStream.read()) != null) {
    data += chunk;
  }
});

readableStream.on('end', function() {
  console.log(data);
});
```
The read() method reads data from the stream's internal buffer and returns null when there is nothing left to read.
So in the while loop we keep calling read() and exit once it returns null.
Note that the readable event fires whenever data becomes available to read from the stream.
Set encoding
By default, what you read from a stream is a Buffer object. That is not ideal if you want strings, so you can set the stream's encoding by calling Readable.setEncoding(), as in the following example:
```javascript
var fs = require('fs');
var readableStream = fs.createReadStream('file.txt');
var data = '';

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});
```
In the example above, we set the stream's encoding to utf8, so the data is decoded as UTF-8 and the chunk passed to the callback is a string.
Piping
Piping is a great mechanism: you can read data from a source and write it to a destination without managing the state of the stream yourself. Let's look at an example first:
```javascript
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.pipe(writableStream);
```
The example above uses the pipe() method to write the contents of file1 into file2. Because pipe() manages the data flow for you, you don't need to worry about a fast source overwhelming a slow destination. This makes pipe() concise and easy to use.
Note that pipe() returns the destination stream, so you can easily chain multiple streams together!
Chaining
Suppose you have an archive file and want to decompress it. There are many ways to accomplish this, but the cleanest is to use pipes and chaining:
```javascript
var fs = require('fs');
var zlib = require('zlib');

fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('output.txt'));
```
First, we create a readable stream from input.txt.gz, then pipe it through zlib.createGunzip(), which decompresses the content. Finally, we pipe into a writable stream that writes the decompressed content to another file.
Other methods
We have discussed the important concepts of readable streams; here are a few more methods you should know:
1. Readable.pause() – Pauses the flow of the stream. In other words, it stops emitting data events.
2. Readable.resume() – The opposite of the above; resumes a paused stream.
3. Readable.unpipe() – Removes destinations. If an argument is passed, it detaches the readable stream only from that specific destination; otherwise, all destinations are removed.
Writable Streams
Writable streams let you write data to a destination. Like readable streams, they are also EventEmitters and emit various events. Let's look at the events and methods of writable streams.
Writing to a stream
To write data to a writable stream, call the write() method on the stream instance. Take a look at the following example:
```javascript
var fs = require('fs');
var readableStream = fs.createReadStream('file1.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
  writableStream.write(chunk);
});
```
The code above is straightforward: it simply reads data from the input stream and writes it to the destination with write().
The write() method returns a boolean indicating whether the write was handled immediately. If it returns true, the data was accepted and you can keep writing. If it returns false, the stream's internal buffer is full for now and you should stop writing; the writable stream will emit a drain event to tell you when it is safe to write again.
When you have finished writing
When you no longer need to write data, call the end() method to tell the stream you are done writing. Suppose res is an HTTP response object; you would typically send a response to the browser like this:
```javascript
res.write('Some Data!!');
res.end();
```
When end() is called, all pending data is flushed and the stream emits a finish event. Note that after calling end() you can no longer write to the stream. For example, the following code throws an error:
```javascript
res.write('Some Data!!');
res.end();
res.write('Trying to write again'); // Error!
```
Here are some important events related to writable streams:
1. error – Emitted when an error occurs while writing or piping.
2. pipe – Emitted when a readable stream is piped into this writable stream.
3. unpipe – Emitted when the readable stream calls unpipe on it.