An in-depth analysis of file streams in Node.js
This article analyzes file streams in Node.js; I hope you find it helpful!
File stream
The various storage media in a computer read and write at different speeds and have different capacities, so during a transfer one side may be stuck in a waiting state for a long time.
There are three main types of file streams: readable streams (Readable), writable streams (Writable), and duplex streams (Duplex). There is a fourth, less commonly used type: transform streams (Transform).
Node provides the stream module, which exposes two classes: Readable and Writable. File streams inherit from these classes, so they share many common methods.
Readable stream (Readable)
Readable stream: data flows from the source into memory, for example data on disk being transferred into memory.
createReadStream
fs.createReadStream(path, configuration)
The configuration supports: encoding (the encoding to use), start (the byte at which to start reading), end (the byte at which to stop reading), and highWaterMark (the amount read per chunk).
highWaterMark: if encoding is set, this number represents a number of characters; if encoding is null, it represents a number of bytes.
Returns a ReadStream, a subclass of Readable.
const readable = fs.createReadStream(filename, {
  encoding: 'utf-8',
  start: 1,
  end: 2,
  // highWaterMark:
});
Register event
readable.on(eventName, handler)
readable.on('open', (err, data) => {
  // console.log(err);
  console.log('file opened');
});
readable.on('error', (data, err) => {
  console.log(data, err);
  console.log('an error occurred while reading the file');
});
readable.on('close', (data, err) => {
  // console.log(data, err);
  console.log('file closed');
});
// Triggered manually via readable.close(), or automatically once the file
// has been read to the end -- the autoClose option defaults to true.
readable.close();
readable.on('data', (data) => {
  console.log(data);
  console.log('file is being read');
});
readable.on('end', () => {
  console.log('file read complete');
});
Pause reading
readable.pause() pauses reading and triggers the pause event.
Resume reading
readable.resume() resumes reading and triggers the resume event.
Writable stream
const ws = fs.createWriteStream(filename[, configuration])
ws.write(data)
Writes a chunk; data can be a string or a Buffer, and the call returns a Boolean.
If it returns true, the write queue is not yet full and the next chunk can be written immediately. The size of the write queue is the highWaterMark value in the configuration.
If it returns false, the write queue is full; the excess data has to wait, which is called back pressure.
const ws = fs.createWriteStream(filename, {
  encoding: 'utf-8',
  highWaterMark: 2
});
const flag = ws.write('刘');
console.log(flag); // false
// Although this logs only once, writing continues as soon as the queue has
// free space again; ws.write() returns a value only at the moment it is called.

// A second example (run separately):
const flag0 = ws.write('a');
console.log(flag0);
const flag1 = ws.write('a');
console.log(flag1);
const flag2 = ws.write('a');
console.log(flag2);
const flag3 = ws.write('a');
console.log(flag3);
// Output order: true, false, false, false
// The second write already occupies the two-byte queue, and the third write
// fills it completely, so those writes return false.
Using streams to copy a file and solve the back-pressure problem
const filename = path.resolve(__dirname, './file/write.txt');
const wsfilename = path.resolve(__dirname, './file/writecopy.txt');
const ws = fs.createWriteStream(wsfilename);
const rs = fs.createReadStream(filename);

rs.on('data', chunk => {
  const flag = ws.write(chunk);
  if (!flag) {
    rs.pause();   // the write queue is full: stop reading
  }
});
ws.on('drain', () => {
  rs.resume();    // the queue has drained: resume reading
});
rs.on('close', () => {
  ws.end();
  console.log('copy end');
});
pipe
Using pipe, you can connect a readable stream directly to a writable stream; chaining streams this way also solves the back-pressure problem.
rs.pipe(ws); // pipe() ends the writable side automatically (end: true by default)
rs.on('close', () => {
  console.log('copy end');
});
Having worked through this, file streams feel very convenient for reading and writing large files: compared with writeFile and readFile they are much more efficient, and when handled correctly they cause no major blocking.