One of the things Node prides itself on is its very small core. While some languages ship full POSIX API bindings, Node implements as few bindings as possible and exposes them through synchronous, asynchronous, or streaming APIs.
This approach means that some very convenient operating-system features have to be rebuilt on top of Node. This is a practical tutorial on how to use file system packages to do that.
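To make that concrete, here is a minimal sketch (assuming a file named my-file sits next to the script) that reads the same file through each of those three API flavours:

const fs = require('fs')
const path = require('path')

const file = path.join(__dirname, 'my-file')

// synchronous: blocks until the whole file has been read
console.log(fs.readFileSync(file, 'utf8'))

// asynchronous: the callback fires once the whole file has been read
fs.readFile(file, 'utf8', (err, data) => {
  if (err) throw err
  console.log(data)
})

// streaming: data is handed over in chunks as it is read
fs.createReadStream(file).pipe(process.stdout)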
Referencing files
When interacting with the file system, it is important to point to the correct file. Because npm packages are referenced with relative paths and can be installed in different locations, you cannot hard-code paths. There are two main ways to make sure packages reference the correct files:
// use `path.join()` instead of `+` so the code also works on Windows
const path = require('path')

// resolve a path relative to where the program is run from,
// which is very useful for command-line (CLI) applications
path.join(process.cwd(), 'my-dynamic-file')
// or
path.resolve('my-dynamic-file')

// resolve a file relative to the current file
path.join(__dirname, 'my-package-file')
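To see why both forms matter, here is a small hypothetical CLI sketch (the file names are made up for illustration): input supplied by the user is resolved against the directory the command is run from, while a file that ships with the package is resolved against the package itself.

const path = require('path')

// hypothetical usage: node cli.js notes.txt
// resolved against the directory the user runs the command from
const userFile = path.resolve(process.argv[2] || 'my-dynamic-file')

// resolved against the package, wherever it is installed
const packageFile = path.join(__dirname, 'my-package-file')

console.log('user file:    ' + userFile)
console.log('package file: ' + packageFile)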
Reading files
The easiest way to read files asynchronously in Node is to use streams! Here is an example:
const path = require('path')
const fs = require('fs')

// read a file and pipe it to the console
fs.createReadStream(path.join(__dirname, 'my-file'))
  .pipe(process.stdout)
Creating files
Creating files is not difficult either. Here is the equivalent of cat ./my-file > ./my-other-file implemented in Node:
const path = require('path')
const fs = require('fs')

// cat ./my-file > ./my-other-file
fs.createReadStream(path.join(__dirname, 'my-file'))
  .pipe(fs.createWriteStream(path.join(__dirname, './my-other-file')))
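If you want to create a file with new contents rather than copy an existing one, the plain asynchronous fs.writeFile API works as well. A minimal sketch, with a made-up file name:

const fs = require('fs')
const path = require('path')

// create (or overwrite) a file with the given contents
fs.writeFile(path.join(__dirname, 'my-other-file'), 'hello world\n', err => {
  if (err) throw err
})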
Deleting files
In shell scripts, files and directories are usually deleted with the rm -rf command. In Node, the rimraf package implements the same behaviour:
const rimraf = require('rimraf')
const path = require('path')

rimraf(path.join(__dirname, './my-directory'), err => {
  if (err) throw err
})
Creating directories
Creating directories works much like deleting them, this time using the mkdirp package:
const mkdirp = require('mkdirp')
const path = require('path')

mkdirp(path.join(__dirname, 'foo/bar'), err => {
  if (err) throw err
})
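The two packages combine nicely when you want to reset a directory to a clean, empty state, for example before a build step. A sketch using the same callback-style APIs as above, with a made-up directory name:

const rimraf = require('rimraf')
const mkdirp = require('mkdirp')
const path = require('path')

const dir = path.join(__dirname, 'my-directory')

// remove the directory and everything in it, then recreate it empty
rimraf(dir, err => {
  if (err) throw err
  mkdirp(dir, err => {
    if (err) throw err
  })
})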
Finding files
Use readdirp to recursively find files in a directory and its subdirectories:
const readdirp = require('readdirp')
const json = require('JSONStream')
const path = require('path')

// recursively print out all files in all subdirectories
// to the command line. The object stream must be
// stringified before being passed to `stdout`.
readdirp({ root: path.join(__dirname) })
  .pipe(json.stringify())
  .pipe(process.stdout)
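readdirp also accepts filter options; assuming the same options-object, streaming version of readdirp used above, a fileFilter glob limits which entries are emitted:

const readdirp = require('readdirp')
const json = require('JSONStream')
const path = require('path')

// only emit JavaScript files; a directoryFilter option can be
// used in the same way to skip folders such as node_modules
readdirp({ root: path.join(__dirname), fileFilter: '*.js' })
  .pipe(json.stringify())
  .pipe(process.stdout)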
Use findup to find files by walking up through the parent directories:
const findup = require('findup')
const path = require('path')

// walk up from __dirname and find the closest directory
// that contains a package.json file
findup(path.join(__dirname), 'package.json', (err, res) => {
  if (err) throw err
  console.log('dir is: ' + res)
})
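findup hands back the directory that contains the match, not the file itself, so a small follow-up sketch joins the two to load the closest package.json:

const findup = require('findup')
const path = require('path')

findup(__dirname, 'package.json', (err, dir) => {
  if (err) throw err
  // join the directory with the file name to get the full path
  const pkg = require(path.join(dir, 'package.json'))
  console.log(pkg.name)
})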
About pipes
It is very useful to handle errors for the entire pipeline in one place, instead of attaching .on('error', cb) to each individual stream. The pump package does exactly that:
const pump = require('pump')
const fs = require('fs')

// oh no, no errors are handled!
fs.createReadStream('./in.file').pipe(fs.createWriteStream('./out.file'))

// that's better, we're handling errors now
const rs = fs.createReadStream('./in.file')
const ws = fs.createWriteStream('./out.file')
pump(rs, ws, err => {
  if (err) throw err
})
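pump is not limited to two streams: any number of streams can be passed before the callback, which makes it easy to drop a transform into the middle. A sketch using the built-in zlib module to gzip a file:

const pump = require('pump')
const zlib = require('zlib')
const fs = require('fs')

// read in.file, gzip it on the way through, write out.file.gz,
// and handle errors from all three streams in one place
pump(
  fs.createReadStream('./in.file'),
  zlib.createGzip(),
  fs.createWriteStream('./out.file.gz'),
  err => {
    if (err) throw err
  }
)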