In-depth Node.js fs module: file system operations


Node's fs module has many APIs and dense documentation; after all, it fully supports file system operations. The docs are well organized: operations are basically divided into file operations, directory operations, file information, and streams. The programming interface also supports synchronous, asynchronous (callback), and promise styles.

This article records several topics that are not described in detail in the docs, and which help connect the ideas in the fs documentation:

  • File descriptors
  • Synchronous, asynchronous and promise styles
  • Directories and directory entries
  • File information
  • Streams

File descriptor

A file descriptor is a non-negative integer. It is an index value from which the operating system can find the corresponding file.

Many of fs's low-level APIs need file descriptors. In the docs, a descriptor is commonly represented as fd. For example: fs.read(fd, buffer, offset, length, position, callback). The higher-level counterpart of this API is fs.readFile(path[, options], callback).

Because the operating system limits the number of open file descriptors, do not forget to close the file when you are done:

const fs = require("fs");

fs.open("./db.json", "r", (err, fd) => {
    if (err) throw err;
    // File operations...
    // When you are finished, close the file
    fs.close(fd, err => {
        if (err) throw err;
    });
});

Synchronous, asynchronous and promise

All file system APIs come in two forms: synchronous and asynchronous.

Synchronous style

Using the synchronous API is not recommended, because it blocks the thread:

try {
    const buf = fs.readFileSync("./package.json");
    console.log(buf.toString("utf8"));
} catch (error) {
    console.error(error);
}

Asynchronous style

The asynchronous style can easily lead to callback hell.

fs.readFile("./package.json", (err, data) => {
    if (err) throw err;
    console.log(data.toString("utf8"));
});

Promise style (recommended)

Before Node v12, you needed to wrap the callback API in a promise yourself:

function readFilePromise(path, encoding = "utf8") {
    const promise = new Promise((resolve, reject) => {
        fs.readFile(path, (err, data) => {
            if (err) return reject(err);
            return resolve(data.toString(encoding));
        });
    });
    return promise;
}

readFilePromise("./package.json").then(res => console.log(res));

In Node v12, the fs promises API was introduced. Its methods return promise objects instead of using callbacks. The API is accessible via require('fs').promises. As a result, development cost is lower.

const fsPromises = require("fs").promises;

fsPromises
    .readFile("./package.json", {
        encoding: "utf8",
        flag: "r"
    })
    .then(data => console.log(data))
    .catch(err => console.error(err));

Directories and directory entries

The fs.Dir class encapsulates operations related to a file directory.

The fs.Dirent class encapsulates operations related to a directory entry, for example determining its type (file, directory, character device, block device, FIFO, etc.).

The relationship between them is shown in code:

const fsPromises = require("fs").promises;

async function main() {
    // dir is an fs.Dir; each dir.read() yields an fs.Dirent or null at the end
    const dir = await fsPromises.opendir(".");
    let dirent = null;
    while ((dirent = await dir.read()) !== null) {
        console.log(dirent.name, dirent.isDirectory());
    }
}

main();

File information

The fs.Stats class encapsulates operations related to file information. It is returned in the callback of fs.stat():

fs.stat("./package.json", (err, stats) => {
    if (err) throw err;
    console.log(stats.isFile(), stats.size);
});

Note about checking the existence of the file:

  • It is not recommended to call fs.stat() to check for a file's existence before calling fs.open(), fs.readFile(), or fs.writeFile(). Instead, open, read, or write the file directly, and handle the error if the file is not available.
  • To check that a file exists without subsequently operating on it, fs.access() is recommended.

ReadStream and WriteStream

Stream is a very important module in Node.js. Many library APIs are built on top of streams, for example ReadStream and WriteStream in fs.

fs itself provides readFile and writeFile. The price of their ease of use is a performance problem: they load the entire content into memory at once. For large files of several gigabytes, this is obviously a problem.

So the solution for large files is naturally to read them a little bit at a time. This requires streams. Take ReadStream as an example:

const fs = require("fs");

const rs = fs.createReadStream("./package.json");
let content = "";

rs.on("open", () => {
    console.log("start to read");
});

rs.on("data", chunk => {
    content += chunk.toString("utf8");
});

rs.on("close", () => {
    console.log("finish read, content is:\n", content);
});

With the help of a stream's pipe, one line quickly implements copying a large file:

function copyBigFile(src, target) {
    fs.createReadStream(src).pipe(fs.createWriteStream(target));
}

