“Ask if you don’t understand”: what makes esbuild so fast?

Time: 2021-11-27

Preface

esbuild is a new generation of JavaScript bundler.

Its author is Evan Wallace, CTO of Figma.

esbuild is known for its speed: it takes only 2%–3% of the time webpack needs.

The main goal of the esbuild project is to usher in a new era of build-tool performance and to create an easy-to-use modern bundler.

Its main features:

  • Extreme speed without needing a cache
  • ES6 and CommonJS modules
  • Tree shaking of ES6 modules
  • An API for JavaScript and Go (a short sketch of the JavaScript API follows this list)
  • TypeScript and JSX syntax
  • Source maps
  • Minification
  • Plugins
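As a quick taste of the JavaScript API mentioned above, here is a minimal sketch (my own example, not from the article): a single build() call that bundles, minifies, emits source maps, and compiles a TypeScript/JSX entry file along the way. The entry and output paths are made up.

import { build } from 'esbuild'

build({
  entryPoints: ['src/main.tsx'], // hypothetical entry file
  bundle: true,                  // follows both ESM and CommonJS imports
  minify: true,
  sourcemap: true,
  format: 'esm',
  outfile: 'dist/main.js',
}).catch(() => process.exit(1))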

It is now built into many tools, including ones we know well:

  • vite
  • snowpack

With the excellent performance of esbuild behind it, vite feels even more powerful and fast.

Today, let’s explore why esbuild is so fast.

Today’s main content:

  • Comparison of several groups of performance data
  • Why is esbuild so fast
  • esbuild upcoming roadmap
  • Application of esbuild in vite
  • Why does the production environment still need to be packaged?
  • Why not package vite with esbuild?
  • summary

Main text

Let’s start with a set of comparisons:

The benchmark bundles 10 copies of the three.js library into a single production bundle, comparing the bundling speed of different tools under their default configurations.

Webpack 5 comes in last, taking 55.25 seconds.

esbuild takes only 0.37 seconds.

The difference is huge.

There are more comparisons:

Webpack 5 must feel pretty hurt: it can't even beat webpack 4?

Why is esbuild so fast?

There are several reasons.

(To ensure the accuracy of the content, the following is translated from the esbuild official website.)

1. It is written in Go and compiles to native code.

Most other bundlers are written in JavaScript, but a command-line application is a worst-case performance scenario for a JIT-compiled language.

Every time you run the bundler, the JavaScript VM sees the bundler's code without any optimization hints.

While esbuild is busy parsing your JavaScript, node is still busy parsing your bundler's JavaScript.

By the time node has finished parsing your bundler's code, esbuild may have already exited, and your bundler hasn't even started bundling yet.

In addition, Go was designed for parallelism from the start; JavaScript was not.

Go shares memory between threads, whereas JavaScript has to serialize data when passing it between threads.

Both Go and JavaScript have parallel garbage collectors, but Go's heap is shared among all threads, while each JavaScript worker thread has its own separate heap.

According to testing, this seems to cut the potential parallelism of JavaScript worker threads in half, probably because half of the CPU cores are busy collecting garbage for the other half.
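To make the serialization point above concrete, here is a rough sketch (my own, assuming Node's worker_threads and a CommonJS build so __filename is available): data handed to a JS worker thread is structured-cloned into the worker's own heap rather than shared.

import { Worker, isMainThread, parentPort, workerData } from 'node:worker_threads'

if (isMainThread) {
  // Pretend this is a large AST; sending it to the worker copies it element by element.
  const fakeAst = Array.from({ length: 1000 }, (_, i) => ({ kind: 'Identifier', name: 'v' + i }))
  const worker = new Worker(__filename, { workerData: fakeAst })
  worker.on('message', (n) => console.log('worker saw', n, 'nodes'))
} else {
  // workerData is a deep copy produced by the structured clone algorithm,
  // so each worker thread ends up with its own heap-allocated version of the data.
  parentPort!.postMessage((workerData as unknown[]).length)
}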

2. Parallelism is used heavily.

The algorithms inside esbuild are carefully designed so that they can make full use of all CPU cores.

It is roughly divided into three phases:

  1. Parsing
  2. Linking
  3. Code generation

Parsing and code generation account for most of the work and can be fully parallelized (linking is an inherently serial task for the most part).

Since all threads share memory, work can easily be shared when bundling different entry points that import the same JavaScript libraries.

Most modern computers have multiple cores, so parallelism is a big win.

3. Everything is written from scratch, without third-party dependencies.

Writing everything yourself instead of pulling in third-party libraries brings many performance advantages.

You can keep performance in mind from the start, make sure everything uses consistent data structures to avoid expensive conversions, and make sweeping architectural changes whenever necessary. The drawback, of course, is that it is a lot more work.

For example, many bundlers use the official TypeScript compiler as their parser.

However, it was built to serve the goals of the TypeScript compiler team, and performance is not their top priority.

4. Memory is used efficiently.

Ideally, a compiler is O(n) in the length of the input.

If you are processing a large amount of data, memory access speed is likely to seriously affect performance.

The fewer passes over the data (and the fewer different representations the data has to be converted into), the faster the compiler will be.

For example, esbuild touches the entire JavaScript AST only three times:

  1. One pass for lexing, parsing, scope setup, and declaring symbols
  2. One pass for binding symbols and minifying syntax, for example converting JSX/TS to JS and ESNext to ES5
  3. One pass for minifying identifiers, minifying whitespace, and generating code

Reuse of the AST data is maximized while it is still hot in the CPU cache.

Other bundlers perform these steps in separate passes rather than interleaving them.

They may also convert between data representations in order to glue multiple libraries together (for example: string → TS → JS → string, then string → JS → old JS → string, then string → JS → minified JS → string).

This uses more memory and slows things down.
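For contrast, here is a hedged sketch (my own example; transformSync and its options come from esbuild's public API) of doing the TS-to-JS conversion, syntax lowering, and minification in a single call, instead of the string → AST → string round trips between separate tools described above:

import { transformSync } from 'esbuild'

// One call parses the TypeScript, lowers newer syntax to the chosen target,
// minifies, and emits a source map, without intermediate string round trips.
const { code, map } = transformSync('let x: number = a ?? 2', {
  loader: 'ts',
  target: 'es2016',  // `??` is lowered because es2016 does not support it
  minify: true,
  sourcemap: true,
})
console.log(code)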

Another advantage of Go is that it can store things compactly in memory, which lets it use less memory and fit more into the CPU cache.

All object fields have types, and fields are packed tightly together, so for example several boolean flags only take up one byte each.

Go also has value semantics and can embed one object directly inside another, so it comes "for free" without another allocation.

JavaScript does not have these features and also has other drawbacks, such as JIT overhead (for example hidden class slots) and inefficient representations (for example non-integer numbers are heap-allocated with pointers).

Each of the above factors can improve the compilation speed to a certain extent.

When they work together, the effect is a bundler several orders of magnitude faster than the other bundlers in common use today.

The explanation above is a bit long-winded; some netizens have boiled it down to a brief summary:

  • It is written in Go, which compiles to native code, and Go is fast: roughly speaking, JS operations are on the millisecond level while Go operations are on the nanosecond level
  • Parsing, generating the final bundle, and generating source maps are all fully parallelized
  • No expensive data conversions are needed; everything is done in just a few passes
  • The library was written with compilation speed as its first principle, and it tries to avoid unnecessary memory allocation

For reference only.

Upcoming roadmap

The following features are already in progress and are the first priority:

  1. Code splitting (#16, docs)
  2. CSS content type (#20, docs)
  3. Plugin API (#111)

The following features are potentially planned, but not yet certain:

  1. HTML content type (#31)
  2. Lowering to ES5 (#297)
  3. Bundling top-level await (#253)

Those who are interested can keep an eye on these.
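As a side note, the plugin API from item 3 of the in-progress list is in fact already usable (the vite code below passes plugins to esbuild). The following small sketch is adapted from the env-plugin example in esbuild's own plugin documentation: imports of the virtual module 'env' resolve to process.env serialized at build time.

import { build, Plugin } from 'esbuild'

const envPlugin: Plugin = {
  name: 'env',
  setup(build) {
    // Claim the bare import 'env' and move it into a private namespace.
    build.onResolve({ filter: /^env$/ }, (args) => ({ path: args.path, namespace: 'env-ns' }))
    // Provide its contents: the environment variables as JSON.
    build.onLoad({ filter: /.*/, namespace: 'env-ns' }, () => ({
      contents: JSON.stringify(process.env),
      loader: 'json',
    }))
  },
}

build({ entryPoints: ['app.js'], bundle: true, outfile: 'out.js', plugins: [envPlugin] })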

Application of esbuild in vite

vite makes extensive use of esbuild. Here are two examples.

  1. The dependency optimizer

import { build, BuildOptions as EsbuildBuildOptions } from 'esbuild'

// ...
// Pre-bundle the flattened dependency entry points into ESM with esbuild
const result = await build({
    entryPoints: Object.keys(flatIdDeps),
    bundle: true,
    format: 'esm',
    external: config.optimizeDeps?.exclude,
    logLevel: 'error',
    splitting: true,
    sourcemap: true,
    outdir: cacheDir,
    treeShaking: 'ignore-annotations',
    metafile: true,
    define,
    plugins: [
      ...plugins,
      esbuildDepPlugin(flatIdDeps, flatIdToExports, config)
    ],
    ...esbuildOptions
  })

  const meta = result.metafile!

  // the paths in `meta.outputs` are relative to `process.cwd()`
  const cacheDirOutputPath = path.relative(process.cwd(), cacheDir)

  // Record where each optimized dep was written and whether it needs CJS interop
  for (const id in deps) {
    const entry = deps[id]
    data.optimized[id] = {
      file: normalizePath(path.resolve(cacheDir, flattenId(id) + '.js')),
      src: entry,
      needsInterop: needsInterop(
        id,
        idToExports[id],
        meta.outputs,
        cacheDirOutputPath
      )
    }
  }

  writeFile(dataPath, JSON.stringify(data, null, 2))
  2. Handling .ts files. During development, vite hands .ts source to esbuild's transform API to strip types on the fly (type checking is left to the editor or tsc); a simplified sketch follows.
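The sketch below is my own approximation, not vite's actual source; transform and its options are esbuild's public API.

import { transform } from 'esbuild'

// Transpile one TypeScript module for the dev server: strip types, keep ESM,
// and emit a source map. No type checking happens here, which is why it is fast.
async function transpileForDev(code: string, filename: string) {
  const result = await transform(code, {
    loader: filename.endsWith('.tsx') ? 'tsx' : 'ts',
    format: 'esm',
    sourcemap: true,
    sourcefile: filename,
  })
  return { code: result.code, map: result.map }
}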

Why does the production environment still need to be packaged?

Although native ESM now enjoys broad support, shipping unbundled ESM in production is still inefficient, because nested imports cause additional network round trips, even over HTTP/2.

To get the best loading performance, it is still best to apply tree-shaking, lazy loading, and chunk splitting to the code (for better caching).

It is also not easy to guarantee that the development server and the production build produce consistent, optimal output and behavior.

To solve this problem, vite ships with a build command that has these build optimizations baked in, out of the box.
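As a small, generic illustration (not vite-specific code) of the lazy loading and chunk splitting mentioned above: a dynamic import() gives the bundler a natural split point, so the imported module becomes a separate, cacheable chunk that is only fetched when needed.

// './editor' is a hypothetical module; it is loaded only when the user opens the
// editor, and the bundler emits it as its own chunk.
const openEditor = async () => {
  const { createEditor } = await import('./editor')
  createEditor(document.getElementById('app')!)
}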

Why not package vite with esbuild?

Although esbuild is astonishingly fast and is already an excellent tool for building libraries, some important features needed for bundling applications are still under development, especially code splitting and CSS handling.

For now, Rollup is more mature and more flexible when it comes to bundling applications.

That said, once these features stabilize, using esbuild as the production bundler in the future is not ruled out.

Summary

esbuild brings a new dawn for build efficiency, and the number of ESM packages is also growing rapidly:

Hopefully the ESM ecosystem matures as soon as possible and brings real benefits to the front end.

That's all for today; I hope it gives you some inspiration.

My knowledge is limited, so if there are any mistakes in the article, please point them out. Thank you.

Reference link

  1. https://esbuild.github.io/get…
  2. https://esbuild.github.io/faq/
  3. https://twitter.com/skypackjs…