Web performance optimization, part 1 (network direction)



A web application is, at bottom, just data packets transmitted between two hosts. Reducing the time spent on that transmission is the focus of network-direction optimization, starting from the very first step of a request.

Optimizing the DNS resolution process

When the browser requests a cross-origin resource from a third-party service, the third party's domain name must be resolved to an IP address before the request can be sent; this process is called DNS resolution.
A DNS cache can reduce the time this takes. DNS resolution adds latency to requests, and for sites that load many third-party resources, the accumulated DNS lookup delays can significantly hurt page-load performance.

  • dns-prefetch

When a site references resources on a cross-origin domain, a dns-prefetch hint should be placed in the <head> element, with a few caveats. First, dns-prefetch only helps DNS lookups for cross-origin domains, so avoid using it for the origin you are currently visiting.

<link rel="dns-prefetch" href="">
  • preconnect

While dns-prefetch only performs the DNS lookup, preconnect goes further and establishes a connection to the server. If the site is served over HTTPS, that includes DNS resolution, the TCP handshake, and the TLS handshake. Combining the two hints provides an opportunity to further reduce the perceived latency of cross-origin requests.

<!-- note the order; dns-prefetch acts as a fallback for browsers that don't support preconnect -->
<link rel="preconnect" href="" crossorigin>
<link rel="dns-prefetch" href="">
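When the set of third-party origins is known only at runtime, these hints can also be generated programmatically. Below is a small hypothetical helper (the function name and example origin are illustrative, not from any library) that builds the markup for both hints per origin:

```javascript
// Build <link> resource-hint tags for a list of cross-origin hosts.
// preconnect warms up the full connection; dns-prefetch is kept as a
// fallback for browsers that only support the DNS lookup hint.
function buildResourceHints(origins) {
  return origins
    .map(
      (origin) =>
        `<link rel="preconnect" href="${origin}" crossorigin>\n` +
        `<link rel="dns-prefetch" href="${origin}">`
    )
    .join('\n');
}

// Example with a hypothetical font CDN origin:
console.log(buildResourceHints(['https://fonts.example.com']));
```

In the browser you would insert this markup into document.head (or create the link elements directly) as early as possible.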

TCP transmission phase optimization

The front end's leverage here is limited. As we all know, HTTP runs on top of TCP;
consider upgrading the HTTP protocol version, for example from HTTP/1.0 to HTTP/1.1 to HTTP/2.
This has to be configured on the application server (Nginx, Apache, etc.), which I won't detail here. It also requires support from both the client and the server; in practice, browsers only implement HTTP/2 over HTTPS, but adoption is already widespread.

  • Advantages of HTTP/2
#1. Multiplexing: multiple resources are transmitted over the same TCP connection
Browsers only allow a limited number of TCP connections per domain,
so with HTTP/2, the request-count optimizations needed under HTTP/1.1 become unnecessary,
such as combining multiple small images into one large image (a sprite map) and merging JS and CSS files

#2. Header compression and binary encoding: reduce the volume transmitted
In HTTP/1, the first request carries a complete HTTP header, and so does every request after it;
in HTTP/2, the first request carries the complete header, and subsequent requests only carry the fields that changed (such as the path).
This greatly reduces the amount of data transmitted. The implementation requires the client and the server to each maintain a header table (HPACK).

# 3. Server Push
HTTP/2 allows the server to push resources the client is likely to request (such as images) before they are asked for,
so the client doesn't have to wait to send the request, which can improve overall page load speed.
But support for it is currently not great... emm

Generally speaking, it isn't widely used in consumer-facing (C-end) business yet; after all, it needs support in the client software.

HTTP request response phase optimization

To make the data packets transfer faster, we can start from two aspects: the size of the requested packets (server side) and the frequency of requests (client side).

Reduce the size of the request file

The request files are the static resource files produced when the project is built (and then deployed to the server). The smaller those files, the smaller the data packets to transmit, and the sooner they reach the client.

How do we reduce the bundle size?

Nowadays we all use bundling tools (such as webpack, Rollup, Gulp, etc.). How do we use them to shrink the bundle? The official documentation is the best place to look... That said, here are some common techniques (for webpack); just watch out for plugin version changes.

  • JS file compression
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');
module.exports = {
  plugins: [
    new UglifyJsPlugin({
      // Allow parallel compression
      parallel: true,
      // Enable caching
      cache: true,
      uglifyOptions: {
        compress: {
          // Drop all console statements
          drop_console: true,
          // Hoist static values that are used many times into variables
          reduce_vars: true,
        },
        output: {
          // Do not keep comments
          comments: false,
          // Make the output code as compact as possible
          beautify: false,
        },
      },
    }),
  ],
};
  • CSS file compression
// optimize-css-assets-webpack-plugin
const OptimizeCSSAssetsPlugin = require('optimize-css-assets-webpack-plugin');
module.exports = {
  plugins: [
    new OptimizeCSSAssetsPlugin({
      assetNameRegExp: /\.css$/g,
      cssProcessor: require('cssnano'),
    }),
  ],
};
  • HTML file compression
// html-webpack-plugin
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');
module.exports = {
  plugins: [
    new HtmlWebpackPlugin({
      template: path.join(__dirname, 'src/index.html'),
      filename: 'index.html',
      chunks: ['index'],
      inject: true,
      minify: {
        html5: true,
        collapseWhitespace: true,
        preserveLineBreaks: false,
        minifyCSS: true,
        minifyJS: true,
        removeComments: false,
      },
    }),
  ],
};
  • Disable source map files in production builds
  • tree shaking
Tree shaking removes dead code, i.e. code that:
1. will never be executed or reached, such as if (false) { /* dead code */ }
2. produces a result that is never used
3. only affects dead variables (written but never read)
4. it must also be free of side effects to be safely removed

// How it works (to study in more depth later)
It relies on characteristics of ES6 modules:
  import/export can only appear as top-level statements in a module
  the module name in an import must be a string constant
  import bindings are immutable
Code elimination: the unused code is actually deleted during the uglify phase
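For reference, a minimal webpack 4+ config sketch for tree shaking; production mode enables it, and usedExports marks unused exports so the minifier can drop them. Treat this as an outline under those assumptions, not a complete build config:

```javascript
// webpack.config.js (sketch)
module.exports = {
  // production mode turns on usedExports and minification together
  mode: 'production',
  optimization: {
    // Mark exports that are never imported anywhere
    usedExports: true,
    // Whole-module elimination additionally relies on "sideEffects": false
    // (or a whitelist) in the package's package.json
  },
};
```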
  • Scope hoisting

Scope hoisting analyzes the dependency relationships between modules and merges scattered modules into a single function wherever possible, provided doing so doesn't introduce duplicated code.

const ModuleConcatenationPlugin = require('webpack/lib/optimize/ModuleConcatenationPlugin');
module.exports = {
  resolve: {
    // For third-party modules from npm, prefer the ES6-module file pointed to by jsnext:main
    mainFields: ['jsnext:main', 'browser', 'main'],
  },
  plugins: [
    // Enable scope hoisting
    new ModuleConcatenationPlugin(),
  ],
};
  • Use on-demand and lazy loading in the project (at the route and component level)
const router = new VueRouter({
  routes: [
    { path: '/foo', component: () => import(/* webpackChunkName: "foo" */ './Foo.vue') },
    { path: '/bar', component: () => import(/* webpackChunkName: "bar" */ './Bar.vue') },
  ],
});
  • Enable gzip compression
Enabling it can consume server CPU, so use it as the situation warrants

That's all for now; I'll add more later.

Reduce request frequency

Because the number of TCP connections per domain is limited, too many requests get queued and blocked, so we should control the number and frequency of requests as much as possible.

Common measures
  • Inline static resources into HTML

Inlined this way, these resources don't need to be fetched from the server, though this may affect the rendering process.

<!-- 1. Inline small images as Base64 (url-loader) -->
<!-- 2. Inline CSS -->
<!-- 3. Inline JS -->
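For the first point, a hedged url-loader config sketch (the 8 KB threshold is a common but arbitrary choice):

```javascript
// webpack.config.js (fragment)
module.exports = {
  module: {
    rules: [
      {
        test: /\.(png|jpe?g|gif)$/,
        use: [
          {
            loader: 'url-loader',
            options: {
              // Inline files under 8 KB as Base64 data URIs;
              // larger files fall back to file-loader and remain separate requests
              limit: 8 * 1024,
              name: '[name]_[hash:8].[ext]',
            },
          },
        ],
      },
    ],
  },
};
```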
  • Using cache at all levels

This is usually configured on the server side, but you should still know about it.

We can use the HTTP cache (browser side) to reduce and intercept repeat requests; it is generally configured on the server side.
The server can also add its own cache (Redis, etc.) to cut data-query time, which also shortens the overall request time.
  • Leverage local storage
We can store common, rarely-changing information locally (cookies, the Web Storage APIs, etc.);
if the data exists locally, we can skip the corresponding API request, or refresh it only periodically.
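A hypothetical cache-first helper along these lines (the function, key names, and TTL are illustrative; in the browser you would pass window.localStorage as the storage argument):

```javascript
// Return cached data while it is still fresh; otherwise call the
// fetcher (e.g. an API request) and cache the result with a timestamp.
async function cachedFetch(key, fetcher, storage, ttlMs = 60 * 1000) {
  const raw = storage.getItem(key);
  if (raw) {
    const { value, savedAt } = JSON.parse(raw);
    // Fresh enough: skip the network entirely
    if (Date.now() - savedAt < ttlMs) return value;
  }
  const value = await fetcher();
  storage.setItem(key, JSON.stringify({ value, savedAt: Date.now() }));
  return value;
}
```

Note that cookies are sent with every request to the same domain, so anything sizable belongs in localStorage/sessionStorage rather than cookies.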
  • Pay for CDN acceleration

CDN stands for content delivery network. By deploying resources around the world, it lets users fetch resources from the nearest server according to the proximity principle, speeding up resource acquisition. In essence, a CDN improves loading speed by mitigating latency, packet loss, and other problems in the physical transmission path.

Purchase a CDN service;
then upload the site's static resources to the CDN service;
those static resources are then accessed through the URL addresses the CDN service provides.

# Note: because of CDN caching, a new release may not take effect immediately
That's why a hash value is usually appended to file names at build time,
and the resource URLs in the HTML file must also be replaced with the addresses the CDN service provides

# Store resources on CDNs under different domain names (to work around the per-domain TCP connection limit)
  • Configuring CDN in the webpack build
//The import URL of a static resource needs to become an absolute URL pointing to the CDN service, instead of a URL relative to the HTML file.
//The file name of a static resource needs to carry a hash computed from the file content, to defeat stale caches.
//Different types of resources go to CDN services on different domains, so parallel loading isn't blocked.
const path = require('path');
const ExtractTextPlugin = require('extract-text-webpack-plugin');
const { WebPlugin } = require('web-webpack-plugin');
module.exports = {
  //Entry configuration omitted
  output: {
    //Add a hash value to the output JavaScript file names
    filename: '[name]_[chunkhash:8].js',
    path: path.resolve(__dirname, './dist'),
    //The CDN directory URL where JavaScript files are stored
    publicPath: '//',
  },
  module: {
    rules: [
      {
        //Add support for CSS files
        test: /\.css$/,
        //Extract the CSS code in each chunk into a separate file
        use: ExtractTextPlugin.extract({
          //Compress the CSS code
          use: ['css-loader?minimize'],
          //The CDN directory URL for resources (such as images) imported from CSS
          publicPath: '//',
        }),
      },
      {
        //Add support for PNG files
        test: /\.png$/,
        //Add a hash value to the output PNG file names
        use: ['file-loader?name=[name]_[hash:8].[ext]'],
      },
      //Other loader configurations omitted
    ],
  },
  plugins: [
    //Use WebPlugin to generate the HTML automatically
    new WebPlugin({
      //The path of the HTML template file
      template: './template.html',
      //The file name of the HTML output
      filename: 'index.html',
      //The CDN directory URL where CSS files are stored
      stylePublicPath: '//',
    }),
    new ExtractTextPlugin({
      //Add a hash value to the output CSS file names
      filename: `[name]_[contenthash:8].css`,
    }),
    //Code-compression plugin configuration omitted
  ],
};
The core of the code above is to set, via the publicPath options, the CDN directory URLs where static resources are stored.
To send different types of resources to different CDNs, you set:

output.publicPath: the address for JavaScript files.
css-loader's publicPath: the address for resources imported from CSS.
WebPlugin's stylePublicPath: the address for CSS files.

Once publicPath is set, WebPlugin takes it into account when generating the HTML file, and css-loader does so when transforming CSS code, replacing the original relative addresses with the corresponding online addresses.

References

Webpack document
Easy to understand webpack
Scope Hoisting