Build a Node.js Enterprise Web Server from Scratch (13): Breakpoint Debugging and Performance Analysis

Date: 2021-01-20

Breakpoint debugging

Node.js officially provides a breakpoint debugging mechanism. For security reasons it is off by default, and is enabled with the node parameters --inspect or --inspect-brk; combined with an IDE, this makes debugging code very convenient. This chapter takes the project completed in the previous chapter, host1-tech/nodejs-server-examples – 12-rpc, as the base and uses Visual Studio Code for breakpoint debugging:

$ mkdir .vscode  # create a .vscode directory to store the workspace-level VS Code configuration

$ tree -L 1 -a  # show the initial top-level directory structure
.
├── .dockerignore
├── .env
├── .env.local
├── .env.production.local
├── .npmrc
├── .sequelizerc
├── .vscode
├── Dockerfile
├── database
├── node_modules
├── package.json
├── public
├── scripts
├── src
└── yarn.lock

Create .vscode/launch.json (there are many ways to do this, which will not be expanded on here):

// .vscode/launch.json
{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach by Process ID",
      "processId": "${command:PickProcess}",
      "request": "attach",
      "skipFiles": ["<node_internals>/**"],
      "type": "pwa-node"
    }
  ]
}
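Besides attaching by process ID, a launch configuration can also attach to the inspector's port directly. The sketch below is not part of the original project and assumes Node's default inspector port 9229:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to Port 9229",
      "type": "pwa-node",
      "request": "attach",
      "port": 9229,
      "skipFiles": ["<node_internals>/**"]
    }
  ]
}
```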

The startup parameters --inspect and --inspect-brk both turn on the debugging state. The difference is that --inspect-brk makes the Node.js program pause at a breakpoint immediately on startup, which gives the developer a chance to debug the startup process. Add a debug entry that uses --inspect-brk:

// package.json
{
  "name": "13-debugging-and-profiling",
  "version": "1.0.0",
  "scripts": {
    "start": "node -r ./scripts/env src/server.js",
+    "start:inspect": "node --inspect-brk -r ./scripts/env src/server.js",
    "start:prod": "cross-env NODE_ENV=production node -r ./scripts/env src/server.js",
    "sequelize": "sequelize",
    "sequelize:prod": "cross-env NODE_ENV=production sequelize",
    "build:yup": "rollup node_modules/yup -o src/moulds/yup.js -p @rollup/plugin-node-resolve,@rollup/plugin-commonjs,rollup-plugin-terser -f umd -n 'yup'"
  },
  // ...
}

Next, start the program with yarn start:inspect and use Visual Studio Code for breakpoint debugging:

$ yarn start:inspect | yarn bunyan -o short
# ...
Debugger listening on ws://127.0.0.1:9229/35a82bee-093c-491d-9d54-4cca9a142cbf
For help, see: https://nodejs.org/en/docs/inspector


Performance analysis

The performance bottlenecks of a server usually vary with the application scenario, but the most common bottleneck shows up under concurrency. Requests per second (RPS) is the main indicator for measuring concurrency:

$ yarn add -D autocannon  # install the RPS testing tool autocannon locally
# ...
info Direct dependencies
└─ [email protected]
# ...

$ yarn autocannon https://www.taobao.com/  # launch an RPS test against the Taobao home page
# ...
Running 10s test @ https://www.taobao.com/
10 connections

┌─────────┬───────┬───────┬────────┬────────┬───────────┬──────────┬───────────┐
│ Stat    │ 2.5%  │ 50%   │ 97.5%  │ 99%    │ Avg       │ Stdev    │ Max       │
├─────────┼───────┼───────┼────────┼────────┼───────────┼──────────┼───────────┤
│ Latency │ 36 ms │ 90 ms │ 267 ms │ 523 ms │ 106.68 ms │ 70.25 ms │ 586.13 ms │
└─────────┴───────┴───────┴────────┴────────┴───────────┴──────────┴───────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev  │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Req/Sec   │ 81      │ 81      │ 95      │ 99      │ 92.7    │ 5.18   │ 81      │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Bytes/Sec │ 9.93 MB │ 9.93 MB │ 11.6 MB │ 12.1 MB │ 11.4 MB │ 636 kB │ 9.93 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴────────┴─────────┘

Req/Bytes counts sampled once per second.

927 requests in 10.07s, 114 MB read

Taobao has a mature load-balancing strategy, so the data above cannot represent the concurrency of the whole cluster, but it does roughly represent what a single container can handle: the average RPS is 92.7.
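As a sanity check, the totals in autocannon's summary line are consistent with the sampled averages in the table:

```javascript
// derive the overall request rate from autocannon's summary totals
const totalRequests = 927;
const durationSeconds = 10.07;
const overallRps = totalRequests / durationSeconds;
console.log(overallRps.toFixed(1)); // ≈ 92.1, in line with the sampled Avg of 92.7
```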

Now test our own server. First log in through the browser to obtain the session ID from the cookie:


Then start an RPS test against the server home page:

# launch an RPS test against the server home page
$ yarn autocannon -H 'Cookie: connect.sid=s%3AY2A4lo84OOtXCYgc3LRft03HtRaC4ieZ.kW%2BJlJIhSUQTOCxpREjtByDm8QmmA%2FPsNvddYQSP1fM' http://localhost:9000/
# ...
Running 10s test @ http://localhost:9000/
10 connections

┌─────────┬────────┬────────┬────────┬────────┬───────────┬──────────┬───────────┐
│ Stat    │ 2.5%   │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev    │ Max       │
├─────────┼────────┼────────┼────────┼────────┼───────────┼──────────┼───────────┤
│ Latency │ 122 ms │ 197 ms │ 276 ms │ 306 ms │ 197.71 ms │ 38.14 ms │ 343.02 ms │
└─────────┴────────┴────────┴────────┴────────┴───────────┴──────────┴───────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev  │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Req/Sec   │ 46      │ 46      │ 50      │ 57      │ 50.1    │ 2.92   │ 46      │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼────────┼─────────┤
│ Bytes/Sec │ 44.2 kB │ 44.2 kB │ 48.1 kB │ 54.8 kB │ 48.1 kB │ 2.8 kB │ 44.2 kB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴────────┴─────────┘

Req/Bytes counts sampled once per second.

501 requests in 10.06s, 481 kB read

The average RPS is only 50.1, so a performance bottleneck is suspected. Here we use 0x to collect a flame graph to locate the problem. Install it and add a startup entry:

$ yarn add -D 0x  # install 0x locally
# ...
info Direct dependencies
└─ [email protected]
info All dependencies
# ...
// package.json
{
  "name": "13-debugging-and-profiling",
  "version": "1.0.0",
  "scripts": {
    "start": "node -r ./scripts/env src/server.js",
    "start:inspect": "node --inspect-brk -r ./scripts/env src/server.js",
+    "start:profile": "0x -- node -r ./scripts/env src/server.js",
    "start:prod": "cross-env NODE_ENV=production node -r ./scripts/env src/server.js",
    "sequelize": "sequelize",
    "sequelize:prod": "cross-env NODE_ENV=production sequelize",
    "build:yup": "rollup node_modules/yup -o src/moulds/yup.js -p @rollup/plugin-node-resolve,@rollup/plugin-commonjs,rollup-plugin-terser -f umd -n 'yup'"
  },
  // ...
}

Start the server with yarn start:profile and run a 60-second RPS test; the longer the test runs, the less noise there is in the samples:

# Tab 1
$ yarn start:profile | yarn bunyan -o short
🔥  Profiling
# ...
# Tab 2
# launch an RPS test against the server home page
$ yarn autocannon -d 60 -H 'Cookie: connect.sid=s%3AY2A4lo84OOtXCYgc3LRft03HtRaC4ieZ.kW%2BJlJIhSUQTOCxpREjtByDm8QmmA%2FPsNvddYQSP1fM' http://localhost:9000/
# ...

Shut down the server and open the flame graph:

# Tab 1
$ yarn start:profile | yarn bunyan -o short
🔥  Profiling
# ...
^C
🔥  Waiting for subprocess to exit...
🔥  Flamegraph generated in ... 79929.0x/flamegraph.html

$ open 79929.0x/flamegraph.html  # open flamegraph.html in the browser

A flame graph is built from stack samples taken while the program runs: the width of a block represents how long a single run of the function takes, the height represents the depth of the stack, and the color depth represents how often the function appears in the samples. The widest and darkest blocks therefore have the greatest impact on performance. By default, the flame graph generated by 0x uses the * and ~ symbols to distinguish functions that V8 has and has not optimized; click Merge to combine them (for details please refer to the 0x UI):
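To build intuition for what a wide block means, the contrived function below (not from the project) would show up as a wide, dark frame in a flame graph, because every call burns CPU synchronously on the main thread:

```javascript
// a deliberately hot synchronous function: frames like this dominate a
// flame graph because each call blocks the event loop while it runs
function slowHash(input) {
  let h = 0;
  for (let i = 0; i < 1e6; i++) {
    h = (h * 31 + input.charCodeAt(i % input.length)) | 0;
  }
  return h;
}

const start = process.hrtime.bigint();
slowHash('/some/request/path');
const micros = Number(process.hrtime.bigint() - start) / 1e3;
console.log(`one call took ~${micros.toFixed(0)} µs of pure CPU time`);
```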


The flame graph shows that Sequelize's Model#findOne is called heavily by express-session, so we infer that connect-session-sequelize, used as the express-session store, is the source of the bottleneck. Let's try using Redis as the session cache to optimize:

$ yarn add redis connect-redis  # install redis and connect-redis locally
# ...
info Direct dependencies
├─ [email protected]
└─ [email protected]
# ...

# start a Redis service from the redis:6.0.6 image and name it redis6
$ docker run -p 6379:6379 -d --name redis6 redis:6.0.6
// src/config/index.js
// ...
const config = {
  //Default configuration
  default: {
    // ...
+
+    redisOptions: {
+      host: 'localhost',
+      port: 6379,
+    },
  },
  // ...
};
// ...
# .env
LOG_LEVEL='debug'

GRPC_TRACE='all'
GRPC_VERBOSITY='DEBUG'
+
+WITH_REDIS=1
// src/middlewares/session.js
const session = require('express-session');
const sessionSequelize = require('connect-session-sequelize');
+const redis = require('redis');
+const sessionRedis = require('connect-redis');
const { sequelize } = require('../models');
-const { sessionCookieSecret, sessionCookieMaxAge } = require('../config');
+const {
+  redisOptions,
+  sessionCookieSecret,
+  sessionCookieMaxAge,
+} = require('../config');
+const { name } = require('../../package.json');

module.exports = function sessionMiddleware() {
-  const SequelizeStore = sessionSequelize(session.Store);
-
-  const store = new SequelizeStore({
-    db: sequelize,
-    modelKey: 'Session',
-    tableName: 'session',
-  });
+  let store;
+
+  if (process.env.WITH_REDIS) {
+    const client = redis.createClient(redisOptions);
+    const RedisStore = sessionRedis(session);
+    store = new RedisStore({ client, prefix: name });
+  } else {
+    const SequelizeStore = sessionSequelize(session.Store);
+    store = new SequelizeStore({
+      db: sequelize,
+      modelKey: 'Session',
+      tableName: 'session',
+    });
+  }

  return session({
    secret: sessionCookieSecret,
    cookie: { maxAge: sessionCookieMaxAge },
    store,
    resave: false,
    proxy: true,
    saveUninitialized: false,
  });
};

Retest RPS:


# launch an RPS test against the server home page
$ yarn autocannon -H 'Cookie: connect.sid=s%3As4C-tLD4Xf8q3zjmtQ3k4B62mTZZNzfw.OdVLDs08H1YCXyZXIBgRuzE%2FE%2FH1BwDH4ynxEBNlKkg' http://localhost:9000/
# ...
Running 10s test @ http://localhost:9000/
10 connections

┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬─────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max     │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼─────────┤
│ Latency │ 6 ms │ 9 ms │ 18 ms │ 20 ms │ 9.35 ms │ 3.25 ms │ 34.4 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴─────────┘
┌───────────┬────────┬────────┬────────┬─────────┬─────────┬────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%    │ 97.5%   │ Avg     │ Stdev  │ Min    │
├───────────┼────────┼────────┼────────┼─────────┼─────────┼────────┼────────┤
│ Req/Sec   │ 720    │ 720    │ 1035   │ 1190    │ 1014.82 │ 160.17 │ 720    │
├───────────┼────────┼────────┼────────┼─────────┼─────────┼────────┼────────┤
│ Bytes/Sec │ 692 kB │ 692 kB │ 995 kB │ 1.14 MB │ 975 kB  │ 154 kB │ 692 kB │
└───────────┴────────┴────────┴────────┴─────────┴─────────┴────────┴────────┘

Req/Bytes counts sampled once per second.

11k requests in 11.04s, 10.7 MB read

The average RPS rises to 1014.82, which indicates the inference was correct. Now use Node.js's built-in cluster mechanism to improve CPU utilization and optimize further. Take care to set the number of workers reasonably: too many workers reduce performance because the operating system has to schedule them too frequently:

# .env
LOG_LEVEL='debug'

GRPC_TRACE='all'
GRPC_VERBOSITY='DEBUG'

WITH_REDIS=1
+
+CLUSTERING=2
// src/server.js
+const os = require('os');
+const cluster = require('cluster');
const express = require('express');
const { resolve } = require('path');
const { promisify } = require('util');
const initMiddlewares = require('./middlewares');
const initControllers = require('./controllers');
const initSchedules = require('./schedules');
const initRpc = require('./rpc');
const logger = require('./utils/logger');

// ...
async function bootstrap() {
  // ...
}
// ...

-bootstrap();
+const useCluster = Boolean(process.env.CLUSTERING);
+
+if (useCluster && cluster.isMaster) {
+  const forkCount = parseInt(process.env.CLUSTERING) || os.cpus().length;
+
+  for (let i = 0, n = forkCount; i < n; i++) {
+    cluster.fork();
+  }
+
+  cluster.on('online', (worker) => {
+    logger.info(`> Worker ${worker.process.pid} is running`);
+  });
+
+  cluster.on('exit', (worker) => {
+    logger.info(`> Worker ${worker.process.pid} exited`);
+    process.exit(worker.process.exitCode);
+  });
+} else {
+  bootstrap();
+}

Retest RPS:

# launch an RPS test against the server home page
$ yarn autocannon -H 'Cookie: connect.sid=s%3As4C-tLD4Xf8q3zjmtQ3k4B62mTZZNzfw.OdVLDs08H1YCXyZXIBgRuzE%2FE%2FH1BwDH4ynxEBNlKkg' http://localhost:9000/
# ...
Running 10s test @ http://localhost:9000/
10 connections

┌─────────┬──────┬──────┬───────┬───────┬─────────┬─────────┬──────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%   │ Avg     │ Stdev   │ Max      │
├─────────┼──────┼──────┼───────┼───────┼─────────┼─────────┼──────────┤
│ Latency │ 4 ms │ 7 ms │ 14 ms │ 17 ms │ 7.77 ms │ 2.74 ms │ 44.46 ms │
└─────────┴──────┴──────┴───────┴───────┴─────────┴─────────┴──────────┘
┌───────────┬────────┬────────┬─────────┬─────────┬─────────┬────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%     │ 97.5%   │ Avg     │ Stdev  │ Min    │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼────────┼────────┤
│ Req/Sec   │ 1009   │ 1009   │ 1209    │ 1430    │ 1208.41 │ 117.95 │ 1009   │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼────────┼────────┤
│ Bytes/Sec │ 970 kB │ 970 kB │ 1.16 MB │ 1.38 MB │ 1.16 MB │ 114 kB │ 970 kB │
└───────────┴────────┴────────┴─────────┴─────────┴─────────┴────────┴────────┘

Req/Bytes counts sampled once per second.

12k requests in 10.04s, 11.6 MB read

The average RPS reaches 1208.41 and concurrency is now at the thousand level, which meets expectations. Next, look at the RPS of an interface that queries the database by testing /api/shop:

# launch an RPS test against the server's /api/shop endpoint
$ yarn autocannon -H 'Cookie: connect.sid=s%3ArGA44wXyem3dChGhc4PTIgAnyUJ8Dj2N.7hd9jyemRgD8CskqEUSjTGSSl%2FguJsKaAdienAyO7O8' http://localhost:9000/api/shop
# ...
Running 10s test @ http://localhost:9000/api/shop
10 connections

┌─────────┬───────┬───────┬───────┬───────┬──────────┬────────┬──────────┐
│ Stat    │ 2.5%  │ 50%   │ 97.5% │ 99%   │ Avg      │ Stdev  │ Max      │
├─────────┼───────┼───────┼───────┼───────┼──────────┼────────┼──────────┤
│ Latency │ 15 ms │ 23 ms │ 41 ms │ 46 ms │ 24.18 ms │ 6.7 ms │ 85.78 ms │
└─────────┴───────┴───────┴───────┴───────┴──────────┴────────┴──────────┘
┌───────────┬────────┬────────┬────────┬────────┬────────┬─────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%    │ 97.5%  │ Avg    │ Stdev   │ Min    │
├───────────┼────────┼────────┼────────┼────────┼────────┼─────────┼────────┤
│ Req/Sec   │ 361    │ 361    │ 399    │ 441    │ 404.9  │ 26.06   │ 361    │
├───────────┼────────┼────────┼────────┼────────┼────────┼─────────┼────────┤
│ Bytes/Sec │ 312 kB │ 312 kB │ 345 kB │ 381 kB │ 349 kB │ 22.5 kB │ 312 kB │
└───────────┴────────┴────────┴────────┴────────┴────────┴─────────┴────────┘

Req/Bytes counts sampled once per second.

4k requests in 10.05s, 3.49 MB read

The average RPS of /api/shop is 404.9, which is close to the upper concurrency limit that a single SQLite database can provide.

Considering that the Node.js cluster mechanism brings a lot of inconvenience to debugging and profiling, disable clustering in the inspect and profile entries:

// package.json
{
  "name": "13-debugging-and-profiling",
  "version": "1.0.0",
  "scripts": {
    "start": "node -r ./scripts/env src/server.js",
-    "start:inspect": "node --inspect-brk -r ./scripts/env src/server.js",
+    "start:inspect": "cross-env CLUSTERING='' node --inspect-brk -r ./scripts/env src/server.js",
-    "start:profile": "0x -- node -r ./scripts/env src/server.js",
+    "start:profile": "cross-env CLUSTERING='' 0x -- node -r ./scripts/env src/server.js",
    "start:prod": "cross-env NODE_ENV=production node -r ./scripts/env src/server.js",
    "sequelize": "sequelize",
    "sequelize:prod": "cross-env NODE_ENV=production sequelize",
    "build:yup": "rollup node_modules/yup -o src/moulds/yup.js -p @rollup/plugin-node-resolve,@rollup/plugin-commonjs,rollup-plugin-terser -f umd -n 'yup'"
  },
  // ...
}

Source code of this chapter

host1-tech/nodejs-server-examples – 13-debugging-and-profiling

Read more

Build a Node.js Enterprise Web Server from Scratch (0): static services
Build a Node.js Enterprise Web Server from Scratch (1): interface and layering
Build a Node.js Enterprise Web Server from Scratch (2): verification
Build a Node.js Enterprise Web Server from Scratch (3): middleware
Build a Node.js Enterprise Web Server from Scratch (4): exception handling
Build a Node.js Enterprise Web Server from Scratch (5): database access
Build a Node.js Enterprise Web Server from Scratch (6): session
Build a Node.js Enterprise Web Server from Scratch (7): authentication and login
Build a Node.js Enterprise Web Server from Scratch (8): network security
Build a Node.js Enterprise Web Server from Scratch (9): configuration items
Build a Node.js Enterprise Web Server from Scratch (10): log
Build a Node.js Enterprise Web Server from Scratch (11): timed tasks
Build a Node.js Enterprise Web Server from Scratch (12): remote call
Build a Node.js Enterprise Web Server from Scratch (13): breakpoint debugging and performance analysis
Build a Node.js Enterprise Web Server from Scratch (14): automated testing
Build a Node.js Enterprise Web Server from Scratch (15): summary and prospect
