Wind Field Visualization: Plotting Trajectories

Time: 2022-11-24

Primer

After learning how to draw particles, let's move on to how to draw particle trajectories.

Drawing trails

It is mentioned in the original post that the way to draw the trails is to draw the particles into a texture, then use that texture as a background (slightly darkened) on the next frame, and swap the input/target textures every frame. This leans on two WebGL techniques used below: rendering to a texture through a framebuffer, and ping-pong swapping between a pair of textures.

Building on draw particles, the main additions to the logic are:

  • During initialization, a background texture B and a screen texture S are added.
  • When creating the per-particle data, two particle state textures T0 and T1 are stored.
  • When drawing, first render the background texture B and then, positioned by state texture T0, all the particles into the screen texture S; then draw S to the canvas, and keep S as the background texture B of the next frame.
  • Finally, render the updated particle state derived from T0 into texture T1, swap it into T0's place, and start drawing the next frame.
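Stringing the list together, one frame looks structurally like this (a sketch only; WindGL is an assumed class name, and both methods are detailed in the sections below):

class WindGL {
  // ... setup from the sections below ...
  draw() {
    this.drawScreen();      // B + particles (from T0) -> S; show S; S becomes the next B
    this.updateParticles(); // advance T0 -> T1 via the update shader, then swap them
  }
}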

For a particle trail effect without random particle regeneration, see the example; now let's look at the concrete implementation.

texture

Add texture-related logic:

// code omitted
resize() {
  const gl = this.gl;
  const emptyPixels = new Uint8Array(gl.canvas.width * gl.canvas.height * 4);
  // screen textures to hold the drawn screen for the previous and the current frame
  this.backgroundTexture = util.createTexture(gl, gl.NEAREST, emptyPixels, gl.canvas.width, gl.canvas.height);
  this.screenTexture = util.createTexture(gl, gl.NEAREST, emptyPixels, gl.canvas.width, gl.canvas.height);
}
// code omitted

The background texture and screen texture are initialized from the width and height of the canvas, and likewise store 4 components (RGBA) per pixel.
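util.createTexture itself is not shown in the post; as an assumption matching the calls above (filter, pixel data, width, height), a typical implementation looks roughly like this:

function createTexture(gl, filter, data, width, height) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // clamp so sampling never wraps around the canvas edges
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, filter);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, filter);
  // 4 bytes per pixel (RGBA), matching the emptyPixels buffer above
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, data);
  gl.bindTexture(gl.TEXTURE_2D, null);
  return texture;
}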

screen shader program

Add a new screen shader program object; this program is responsible for drawing the final visible content:

this.screenProgram = util.createProgram(gl, quadVert, screenFrag);
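util.createProgram is not listed in the post either; the sketch below is an assumption consistent with the later usage of program.program, program.a_pos and program.u_screen: link the two shaders, then collect every active attribute and uniform location onto a plain wrapper object.

function createProgram(gl, vertexSource, fragmentSource) {
  const compile = (type, source) => {
    const shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
      throw new Error(gl.getShaderInfoLog(shader));
    }
    return shader;
  };
  const program = gl.createProgram();
  gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSource));
  gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSource));
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(program));
  }
  const wrapper = { program };
  // expose locations as wrapper.a_pos, wrapper.u_screen, wrapper.u_opacity, ...
  const numAttributes = gl.getProgramParameter(program, gl.ACTIVE_ATTRIBUTES);
  for (let i = 0; i < numAttributes; i++) {
    const attribute = gl.getActiveAttrib(program, i);
    wrapper[attribute.name] = gl.getAttribLocation(program, attribute.name);
  }
  const numUniforms = gl.getProgramParameter(program, gl.ACTIVE_UNIFORMS);
  for (let i = 0; i < numUniforms; i++) {
    const uniform = gl.getActiveUniform(program, i);
    wrapper[uniform.name] = gl.getUniformLocation(program, uniform.name);
  }
  return wrapper;
}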

vertex data

The vertex-related logic:

// code omitted
  this.quadBuffer = util.createBuffer(gl, new Float32Array([0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1]));
// code omitted
  util.bindAttribute(gl, this.quadBuffer, program.a_pos, 2);
// code omitted
  gl.drawArrays(gl.TRIANGLES, 0, 6);
// code omitted

As can be seen here, the vertex data is read two components per vertex, six vertices in total, drawing a rectangle out of two triangles. As for why the coordinates are all 0s and 1s, look at the shader below.

vertex shader

The newly added vertex shader and the code that binds its variables:

const quadVert = `
  precision mediump float;

  attribute vec2 a_pos;

  varying vec2 v_tex_pos;

  void main() {
      v_tex_pos = a_pos;
      gl_Position = vec4(1.0 - 2.0 * a_pos, 0, 1);
  }
`;
// code omitted
this.drawTexture(this.backgroundTexture, this.fadeOpacity);
// code omitted
drawTexture(texture, opacity) {
  // code omitted
  util.bindAttribute(gl, this.quadBuffer, program.a_pos, 2);
  // code omitted
  gl.drawArrays(gl.TRIANGLES, 0, 6);
}
// code omitted

From these scattered pieces of logic, we can trace the actual values behind the shader variables:

  • a_pos: the two-dimensional coordinates of each vertex in quadBuffer.
  • v_tex_pos: equal to a_pos; it is passed on to the corresponding fragment shader.

Here, combining gl_Position = vec4(1.0 - 2.0 * a_pos, 0, 1) with the 0/1 vertex coordinates above, the computed result ranges over [-1.0, +1.0], exactly filling the clip space.
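A quick check of that mapping for the quad's corner coordinates (plain JS, purely illustrative):

const toClip = (pos) => pos.map((v) => 1 - 2 * v); // mirrors 1.0 - 2.0 * a_pos
console.log(toClip([0, 0])); // [1, 1]
console.log(toClip([1, 1])); // [-1, -1], so the quad spans the full clip space, flipped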

Fragment shader

The fragment shader and its bound variables:

const screenFrag = `
  precision mediump float;

  uniform sampler2D u_screen;
  uniform float u_opacity;

  varying vec2 v_tex_pos;

  void main() {
      vec4 color = texture2D(u_screen, 1.0 - v_tex_pos);
      // a hack to guarantee opacity fade out even with a value close to 1.0
      gl_FragColor = vec4(floor(255.0 * color * u_opacity) / 255.0);
  }
`;
this.fadeOpacity = 0.996;
// code omitted
drawTexture(texture, opacity) {
  // code omitted
  gl.uniform1i(program.u_screen, 2);
  gl.uniform1f(program.u_opacity, opacity);

  gl.drawArrays(gl.TRIANGLES, 0, 6);
}

From these scattered pieces of logic, we can trace the actual values behind the shader variables:

  • u_screen: the texture to draw; which texture it actually is changes dynamically with the call site.
  • u_opacity: the opacity, likewise determined by the call site.
  • v_tex_pos: passed from the vertex shader, i.e. the quadBuffer data.

1.0 - v_tex_pos ranges over [0, 1], exactly covering the whole texture (and undoing the flip introduced in the vertex shader). Multiplying the final color by the dynamic u_opacity is what produces the "slightly darkened" effect mentioned in the original post.
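Why the floor() hack in the shader? With 8-bit textures a dim pixel multiplied by 0.996 would round back to its old value, so the trail would never fully disappear. A small plain-JS simulation (my own illustration, not from the post) shows the difference on a single channel value:

const fadeRounded = (byte) => Math.round(byte * 0.996); // what plain multiplication does after 8-bit storage
const fadeFloored = (byte) => Math.floor(byte * 0.996); // what the shader's floor() enforces
let a = 5, b = 5;
for (let i = 0; i < 10; i++) { a = fadeRounded(a); b = fadeFloored(b); }
console.log(a, b); // 5 0: only the floored version fades out completely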

Update shader program

Add a new update shader program object; it is the key to making the particles trace out movement trajectories:

this.updateProgram = util.createProgram(gl, quadVert, updateFrag);

vertex data

Shares the same vertex data as the screen shader program.

vertex shader

Shares the same vertex shader as the screen shader program.

Fragment shader

The update fragment shader and its bound variables:

const updateFrag = `
  precision highp float;

  uniform sampler2D u_particles;
  uniform sampler2D u_wind;
  uniform vec2 u_wind_res;
  uniform vec2 u_wind_min;
  uniform vec2 u_wind_max;

  varying vec2 v_tex_pos;

  // wind speed lookup; use manual bilinear filtering based on 4 adjacent pixels for smooth interpolation
  vec2 lookup_wind(const vec2 uv) {
      // return texture2D(u_wind, uv).rg; // lower-res hardware filtering
      vec2 px = 1.0 / u_wind_res;
      vec2 vc = (floor(uv * u_wind_res)) * px;
      vec2 f = fract(uv * u_wind_res);
      vec2 tl = texture2D(u_wind, vc).rg;
      vec2 tr = texture2D(u_wind, vc + vec2(px.x, 0)).rg;
      vec2 bl = texture2D(u_wind, vc + vec2(0, px.y)).rg;
      vec2 br = texture2D(u_wind, vc + px).rg;
      return mix(mix(tl, tr, f.x), mix(bl, br, f.x), f.y);
  }

  void main() {
      vec4 color = texture2D(u_particles, v_tex_pos);
      vec2 pos = vec2(
          color.r / 255.0 + color.b,
          color.g / 255.0 + color.a); // decode particle position from pixel RGBA

      vec2 velocity = mix(u_wind_min, u_wind_max, lookup_wind(pos));

      // take EPSG:4326 distortion into account for calculating where the particle moved
      float distortion = cos(radians(pos.y * 180.0 - 90.0));
      vec2 offset = vec2(velocity.x / distortion, -velocity.y) * 0.0001 * 0.25;

      // update particle position, wrapping around the date line
      pos = fract(1.0 + pos + offset);

      // encode the new particle position back into RGBA
      gl_FragColor = vec4(
          fract(pos * 255.0),
          floor(pos * 255.0) / 255.0);
  }
`;
// code omitted
setWind(windData) {
  // The source data of the wind field image
  this.windData = windData;
}
// code omitted
util.bindTexture(gl, this.windTexture, 0);
util.bindTexture(gl, this.particleStateTexture0, 1);
// code omitted
this.updateParticles();
// code omitted
updateParticles() {
  // code omitted
  const program = this.updateProgram;
  gl.useProgram(program.program);

  util.bindAttribute(gl, this.quadBuffer, program.a_pos, 2);

  gl.uniform1i(program.u_wind, 0); // wind texture
  gl.uniform1i(program.u_particles, 1); // particle texture

  gl.uniform2f(program.u_wind_res, this.windData.width, this.windData.height);
  gl.uniform2f(program.u_wind_min, this.windData.uMin, this.windData.vMin);
  gl.uniform2f(program.u_wind_max, this.windData.uMax, this.windData.vMax);

  gl.drawArrays(gl.TRIANGLES, 0, 6);
  // code omitted
}
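util.bindTexture is likewise not shown; presumably it just pairs a texture object with a texture unit, so that uniform1i calls such as the ones above can refer to a texture by its index:

function bindTexture(gl, texture, unit) {
  gl.activeTexture(gl.TEXTURE0 + unit); // unit 0 holds u_wind, unit 1 holds u_particles above
  gl.bindTexture(gl.TEXTURE_2D, texture);
}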

From these scattered pieces of logic, we can trace the actual values behind the shader variables:

  • u_wind: the texture windTexture generated from the wind field image.
  • u_particles: the texture particleStateTexture0 that stores every particle's state as colors.
  • u_wind_res: the width and height of the wind image.
  • u_wind_min: the minimum values of the wind velocity components.
  • u_wind_max: the maximum values of the wind velocity components.

Following the quadBuffer vertex data, each fragment reads the pixel at the corresponding position of the texture particleStateTexture0 and decodes the particle position from that pixel. The lookup_wind method then smoothly interpolates the wind over the 4 adjacent pixels, the offset is computed from the wind field's minimum and maximum values, and finally the new position is encoded back into a color and output. Two key questions arise along the way:

  • How to get 4 adjacent pixels?
  • In a two-dimensional map, how are polar and equatorial particles distinguished?
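Before digging into these questions, the position encoding deserves a note: each coordinate in [0, 1) is split into a low byte and a high byte across two RGBA channels, giving roughly 16-bit precision instead of 8. A plain-JS mirror of the shader's encode/decode (my own sketch; it ignores the 8-bit quantization that happens when the fragment color is stored):

const encode = (pos) => [(pos * 255) % 1, Math.floor(pos * 255) / 255]; // the fract/floor pair
const decode = ([low, high]) => low / 255 + high; // matches color.r / 255.0 + color.b
console.log(decode(encode(0.123456))); // ≈ 0.1234559, far finer than 1/255 steps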

How to get 4 adjacent pixels?

Look at the main method:

vec2 lookup_wind(const vec2 uv) {
  vec2 px = 1.0 / u_wind_res;
  vec2 vc = (floor(uv * u_wind_res)) * px;
  vec2 f = fract(uv * u_wind_res);
  vec2 tl = texture2D(u_wind, vc).rg;
  vec2 tr = texture2D(u_wind, vc + vec2(px.x, 0)).rg;
  vec2 bl = texture2D(u_wind, vc + vec2(0, px.y)).rg;
  vec2 br = texture2D(u_wind, vc + px).rg;
  return mix(mix(tl, tr, f.x), mix(bl, br, f.x), f.y);
}
  • From the width and height of the wind image, compute the size of one pixel, px, as the basic unit.
  • In these units, flooring uv gives the approximate position vc as the 1st reference point; moving by the single component px.x gives the 2nd reference point;
  • moving by the single component px.y gives the 3rd reference point, and moving by the full unit px gives the 4th reference point.
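In other words, this is the manual bilinear filtering named in the shader comment: two linear mixes along x, then one mix along y. A numeric check with made-up corner values:

const mix = (a, b, t) => a * (1 - t) + b * t; // same definition as GLSL mix()
const [tl, tr, bl, br] = [10, 20, 30, 40];    // invented wind values at the 4 reference points
const f = { x: 0.25, y: 0.5 };                // fractional position inside the cell
const top = mix(tl, tr, f.x);                 // 12.5
const bottom = mix(bl, br, f.x);              // 32.5
console.log(mix(top, bottom, f.y));           // 22.5, the smoothly blended wind value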

In a two-dimensional map, how are polar and equatorial particles distinguished?

As the original post puts it:

Near the poles, particles should move much faster along the X axis than at the equator, since the same longitude represents much smaller distances.

Corresponding processing logic:

float distortion = cos(radians(pos.y * 180.0 - 90.0));
vec2 offset = vec2(velocity.x / distortion, -velocity.y) * 0.0001 * 0.25;

The radians function converts degrees to radians, and pos.y * 180.0 - 90.0 maps the texture's y coordinate in [0, 1] to a latitude in [-90°, 90°]. Over that range the cosine is 1 at the equator and falls towards 0 at the poles, so dividing velocity.x by distortion makes the first component of offset grow near the poles, and the particles there appear to move faster along X. The minus sign on the second component presumably matches the image texture, whose Y axis is flipped by default.
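A quick numeric check of the distortion factor (plain JS; pos.y in [0, 1] maps to latitude [-90°, 90°]):

const distortion = (posY) => Math.cos(((posY * 180 - 90) * Math.PI) / 180);
console.log(distortion(0.5));  // 1 at the equator: the x velocity passes through unchanged
console.log(distortion(0.99)); // ≈ 0.031 near the pole: dividing by it makes the X offset over 30x larger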

draw

The drawing part changes quite a bit:

draw() {
    // code omitted
    this.drawScreen();
    this.updateParticles();
  }
  drawScreen() {
    const gl = this.gl;
    // draw the screen into a temporary framebuffer to retain it as the background on the next frame
    util.bindFramebuffer(gl, this.framebuffer, this.screenTexture);
    gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);

    this.drawTexture(this.backgroundTexture, this.fadeOpacity);
    this.drawParticles();

    util.bindFramebuffer(gl, null);
    // enable blending to support drawing on top of an existing background (e.g. a map)
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
    this.drawTexture(this.screenTexture, 1.0);
    gl.disable(gl.BLEND);

    // save the current screen as the background for the next frame
    const temp = this.backgroundTexture;
    this.backgroundTexture = this.screenTexture;
    this.screenTexture = temp;
  }
  drawTexture(texture, opacity) {
    const gl = this.gl;
    const program = this.screenProgram;
    gl.useProgram(program.program);
    // code omitted
    gl.drawArrays(gl.TRIANGLES, 0, 6);
  }
  drawParticles() {
    const gl = this.gl;
    const program = this.drawProgram;
    gl.useProgram(program.program);
    // code omitted
    gl.drawArrays(gl.POINTS, 0, this._numParticles);
  }
  updateParticles() {
    const gl = this.gl;
    util.bindFramebuffer(gl, this.framebuffer, this.particleStateTexture1);
    gl.viewport(
      0,
      0,
      this.particleStateResolution,
      this.particleStateResolution
    );

    const program = this.updateProgram;
    gl.useProgram(program.program);
    // code omitted
    gl.drawArrays(gl.TRIANGLES, 0, 6);

    // swap the particle state textures so the new one becomes the current one
    const temp = this.particleStateTexture0;
    this.particleStateTexture0 = this.particleStateTexture1;
    this.particleStateTexture1 = temp;
  }
  • First switch to the framebuffer with screenTexture as its target; note that nothing drawn from this point is visible yet. Draw the whole background texture backgroundTexture and, positioned by the state texture particleStateTexture0, every individual particle; then unbind the framebuffer. The results of this pass are stored in screenTexture.
  • Switch to the default color buffer; from here the drawing is visible. Enable alpha blending; with the two blendFunc parameters set this way, whatever is drawn later covers what was drawn earlier where they overlap. Then draw the whole texture screenTexture, which puts everything rendered into the framebuffer onto the canvas.
  • After that, a temporary variable swaps the two textures: backgroundTexture now holds the freshly rendered content and serves as the background of the next frame.
  • Then switch to the framebuffer again to update the particle state, this time targeting particleStateTexture1 (again, nothing drawn here is visible). Based on particleStateTexture0, draw the offset state; the whole result is stored in particleStateTexture1.
  • Once that pass completes, another swap makes particleStateTexture0 hold the moved particle state, which the next frame draws from. Rendering frame after frame like this produces the animated effect.
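The post does not show the outer loop; presumably something as small as this keeps the frames coming (wind being a hypothetical instance of the class above):

const frame = () => {
  wind.draw();                  // drawScreen() + updateParticles(), as described above
  requestAnimationFrame(frame); // schedule the next frame
};
requestAnimationFrame(frame);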

Confusion

It feels like everything adds up, yet a few points are still not fully clear.

Why does the offset use the calculation in lookup_wind?

The original post says it is for smooth interpolation, but what is the mathematical principle behind it? And why apply mix once more at the end? I haven't found a better explanation personally.

References