Key elements of smart city traffic: an intersection supervision visualization system solution

Time:2020-11-12

Preface

With the development of the information age, the smart city has gradually entered real life and greatly facilitated daily management and maintenance. In the smart city context, the intelligent traffic supervision visualization system is an important component. Through networks of road monitoring, city management and control is extended, and real-time data, equipment status and video surveillance are all extremely important. Among them, video surveillance has long been a mainstay, developing hand in hand with the Internet and the Internet of Things. China's "Skynet" project emerged in this context: it is a city-wide monitoring system comprising not a single camera device but a full 170 million surveillance cameras, with another 400 million planned for installation over the next three years. As the artery of urban development, traffic is closely tied to people's daily lives. Under this kind of supervision, traffic monitoring has become part of a "public security video surveillance system" closely related to everyday safety management.

The main modes of urban transportation are reflected in urban road facilities, public transportation and rail transit. However, with accelerating urbanization and economic and social development, the number of motor vehicles has grown rapidly and urban traffic problems have become increasingly serious. To ease these problems, a variety of solutions have been adopted, such as signal light control, intersection checkpoint monitoring and video surveillance systems. These have had some effect, but each system solves its own problem independently and cannot exercise comprehensive control over the overall traffic situation. An urban intelligent transportation management system can address this problem well. Hightopo (hereinafter HT) offers rich 2D and 3D configuration options in its HT for Web product; this article introduces how to use HT's 2D/3D configuration capabilities to build an intersection monitoring system solution.

This paper will introduce the realization process of intelligent traffic visualization monitoring system from the following aspects:

1. Vehicle generation and operation under traffic light control;

2. Camera simulation and real-scene monitoring;

3. Real-time data and maintenance of intersection monitoring information.

Interface introduction and effect preview

The intelligent traffic monitoring system presents a scene of real-time vehicle movement under traffic-light control. A controllable weather-switch panel lets the operator simulate how the intersection operates under various weather conditions. Camera monitoring is also implemented: clicking any camera at the intersection shows the real-time traffic within that camera's monitoring range, providing a simulated surveillance view. In addition, many real-time data monitoring panels are added to maximize the benefit of real-time intersection monitoring.


System analysis

Through the combination of rich 2D and 3D configuration, HT has worked out many solutions for the industrial Internet. Against the backdrop of smart city promotion, a visualization decision-making system for intelligent traffic management is also an extremely important part. For monitoring roads and intersections on the large screens of a traffic management center, the 3D scene can use the lightweight HTML5/WebGL modeling scheme of the HT for Web product, achieving rapid modeling, lightweight operation and maintenance, and even 3D visual O&M from a mobile browser; vector icons on 2D drawings do not distort when zoomed in, supporting large-screen, multi-screen, super-resolution and other display scenarios.

1. Integrated situation monitoring

Integrate data from the geographic information system, the video monitoring system and traffic management departments to comprehensively monitor traffic conditions, vehicle flow, accident-handling reports and other elements, and support clicking through to detailed information on specific police forces, mobile targets, traffic incidents and monitoring video, helping managers grasp the overall traffic situation in real time.


2. Traffic basic resources monitoring

It supports monitoring and visual management of the quantity, spatial distribution, real-time status and other information of basic traffic resources such as cameras, flow-detection equipment and traffic lights; supports detailed equipment information queries and alarms for equipment that is not working normally; strengthens managers' perception of equipment status; and improves the operation and maintenance efficiency of transportation infrastructure.


3. Video inspection and monitoring

It supports integration with the front-end video inspection system and effectively combines video intelligent analysis, intelligent positioning and intelligent research-and-judgment technology to visually monitor road congestion points, hidden-danger points, accident points, etc., realizing real-time alarms and rapid display of abnormal events and intelligently retrieving monitoring video around abnormal points, which effectively improves alarm receiving and handling efficiency.


4. Intersection signal light monitoring

It supports the integration of intersection signal lights, video monitoring and other system data, monitoring in real time the intersection's traffic flow, flow rate, vehicle and road abnormal events, signal light status and other information. Combined with professional model algorithms, it can compare the best historical speed and traffic volume and visually analyze and judge the intersection's traffic situation, providing a scientific basis for decisions on signal-timing optimization and intersection traffic organization, effectively improving traffic operation efficiency.


5. Analysis of illegal cases

Fully integrate the existing data resources of the traffic control department, provide a variety of visual analysis and interactive means, conduct visual serial and parallel analysis of massive historical violation case data, and deeply mine the temporal and spatial distribution patterns of cases, supporting the traffic control department in cause analysis, active prevention and other business applications.


Code implementation

1. Vehicle generation and operation under traffic light control

In the simulated intersection system, many vehicles run in the scene; their models are loaded dynamically and they are animated along pipelines. Of course, these vehicles need to comply with the signal lights, so some control logic is needed to make them obey the rules.


The following pseudo code sets some basic vehicle attributes and loads the corresponding vehicle model according to the type parameter:

loadCar(type) {
    //Create a new vehicle node
    let car = new ht.Node();
    //Create and load the corresponding vehicle model according to the vehicle type
    switch (type) {
        case 'familyCar':
            car.s('shape3d', 'models/HT model library/traffic/vehicle/family car.json');
            break;
        case 'truck':
            car.s('shape3d', 'models/HT model library/traffic/vehicle/truck.json');
            break;
        case 'jeep':
            car.s('shape3d', 'models/HT model library/traffic/vehicle/jeep.json');
            break;
            break;
        ...
        default:
            console.log('NO THIS TYPE CAR!');
            break;
    }
    //Set the vehicle to be unselectable and immovable
    car.s({
        '3d.selectable': false,
        '3d.movable': false
    });
    //Set the anchor - the head of the car
    car.setAnchor3d(1, 0, 0.5); 
    //Set initial position
    car.setPosition3d(0, 100000, 0);

    let typeIndex = 1;
    //Determine if this type of vehicle has been generated before
    this.g3dDm.each(data => {
        if (data.getTag() === type + typeIndex) {
            typeIndex++;
        }
    })
    //Set vehicle node label
    car.setTag(type + typeIndex);
    //Set the name of the vehicle node
    car.setDisplayName(type);
    //Add the vehicle node to the data model
    this.g3dDm.add(car);
}

Driven by pipeline animation, the generated vehicles drive under signal-light control. The first step is to draw a vehicle-running pipeline with ht.Polyline, which has both 2D and 3D forms and inherits from ht.Shape; when its shape3d is set to 'cylinder', it renders as a 3D pipeline. After encapsulating the vehicle, we complete the pipeline animation through ht.Default.startAnim(), driving the vehicles along the pipeline to achieve the animation effect.


Pipeline animation is implemented by wrapping ht.Default.startAnim() in a move function that moves a node smoothly along a path. Its parameters are:

  • node: the animation node;
  • path: the running path;
  • duration: the animation duration;
  • animParams: additional animation parameters.

After drawing the route pipeline, ht.Default.getLineCacheInfo() retrieves the cached point and segment information of the pipeline; ht.Default.getLineLength() then obtains the pipeline's length from this cache, and ht.Default.getLineOffset() obtains the offset information at a given proportion along the line or pipe. Combined with node.lookAtX() to orient the node toward its next position, setting the node's position each frame achieves smooth movement along the path.

move(node, path, duration = 20000, animParams) {
    // path._cache_ holds the pipeline's cached point information
    let cache = path._cache_; 
    // If there is no cached information yet, compute and store it on path._cache_
    if (!cache) {
        cache = path._cache_ = ht.Default.getLineCacheInfo(path.getPoints(), path.getSegments());
    }
    //Gets the length of the pipeline cache information
    const len = ht.Default.getLineLength(cache);
    //Set animation object initialization
    animParams = animParams || {};
    //Set action to animparams animation execution function
    const action = animParams.action;
    //Animation execution part
    animParams.action = (v, t) => {
        //Get the offset information of pipe motion
        const offset = ht.Default.getLineOffset(cache, len * v);
        //Gets the point at the offset
        const point = offset.point;
        //Orient the node toward the next point along the path
        node.lookAtX([point.x, point.y, point.z], 'forward');
        //Set the location of the node
        node.p3(point.x, point.y, point.z);
        //Invoke the caller-supplied action callback, if one was provided
        if (action) action(v, t);
    };
    //Loop call animation execution function
    return loop(animParams.action, duration);
}

//Loop animation function
loop(action, duration) {
    return ht.Default.startAnim({
        duration: duration,
        action: action
    });
}
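Both getLineCacheInfo() and getLineOffset() are HT internals. To make the idea concrete, here is a self-contained sketch of what such arc-length sampling conceptually computes for a 2D polyline; the names (lineLength, lineOffsetPoint) are illustrative and not HT's API.

```javascript
// Hypothetical sketch of arc-length sampling along a 2D polyline:
// total length, and the point lying at a given distance along the path.
function segmentLength(a, b) {
    return Math.hypot(b.x - a.x, b.y - a.y);
}

function lineLength(points) {
    let len = 0;
    for (let i = 1; i < points.length; i++) {
        len += segmentLength(points[i - 1], points[i]);
    }
    return len;
}

// offset: distance along the polyline, in [0, lineLength(points)]
function lineOffsetPoint(points, offset) {
    for (let i = 1; i < points.length; i++) {
        const segLen = segmentLength(points[i - 1], points[i]);
        if (offset <= segLen) {
            const t = segLen === 0 ? 0 : offset / segLen;
            return {
                x: points[i - 1].x + (points[i].x - points[i - 1].x) * t,
                y: points[i - 1].y + (points[i].y - points[i - 1].y) * t
            };
        }
        offset -= segLen;
    }
    return points[points.length - 1]; // past the end: clamp to the last point
}

// Example: an L-shaped path of total length 200; the halfway point (v = 0.5)
// lands exactly on the corner.
const path = [{x: 0, y: 0}, {x: 100, y: 0}, {x: 100, y: 100}];
const len = lineLength(path);                 // 200
const mid = lineOffsetPoint(path, len * 0.5); // {x: 100, y: 0}
```

In the move() function above, `len * v` plays the role of `offset` here, with v being the animation progress from 0 to 1.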

2. Camera simulation and real scene monitoring

As one of the important solutions in the system, video surveillance provides two styles: simulated and real-scene. The simulated style models traffic flow with simple vehicle models, rendered in a scene that shares a data model with the main view; in essence, its sci-fi style can highlight key monitoring information such as equipment maintenance and violation scenes. The real-scene style connects a real-time video stream to show the running state of the intersection, reproducing every dynamic detail of the intersection in its true original appearance.


2.1 Implementation principle of the virtual camera

For the virtual camera scheme, a small pop-up 3D scene is loaded via HT's renderHTML rendering element and shares a DataModel with the main intersection scene, so data changes and animations stay in sync. We then only need to obtain the real viewing angle of the clicked camera: through global event dispatch, the obtained view is passed to the camera pop-up scene to change the eye and center of the corresponding perspective, yielding the camera's virtual perspective within the main scene. To give the cameras recognizable activity, a cone-shaped monitoring area is drawn in front of each camera and attached to it to delimit the camera's monitoring range, achieving an intelligent-monitoring effect.

In the interaction, after a camera is clicked its conical area collapses into a straight line to represent the selected state. The previously and currently selected cameras are recorded, and a dispatched event drives the 2D drawing to show the camera pop-up window. While the pop-up is displayed, the continuously changing center position is computed and passed to the camera pop-up scene through the global dispatch event, keeping the camera scene's perspective synchronized with that of the camera in the main scene. To dismiss the pop-up, double-click the scene background: the camera cone is restored and an event is dispatched to hide the camera pop-up on the 2D drawing:

//Global event dispatcher
var G = {}
window.G = G;
G.event = new ht.Notifier();

handleInteractive(e) {
    const {kind, data} = e;
    if(kind === 'clickData') {
        //Judge whether the node is labeled. If there is no label, return
        let tag = data.getTag();
        if(!tag) return;
        //Judge the label name as camera
        if(tag.indexOf('camera') >= 0) {
            //Set to specify the last click camera and the current click camera
            this.lastClickCamera = this.nowClickCamera;
            this.nowClickCamera = data;
            //Initialize the size of the camera cone if there was a previous click on the camera
            if (this.lastClickCamera !== null) {
                let clickRangeNode = this.lastClickCamera.getChildren()._as[0];
                clickRangeNode.s3(300, 150, 500);
            }
            //If there is a click camera, the size of the camera cone is set
            if (this.nowClickCamera !== null) {
                let clickRangeNode = this.nowClickCamera.getChildren()._as[0];
                clickRangeNode.s3(5, 5, 500);
            }
            //Get the position information of the clicked camera
            var cameraP3 = this.nowClickCamera.p3();
            //Get the rotation information of the clicked camera
            var cameraR3 = this.nowClickCamera.r3();
            //Get the size information of the clicked camera
            var cameraS3 = this.nowClickCamera.s3();
            //Current cone start position
            var realP3 = [cameraP3[0], cameraP3[1] + cameraS3[1] / 2, cameraP3[2] + cameraS3[2] / 2]; 
            //Rotate the current eye position around the starting position of the camera to get the correct eye position
            var realEye = getCenter(cameraP3, realP3, cameraR3); 
            //Global events are distributed to eye and center of the camera scene
            G.event.fire({
                type: 'videoCreated',
                eye: realEye,
                center: getCenter(realEye, [realEye[0], realEye[1] ,realEye[2] + 5], cameraR3)
            });

            //Video window display and distribution
            event.fire(SHOW_VIDEO, {g3dDm: this.g3dDm, cameraName:tag});
        }
    }
    //Double click the background to hide the camera scene window and initialize the size of the camera cone
    if(kind === 'doubleClickBackground') {
        //Hidden distribution of video pop-up window
        event.fire(HIDE_VIDEO);
        //Initialize the size of the camera cone if there was a previous click on the camera
        if (this.nowClickCamera !== null) {
            let clickRangeNode = this.nowClickCamera.getChildren()._as[0];
            clickRangeNode.s3(300, 150, 500)
        }
        //Set the current click camera to be empty
        this.nowClickCamera = null;
    }
}

The getCenter() method mentioned above computes, for each camera node in the scene, where a point ends up after rotating around another point. Simplified, point B rotates around pivot point A (here, the center rotating around the eye position), and we need the resulting position in the 3D scene. A getCenter method is encapsulated for this, built on the classes provided by HT's ht.Math.


The implementation code is as follows:

//pointA is the pivot point to rotate around
//pointB is the point to be rotated
//r3 is the rotation angle array [xAngle, yAngle, zAngle]: rotation around the X, Y, Z axes respectively
const getCenter = function(pointA, pointB, r3) {
    const mtrx = new ht.Math.Matrix4();
    const euler = new ht.Math.Euler();
    const v1 = new ht.Math.Vector3();
    const v2 = new ht.Math.Vector3();

    mtrx.makeRotationFromEuler(euler.set(r3[0], r3[1], r3[2]));

    v1.fromArray(pointB).sub(v2.fromArray(pointA));
    v2.copy(v1).applyMatrix4(mtrx);
    v2.sub(v1);

    return [pointB[0] + v2.x, pointB[1] + v2.y, pointB[2] + v2.z];
};
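As a sanity check on the geometry, here is a plain-math equivalent for the single-axis case: rotating pointB around pivot pointA about the Y axis, using the common right-handed convention. This is only an illustrative sketch; whether HT's Euler order matches exactly should be confirmed against the ht.Math documentation.

```javascript
// Hypothetical plain-math check: rotate pointB around pivot pointA by `angle`
// about the Y axis (right-handed). Not HT code; for verification only.
function rotateAroundY(pointA, pointB, angle) {
    const cos = Math.cos(angle), sin = Math.sin(angle);
    const dx = pointB[0] - pointA[0];
    const dz = pointB[2] - pointA[2];
    return [
        pointA[0] + dx * cos + dz * sin,
        pointB[1],                        // Y is unchanged by a Y-axis rotation
        pointA[2] - dx * sin + dz * cos
    ];
}

// Rotating [1, 0, 0] around the origin by 90° about Y gives ≈ [0, 0, -1].
const p = rotateAroundY([0, 0, 0], [1, 0, 0], Math.PI / 2);
```

The full getCenter() above does the same thing for all three axes at once via the rotation matrix built from the Euler angles.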

2.2 Implementation principle of the live camera

For the real-scene implementation, we connect a real-time video stream. The main common streaming media transmission protocols are RTMP, RTSP, HLS and HTTP-FLV:

  • RTMP (Real-Time Messaging Protocol): in RTMP, video must be H.264-encoded and audio must be AAC- or MP3-encoded, and the stream is mostly in FLV format. Because RTMP transfers FLV-format streams, a Flash player is generally required for playback.
  • RTSP (Real-Time Streaming Protocol): RTSP has very good real-time performance and is suitable for video chat, video monitoring and similar uses.
  • HLS (HTTP Live Streaming): an HTTP-based real-time streaming protocol defined by Apple. Transmission has two parts: an M3U8 description file and TS media files. Video in the TS files must be H.264-encoded and audio must be AAC- or MP3-encoded; data is transmitted over HTTP. The video.js library supports playback of this format.
  • HTTP-FLV: HTTP + FLV — audio and video data are encapsulated in FLV format and transmitted to the client over HTTP, which greatly facilitates playing live streams in the browser. The flv.js library supports playback of this format.

A simple RTMP video-stream connection illustrates the principle. Loading the video requires the video.js plugin, so the plugin is included first and the video stream is then connected; the renderHTML rendering element, again driven by global event dispatch, renders the video stream onto the scene drawing. The following pseudo code shows the idea:

//Include the video.js plugin
<script></script>

//Global events are dispatched to render element renderhtml to render the video to the scene drawing
G.event.add(function(e){
    if(e.type==='videoCreated'){
        var div=e.div;
        div.innerHTML='<video id="video" class="video-js vjs-default-skin"><source type="rtmp/flv"></video>';
        window.player = videojs('video');
    }
});

3. Real time data and maintenance of intersection monitoring information

Key intersection data can be displayed through interface integration. By monitoring real-time data changes, intersection information can be fed back immediately, including accident statistics, traffic-flow analysis, equipment maintenance status and vehicle violations. These data need a display carrier, realized here with HT's 2D-configuration vector drawings. Vector drawings suit many situations: they do not distort when enlarged and do not blur on screens of different resolutions, making the whole system applicable to different screens, including large-screen monitoring systems. For information display, compared with a page of plain data, the system uses custom animated interactions to give the whole page an immersive feel, greatly improving the overall user experience.


Of course, HT offers a variety of ways to present data as charts and tables through its 2D configuration; if you are interested, you can learn how to use them via the HT home page.

For data interface access, several mainstream methods can be used:

  • Ajax: uses JavaScript (the XMLHttpRequest core object) to make requests to the server and process responses without blocking the user;
  • axios: a Promise-based HTTP library usable in browsers and in Node.js;
  • WebSocket: a full-duplex communication protocol over a single TCP connection, provided by HTML5.

Ajax and axios obtain interface data in near real time by polling the interface, while WebSocket transmits data in both directions; choose whichever matches your requirements. In this system, real-time data is obtained through axios interface calls.
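To illustrate the polling approach, here is a minimal self-contained polling helper; the names (startPolling, fetcher, onData) are hypothetical and not part of HT or axios. With axios, fetcher would be something like () => axios.get('/traffic').then(r => r.data).

```javascript
// Minimal polling sketch (illustrative, not a library API): call an async
// fetcher every `interval` ms and hand each result to `onData`.
function startPolling(fetcher, onData, interval = 3000) {
    let stopped = false;
    async function tick() {
        if (stopped) return;
        try {
            onData(await fetcher());
        } catch (err) {
            console.error('poll failed:', err); // keep polling despite errors
        }
        if (!stopped) setTimeout(tick, interval);
    }
    tick();                             // first request fires immediately
    return () => { stopped = true; };   // returns a stop function
}
```

A WebSocket-based design would instead push updates from the server, avoiding the polling interval entirely at the cost of maintaining a persistent connection.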

The bar and line charts in the example are implemented through HT's mechanisms, with some custom configuration of charts in ECharts. The data is then loaded by polling the axios interface to show real-time traffic monitoring information:

loadData() {
    //Get the data model of the drawing
    let dm = this.g2d.dm();
    //Get data of traffic flow interface
    axios.get('/traffic').then(res => {
        //Access data of daily traffic flow line chart
        this.lineChart1.a({
            'seriesData1': res.lineChartData1,
            'axisData' : res.axisData
        });
        //Access to the data of broken line chart of vehicle operation peak
        this.lineChart2.a({
            'seriesData1': res.lineChartData2,
            'axisData' : res.axisData
        })
        //Load some data content by digital jumping
        setBindingDatasWithAnim(dm, res, 800, v => Math.round(v));
        //Access to peak operation time
        this.peakTime.s('text', res.peakTime);
    });
    //Load data of equipment running status
    axios.get('/equipmentStatus').then(res => {
        setBindingDatasWithAnim(dm, res, 800, v => Math.round(v));
    });
    //Loading accident statistics data
    axios.get('/accident').then(res => {
        setBindingDatasWithAnim(dm, res, 800, v => Math.round(v));
        //Access to the data of monthly accident histogram
        this.accidentBar.a({
            axisData: res.axisData,
            seriesData1: res.seriesData1
        })
    }); 
}
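setBindingDatasWithAnim() is an HT-side helper; conceptually, the "digital jumping" effect interpolates each bound value from its previous value to the new one over the animation duration, applying a formatter such as Math.round at every frame. Below is a self-contained sketch of that interpolation, with the animation clock replaced by fixed steps for clarity (animateValue is an illustrative name, not an HT API):

```javascript
// Illustrative "digital jumping" sketch: interpolate from `from` to `to`
// in `steps` increments, formatting each intermediate value before display.
// HT's real helper is driven by its animation clock, not a fixed step count.
function animateValue(from, to, steps, format, onFrame) {
    for (let i = 1; i <= steps; i++) {
        const v = i / steps;                      // progress 0..1
        onFrame(format(from + (to - from) * v));  // e.g. Math.round
    }
}

const frames = [];
animateValue(0, 800, 4, Math.round, v => frames.push(v));
// frames is [200, 400, 600, 800]: the animation ends exactly on the target
```

The key property to preserve in any such animation is that the final frame equals the target value, so rounding errors never leave a stale number on screen.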

The table drawing is implemented by an encapsulated component, and the interactive animation mainly uses HT's animation function ht.Default.startAnim(): a new row scrolls in horizontally over a width of 100 while its opacity fades in, and the table shifts vertically by one row height (54) to make room for the new alarm entry.

addTableRow() {
    //Get table node
    let table = this.table;
    //Request interface data through Axios promise
    axios.get('getEvent').then(res => {
        //Get the data binding of the table's scrolling data source
        let tableData = table.a('dataSource');
        //unshift() adds the new entry to the beginning of the scrolling-data array
        tableData.unshift(res);
        //Initializes the vertical offset of the table
        table.a('ty', -54);
        //Turn on table scrolling animation
        ht.Default.startAnim({
            duration: 600,
            //Animation execution function action
            action: (v, t) => {
                table.a({
                    //After adding data, scroll horizontally by 100
                    'firstRowTx': 100 * (1 - v),
                    //Transparency gradient effect for the first row height
                    'firstRowOpacity': v,
                    //Height of longitudinal offset 54
                    'ty': (v - 1) * 54
                });
            }
        });
    });
}

Summary

The "city brain" is a hot topic, yet regulating traffic lights to relieve congestion is only the tip of the intelligent-transportation iceberg. Day to day, monitoring, analysis, research and judgment are what enable traffic managers to form strategies, and none of this is possible without an "intelligent traffic visualization decision-making platform" that makes traffic data visible and perceptible.

By loading real-time data, the intelligent traffic visualization system can effectively and immediately reflect the state of an intersection, connecting previously scattered systems into a complete intelligent transportation system. Under this system many roads are monitored in series, which is in fact a microcosm of a smart city. Based on its accumulated experience, HT has also completed a complete set of smart city solutions, combining many functional subsystems with city data and facility construction records for real-time data monitoring and display: HT smart city.


In 2019 we also updated hundreds of industrial Internet 2D/3D visualization cases, where you can find many novel examples and discover a different industrial Internet: https://mp.weixin.qq.com/s/ZbhB6LO2kBRPrRIfHlKGQA

At the same time, you can also view more cases and effects: https://www.hightopo.com/demos/index.html