Use Nginx to collect page data and write it to the corresponding Kafka topic

Time: 2021-02-23

0. Architecture introduction

Simulate a real-time stream from a production system, such as user operation logs, collect the data, and then process it. For now only the collection step is considered, implemented with HTML + jQuery + Nginx + ngx_kafka_module + Kafka, where ngx_kafka_module is an open-source component that bridges Nginx and Kafka.
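
The end-to-end flow, using the names configured in the rest of this post, looks like this:

#Data flow
#HTML + jQuery page  --POST /kafka/log-->  Nginx (ngx_kafka_module)  --produce-->  Kafka topic tp_individual  -->  console consumer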

1. Requirement description

1.1 Use HTML and jQuery to simulate user request logs

Each record includes the following fields:

User ID: user_id; access time: act_time; operation: action (one of click, job_collect, cv_send, cv_upload)

Enterprise code: job_code

1.2 Use Nginx to accept the requests from 1.1

1.3 After accepting a request, use ngx_kafka_module to send the data to the Kafka topic tp_individual

1.4 Use a Kafka consumer to consume the topic and observe the data

2. Setup steps

2.1 Kafka installation

Since an off-the-shelf docker-kafka image is already installed, it can simply be started.
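
For example (the container names below are an assumption; use whatever your docker-kafka containers are called):

#Start the existing ZooKeeper and Kafka containers
$ docker start zookeeper kafka
#Confirm they are running
$ docker ps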

2.2 Install librdkafka and Nginx (with ngx_kafka_module), then start Nginx

$ cd /usr/local/src
$ git clone git@github.com:edenhill/librdkafka.git
#Go to librdkafka and compile
$ cd librdkafka
$ yum install -y gcc gcc-c++ pcre-devel zlib-devel
$ ./configure
$ make && make install
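#If nginx later fails to start because librdkafka.so.1 cannot be found,
#register the library path with the dynamic linker
#(this assumes the default /usr/local install prefix used by ./configure above)
$ echo "/usr/local/lib" >> /etc/ld.so.conf.d/librdkafka.conf
$ ldconfig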

$ yum -y install make zlib-devel gcc-c++ libtool openssl openssl-devel
$ cd /opt/hoult/software
#1. Download
$ wget http://nginx.org/download/nginx-1.18.0.tar.gz
#2. Decompress
$ tar -zxf nginx-1.18.0.tar.gz -C /opt/hoult/servers
#3. Download the module source code
$ cd /opt/hoult/software
$ git clone git@github.com:brg-liuwei/ngx_kafka_module.git
#4. Compilation
$ cd /opt/hoult/servers/nginx-1.18.0
$ ./configure --add-module=/opt/hoult/software/ngx_kafka_module/
$ make && make install 
#5. Delete the nginx installation package
$ rm /opt/hoult/software/nginx-1.18.0.tar.gz
#6. Start nginx
$ cd /opt/hoult/servers/nginx-1.18.0
$ nginx
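
To verify the module was actually compiled in, the configure arguments can be inspected (this assumes the nginx binary is on the PATH, as in the start command above):

#The output should list --add-module=.../ngx_kafka_module
$ nginx -V 2>&1 | grep ngx_kafka_module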

3. Related configuration

3.1 Nginx configuration (nginx.conf)

#pid        logs/nginx.pid;


events {
    worker_connections  1024;
}


http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';

    #access_log  logs/access.log  main;

    sendfile        on;
    #tcp_nopush     on;

    #keepalive_timeout  0;
    keepalive_timeout  65;

    #gzip  on;

    #Enable ngx_kafka_module and point it at the Kafka broker
    kafka;
    kafka_broker_list linux121:9092;

    server {
        listen       9090;
        server_name  localhost;

        #charset koi8-r;

        #access_log  logs/host.access.log  main;

        #------------Kafka related configuration starts------------
        location = /kafka/log {
                #Cross domain configuration
                add_header 'Access-Control-Allow-Origin' $http_origin;
                add_header 'Access-Control-Allow-Credentials' 'true';
                add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';

                kafka_topic tp_individual;
        }

        #error_page  404              /404.html;
    }

}
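
After editing nginx.conf, the configuration can be checked and applied without a full restart (again assuming nginx is on the PATH):

#Validate the configuration syntax
$ nginx -t
#Reload nginx so the /kafka/log location takes effect
$ nginx -s reload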

3.2 Start a Kafka producer and consumer

#Create the topic
kafka-topics.sh --zookeeper linux121:2181/myKafka --create --topic tp_individual --partitions 1 --replication-factor 1
#Start a console consumer
kafka-console-consumer.sh --bootstrap-server linux121:9092 --topic tp_individual --from-beginning
#Start a console producer (optional, for testing)
kafka-console-producer.sh --broker-list linux121:9092 --topic tp_individual
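
Before wiring up the page, the whole pipeline can be smoke-tested with curl; the record posted to the Nginx location should show up in the console consumer started above (host and port follow the configuration in 3.1):

#Post a test record through nginx into the tp_individual topic
$ curl -X POST -d '{"user_id":"u_test","act_time":"2021-02-23 10:00:00","action":"click","job_code":"donald"}' http://linux121:9090/kafka/log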

3.3 Write the HTML + jQuery code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>index</title>
    <!-- any jQuery build works; adjust the src to your environment -->
    <script src="https://code.jquery.com/jquery-3.5.1.min.js"></script>
</head>
<body>
    <!-- one button per action from section 1.1 -->
    <button onclick="operate('click')">click</button>
    <button onclick="operate('job_collect')">job_collect</button>
    <button onclick="operate('cv_send')">cv_send</button>
    <button onclick="operate('cv_upload')">cv_upload</button>

    <script>
        function operate(action) {

            var json = {'user_id': 'u_donald', 'act_time': current().toString(), 'action': action, 'job_code': 'donald'};

            $.ajax({
                url:"http://linux121:8437/kafka/log",
                type:"POST" ,
                crossDomain: true,
                data: JSON.stringify(json),
                //The following setting allows cross-domain cookies to be sent
                xhrFields: {
                    withCredentials: true
                },
                success:function (data, status, xhr) {

                    // console.log("operation successful: " + action);
                },
                error:function (err) {

                    // console.log(err.responseText);
                }
            });
        };

        function current() {
            var d   = new Date(),
                str = '';
            str += d.getFullYear() + '-';
            str += d.getMonth() + 1 + '-';
            str += d.getDate() + ' ';
            str += d.getHours() + ':';
            str += d.getMinutes() + ':';
            str += d.getSeconds();
            return str;
        }
    </script>
</body>
</html>
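
Open the page in a browser, click one of the buttons, and the console consumer started in 3.2 should print a record similar to the following (timestamp and action will vary):

{"user_id":"u_donald","act_time":"2021-2-23 10:30:5","action":"click","job_code":"donald"}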