Docker-based log analysis platform (4): platform integration

Time: 2021-07-12

In the previous article we basically finished installing the ELK and Kafka environment, and we also went through a few simple examples. Now let's add Kafka as a buffer to the architecture we have built. As before, logstash first reads the logs from the log source and writes them to Kafka; then logstash reads the logs from Kafka and stores them in elasticsearch. So there are two steps.

Logstash -> Kafka

Previously, logstash sent the logs directly to elasticsearch and kibana displayed them. Because logstash transfers logs to elasticsearch synchronously, data may be lost once elasticsearch fails. Therefore, we introduce Kafka as a buffer so that logstash is not affected by elasticsearch. The first step is to let logstash send logs to Kafka; here logstash acts as the producer. Let's take a look at the logstash configuration file:

input {
    file {
        path => ["/var/log/laravel/storage/logs/*.log"]
    }
}
filter {
    grok {
        match => {
            "message" => "\[%{TIMESTAMP_ISO8601:logtime}\] %{WORD:env}\.%{LOGLEVEL:level}\: %{GREEDYDATA:msg}"
        }
    }
}
output {
    kafka {
        bootstrap_servers => "kafka:9092"
        topic_id => "laravellog"
    }
}
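
One detail worth noting: the kafka output's default codec is plain (depending on the plugin version), so only a flat text rendering of the event reaches Kafka and the fields extracted by grok are lost. If you want to keep the structured fields, a small adjustment like the following should work. This is only a sketch, reusing the broker address and topic from the configuration above:

output {
    kafka {
        bootstrap_servers => "kafka:9092"
        topic_id => "laravellog"
        # serialize the whole event as JSON so the grok fields survive the trip
        codec => json
    }
}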

Here we read the log files of a laravel project. Between the input and output we add a filter, a logstash plug-in used to format the data that has been read. In general, a laravel log entry looks like this:

[2017-12-05 17:45:07] production.ERROR: error reporting interface {"api":"/admin/sales"}

It is divided into several parts: the timestamp, the environment in which the log was generated, the log level, the log message, and additional data. So we format it, and in the end it is stored in elasticsearch as JSON. By default, without a filter, the whole line would be stored as a single string. The formatted data looks like this (in part):

{
    "msg": "interface parameters {\"params\":[]}",
    "path": "/var/log/fenyong/storage/logs/laravel-2017-12-05.log",
    "level": "ERROR",
    "env": "local",
    "logtime": "2017-12-05 17:54:50"
}
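
Before wiring up the consumer side, it can be handy to confirm that messages are actually arriving in Kafka. A minimal, throwaway logstash pipeline like the one below simply prints whatever is in the topic to the console. This is only a sketch, assuming the broker and topic names used in this article:

input {
    kafka {
        bootstrap_servers => "kafka:9092"
        topics => ["laravellog"]
    }
}
output {
    # print every message to the console for a quick sanity check
    stdout {
        codec => rubydebug
    }
}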

Kafka -> Elasticsearch

Next we use logstash to read the data from Kafka and store it in elasticsearch. Here logstash acts as the consumer; the only thing to pay attention to is that the topic name matches the one used by the producer:

input {
    kafka {
        bootstrap_servers => "kafka:9092"
        topics => ["laravellog"]
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "laravellog"
        user => "elastic"
        password => "changeme"
    }
}
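
If the producer side writes JSON (for example with the codec => json tweak sketched earlier), the consumer side needs a matching codec; otherwise the whole JSON document ends up as a single string in the message field. A possible adjustment to the input above, again only a sketch (the group_id value shown here is an illustrative choice, not something required by this setup):

input {
    kafka {
        bootstrap_servers => "kafka:9092"
        topics => ["laravellog"]
        # parse the JSON written by the producer pipeline back into fields
        codec => json
        # optional: a dedicated consumer group for this pipeline
        group_id => "logstash-es"
    }
}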

In this way we have completed the log flow from logstash to Kafka and on to elasticsearch, and we can now use kibana to display the data.

So far, we have successfully added Kafka to the architecture of the log analysis platform.