Learning Flink from 0 to 1 – Flink writes data to Kafka

Time:2021-9-19


Preface

The previous article, Learning Flink from 0 to 1 – Flink writes data to Elasticsearch, described how to store data from Kafka in Elasticsearch, using Flink's built-in Kafka source connector (FlinkKafkaConsumer). Saving to ES is just one case. If we need the data converted by Flink in multiple places, do we have to keep writing sink plug-ins? Fortunately not: Flink supports many sinks out of the box, such as the Kafka sink connector (FlinkKafkaProducer). In this article, we will talk about how to write data to Kafka.


Preparation

Add dependency

Flink supports Kafka 0.8, 0.9, 0.10, and 0.11. You can dig into the source code later to see how each version's connector is implemented.


You also need to have Kafka installed. Add the Flink Kafka connector dependency that matches your Kafka version; here we use version 0.11:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>

Kafka installation

I won't cover the installation here; see my earlier article Kafka installation and quick start.

Here, we demonstrate how to read topic data from another Kafka cluster and write it into our own local Kafka.

Configuration file

kafka.brokers=xxx:9092,xxx:9092,xxx:9092
kafka.group.id=metrics-group-test
kafka.zookeeper.connect=xxx:2181
metrics.topic=xxx
stream.parallelism=5
kafka.sink.brokers=localhost:9092
kafka.sink.topic=metric-test
stream.checkpoint.interval=1000
stream.checkpoint.enable=false
stream.sink.parallelism=5
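In the demo job these keys are read through Flink's ParameterTool, but the file itself is an ordinary Java properties file, so a minimal sketch of how such keys resolve to typed values can use only `java.util.Properties` (the inline string stands in for the real file, which is an assumption for illustration):

```java
import java.io.StringReader;
import java.util.Properties;

public class ConfigDemo {
    public static void main(String[] args) throws Exception {
        // In the real job these keys live in a .properties file on the classpath;
        // here the same content is loaded from a string for illustration.
        String content = String.join("\n",
                "kafka.sink.brokers=localhost:9092",
                "kafka.sink.topic=metric-test",
                "stream.sink.parallelism=5");
        Properties props = new Properties();
        props.load(new StringReader(content));

        // String-valued keys come back as-is; numeric keys are parsed by the caller,
        // which is what ParameterTool.getInt(...) does under the hood.
        System.out.println(props.getProperty("kafka.sink.topic"));                        // metric-test
        System.out.println(Integer.parseInt(props.getProperty("stream.sink.parallelism"))); // 5
    }
}
```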

First, let's check whether the metric-test topic already exists in the local Kafka by executing this command:

bin/kafka-topics.sh --list --zookeeper localhost:2181


We can see that the local Kafka does not have this topic yet. If the metric-test topic appears after the program runs, that proves the program works and has written data from the other Kafka cluster into the local Kafka.

Program code

Main.java

public class Main {
    public static void main(String[] args) throws Exception {
        // Load the configuration file into a ParameterTool and prepare the environment
        final ParameterTool parameterTool = ExecutionEnvUtil.createParameterTool(args);
        StreamExecutionEnvironment env = ExecutionEnvUtil.prepare(parameterTool);
        // Source: consume Metrics records from the remote Kafka cluster
        DataStreamSource<Metrics> data = KafkaConfigUtil.buildSource(env);

        // Sink: forward the records unchanged to the local Kafka cluster
        data.addSink(new FlinkKafkaProducer011<Metrics>(
                parameterTool.get("kafka.sink.brokers"),
                parameterTool.get("kafka.sink.topic"),
                new MetricSchema()
                )).name("flink-connectors-kafka")
                .setParallelism(parameterTool.getInt("stream.sink.parallelism"));

        env.execute("flink learning connectors kafka");
    }
}

Operation results

Start the program, check that it runs, and then execute the list command above again to see whether the new topic has appeared:


Execute the command to view the information of this topic:

bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic metric-test


analysis

In the code above, we passed only three parameters to FlinkKafkaProducer011: the broker list, the topic id, and a SerializationSchema (for serializing records).
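The MetricSchema used above is the project's own serializer. Its exact fields and output format are not shown in this article, so the sketch below is an assumption: a simplified stand-in for Flink's SerializationSchema interface (declared locally so the example compiles without Flink on the classpath), a hypothetical Metrics POJO, and a schema that turns each record into UTF-8 JSON bytes for the Kafka message value.

```java
import java.nio.charset.StandardCharsets;

public class SchemaDemo {
    // Simplified stand-in for org.apache.flink.api.common.serialization.SerializationSchema<T>,
    // declared here so this sketch compiles without Flink on the classpath.
    interface SerializationSchema<T> {
        byte[] serialize(T element);
    }

    // Hypothetical Metrics POJO; the real fields depend on the project.
    static class Metrics {
        final String name;
        final long timestamp;
        final double value;
        Metrics(String name, long timestamp, double value) {
            this.name = name;
            this.timestamp = timestamp;
            this.value = value;
        }
    }

    // A MetricSchema along the lines of the one in Main.java: serialize each
    // record into the byte[] that becomes the Kafka message value.
    static class MetricSchema implements SerializationSchema<Metrics> {
        @Override
        public byte[] serialize(Metrics m) {
            String json = String.format("{\"name\":\"%s\",\"timestamp\":%d,\"value\":%s}",
                    m.name, m.timestamp, m.value);
            return json.getBytes(StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) {
        byte[] bytes = new MetricSchema().serialize(new Metrics("cpu", 1546732800000L, 0.75));
        System.out.println(new String(bytes, StandardCharsets.UTF_8));
    }
}
```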


In fact, other constructors accept more parameters; in this demo, the remaining settings fall back to their defaults. There is a lot to cover there, so it may deserve a separate article later.
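One way to supply those extra settings is the FlinkKafkaProducer011 constructor variant that takes a Properties object instead of a plain broker string. The snippet below sketches such a Properties object with a few standard Kafka producer keys; the specific values are illustrative assumptions, not recommendations from the article.

```java
import java.util.Properties;

public class ProducerPropsDemo {
    public static void main(String[] args) {
        // Standard Kafka producer configuration keys; a Properties object like
        // this can be passed to a FlinkKafkaProducer011 constructor in place of
        // the bare broker-list string used in Main.java.
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // same role as kafka.sink.brokers
        props.setProperty("acks", "1");          // leader acknowledgment only
        props.setProperty("retries", "3");       // retry transient send failures
        props.setProperty("batch.size", "16384"); // bytes per batch
        props.setProperty("linger.ms", "5");     // small delay to allow batching

        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```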

summary

In this article, Flink reads data from one Kafka cluster and writes it to the local Kafka. We did no data transformation in the Flink layer, just forwarded the data as is. If you have other needs, you can perform all kinds of transformation operations in Flink, such as those described in Learning Flink from 0 to 1 – Flink data transformation, and then send the converted data to Kafka.

The original address of this article is: http://www.54tianzhisheng.cn/2019/01/06/Flink-Kafka-sink/. Reprinting without permission is prohibited.

Follow me

WeChat official account: zhisheng

I have also compiled some Flink learning materials and put them all in the official account. You can add me on WeChat (zhisheng_tian) and reply with the keyword Flink to get them, no strings attached.


GitHub repository

https://github.com/zhisheng17/flink-learning/

Going forward, all the code for this project will be kept in this repository, including demos and blog posts for learning Flink.

Related articles

1、Learning Flink from 0 to 1 – Introduction to Apache Flink

2、Learning Flink from 0 to 1 — an introduction to building Flink 1.6.0 environment and building and running simple programs on MAC

3、Learn Flink from 0 to 1 – detailed explanation of Flink profile

4、Learning Flink from 0 to 1 – Introduction to data source

5、Learn Flink from 0 to 1 – how to customize the data source?

6、Learning Flink from 0 to 1 – Introduction to data sink

7、Learn Flink from 0 to 1 – how to customize data sink?

8、Learning Flink from 0 to 1 – Flink data transformation

9、Learning Flink from 0 to 1 – introducing stream windows in Flink

10、Learning Flink from 0 to 1 — detailed explanations of several times in Flink

11、Learning Flink from 0 to 1 – Flink writes data to elasticsearch

12、Learning Flink from 0 to 1 – how does the Flink project run?

13、Learning Flink from 0 to 1 – Flink writes data to Kafka