MQTT Sink Connector for Confluent Platform¶
The Kafka Connect MQTT Sink connector connects to an MQTT broker and publishes data to an MQTT topic. SSL is supported. For information on how to create SSL keys and certificates, see Creating SSL Keys and Certificates. For the relevant configuration properties, see MQTT Sink Connector Configuration Properties.
Features¶
At least once delivery¶
This connector guarantees that records from the Kafka topic are delivered to the MQTT broker at least once.
Dead Letter Queue¶
This connector supports the Dead Letter Queue (DLQ) functionality. For information about accessing and using the DLQ, see Confluent Platform Dead Letter Queue.
Multiple tasks¶
The MQTT Sink connector supports running one or more tasks. You can
specify the number of tasks in the tasks.max
configuration parameter. This
can lead to huge performance gains when multiple files need to be parsed.
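For example, a minimal sketch of the relevant property. The value 3 is illustrative only; it should not exceed the number of partitions in the Kafka topics being consumed:
tasks.max=3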
Configuration Properties¶
For a complete list of configuration properties for this connector, see MQTT Sink Connector Configuration Properties.
Note
For an example of how to get Kafka Connect connected to Confluent Cloud, see Distributed Cluster.
Usage Notes¶
This connector publishes messages using the Apache Kafka® topic name as the MQTT topic. If you need to publish to a different MQTT topic, use a transformation such as RegexRouter to change the topic name before the record is sent to the MQTT broker, as shown in the sketch below.
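For example, the following is a minimal sketch of a RegexRouter transformation that rewrites the Kafka topic name before it is used as the MQTT topic. The transform alias RenameTopic and the regex and replacement values are placeholders to adapt to your own topic naming:
transforms=RenameTopic
transforms.RenameTopic.type=org.apache.kafka.connect.transforms.RegexRouter
transforms.RenameTopic.regex=(.*)
transforms.RenameTopic.replacement=mqtt/$1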
Examples¶
Property-based example¶
This configuration is typically used with standalone workers.
name=MqttSinkConnector1
connector.class=io.confluent.connect.mqtt.MqttSinkConnector
tasks.max=1
topics=< Required Configuration >
mqtt.server.uri=< Required Configuration >
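To run this example with a standalone worker, save the properties above to a file and pass it to the connect-standalone command (connect-standalone.sh in an Apache Kafka installation) together with your worker configuration. The file names below are placeholders:
connect-standalone worker.properties mqtt-sink.properties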
REST-based example¶
This configuration is typically used with distributed workers. Write the following JSON to connector.json, configure all of the required values, and use the commands below to post the configuration to one of the distributed Connect workers. See the Kafka Connect REST API documentation for more information.
Connect Distributed REST example:
{
  "name" : "MqttSinkConnector1",
  "config" : {
    "connector.class" : "io.confluent.connect.mqtt.MqttSinkConnector",
    "tasks.max" : "1",
    "topics" : "< Required Configuration >",
    "mqtt.server.uri" : "< Required Configuration >"
  }
}
Use curl to post the configuration to one of the Kafka Connect workers. Change http://localhost:8083/ to the endpoint of one of your Kafka Connect workers.
Create a new connector:
curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors
Update an existing connector. Note that the PUT endpoint expects only the contents of the config object, not the full JSON shown above:
curl -s -X PUT -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors/MqttSinkConnector1/config
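To confirm that the connector and its task are running, query the connector status. This uses the standard Connect REST API status endpoint and assumes the same worker address as above:
curl -s http://localhost:8083/connectors/MqttSinkConnector1/status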