
RabbitMQ Source Connector

The RabbitMQ Source connector is used to read data from a RabbitMQ queue or topic and persist the data in a Kafka topic.


The queue or topic that this connector will be reading from must exist prior to starting the connector.
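If the queue does not already exist, it can be declared ahead of time. One way to do this, assuming the RabbitMQ management plugin and its `rabbitmqadmin` CLI are installed (the queue name here is only an example):

```shell
# Declare a durable queue on the local RabbitMQ broker before starting the connector.
# Assumes rabbitmqadmin (part of the RabbitMQ management plugin) is on the PATH.
rabbitmqadmin declare queue name=test-queue durable=true
```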


All of the headers associated with each message from RabbitMQ are prefixed with rabbitmq. and copied over to each of the records that are produced to Kafka. Because of this, your configured message format must support headers.
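To confirm that the headers were copied over, you can inspect records on the Kafka topic. One option, assuming a console consumer recent enough to support the `print.headers` formatter property (the topic name here is only an example):

```shell
# Print each record's headers alongside its value.
# The rabbitmq.-prefixed header keys added by the connector will appear in the output.
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic test-topic \
  --from-beginning \
  --property print.headers=true
```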


This connector is based on the AMQP protocol, so it may also work with other servers that implement AMQP.


Property based example

This configuration is typically used with standalone workers.

name=RabbitMQSourceConnector1
connector.class=io.confluent.connect.rabbitmq.RabbitMQSourceConnector
tasks.max=1
kafka.topic=< Required Configuration >
rabbitmq.queue=< Required Configuration >

REST based example

This configuration is typically used with distributed workers. Write the following JSON to connector.json, configure all of the required values, and use the command below to post the configuration to one of the distributed connect worker(s). Check here for more information about the Kafka Connect REST API.

Connect Distributed REST example
{
  "name" : "RabbitMQSourceConnector1",
  "config" : {
    "connector.class" : "io.confluent.connect.rabbitmq.RabbitMQSourceConnector",
    "tasks.max" : "1",
    "kafka.topic" : "< Required Configuration >",
    "rabbitmq.queue" : "< Required Configuration >"
  }
}

Use curl to post the configuration to one of the Kafka Connect workers. Change http://localhost:8083/ to the endpoint of one of your Kafka Connect worker(s).

Create a new connector
curl -s -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors
Update an existing connector. Note that the PUT endpoint expects only the contents of the "config" object, not the full JSON wrapper shown above.
curl -s -X PUT -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors/RabbitMQSourceConnector1/config
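After posting the configuration, the connector's state can be checked through the same REST API. A quick way, assuming the worker endpoint used above and the connector name from the example configuration:

```shell
# Verify that the connector and its task report a RUNNING state.
curl -s http://localhost:8083/connectors/RabbitMQSourceConnector1/status
```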