Kafka Connect HTTP Sink Connector

The Kafka Connect HTTP Sink Connector integrates Kafka with an API via HTTP or HTTPS.

The connector consumes records from Kafka topic(s) and converts each record value to a String before sending it in the request body to the configured http.api.url, which optionally can reference the record key and/or topic name. The targeted API must support either a POST or PUT request.

The connector batches records up to the set batch.max.size before sending the batched request to the API. Each record is converted to its String representation and then separated with the batch.separator.
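For illustration, batching can be tuned in the connector configuration. This is a sketch only; `batch.max.size` and `batch.separator` are the properties named above, and the values shown are illustrative, not defaults:

```json
{
  "batch.max.size": "10",
  "batch.separator": ","
}
```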

The HTTP Sink Connector supports connecting to APIs using SSL along with Basic Authentication, OAuth2, or a Proxy Authentication Server.
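For example, a connector configured for Basic Authentication would carry credential properties along the following lines. This is a hedged sketch: the property names `auth.type`, `connection.user`, and `connection.password`, the URL, and the credential values are assumptions for illustration and are not shown elsewhere in this document:

```json
{
  "http.api.url": "https://localhost:8443/api/messages",
  "auth.type": "BASIC",
  "connection.user": "admin",
  "connection.password": "password"
}
```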

Install HTTP Connector

You can install this connector by using the Confluent Hub client (recommended) or you can manually download the ZIP file.

Install Connector Using Confluent Hub

Confluent Hub Client must be installed. This is installed by default with Confluent Platform commercial features.

Navigate to your Confluent Platform installation directory and run this command to install the latest version of the connector. The connector must be installed on every machine where Connect will be run.


confluent-hub install confluentinc/kafka-connect-http:latest

You can install a specific version by replacing latest with a version number. For example:

confluent-hub install confluentinc/kafka-connect-http:1.0.0

Install Connector Manually

Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.
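After extracting the ZIP, the Connect worker must be able to find the connector. Typically this means placing the extracted directory under a path listed in the worker's `plugin.path` property; the path below is an illustrative assumption, not a required location:

```
# Connect worker properties (e.g. etc/kafka/connect-distributed.properties)
plugin.path=/usr/local/share/kafka/plugins
```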

Kafka Connect HTTP Connector Quick Start

This quick start uses the HTTP Sink Connector to consume records and send HTTP requests to a demo HTTP service running locally without any authentication.

Additional examples can be found in the Feature Descriptions and Examples section below.

  1. Before starting the connector, clone and run the kafka-connect-http-demo app on your machine.

    git clone https://github.com/confluentinc/kafka-connect-http-demo.git
    cd kafka-connect-http-demo
    mvn spring-boot:run -Dspring.profiles.active=simple-auth
  2. Install the connector through the Confluent Hub Client.

    # run from your CP installation directory
    confluent-hub install confluentinc/kafka-connect-http:latest
  3. Start Confluent Platform using the confluent local commands.

    confluent local start
  4. Produce test data to the http-messages topic in Kafka using the Confluent CLI confluent local produce command.

    seq 10 | confluent local produce http-messages
  5. Create a http-sink.json file with the following contents:

      "name": "HttpSink",
      "config": {
        "topics": "http-messages",
        "tasks.max": "1",
        "connector.class": "io.confluent.connect.http.HttpSinkConnector",
        "http.api.url": "http://localhost:8080/api/messages",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter",
        "confluent.topic.bootstrap.servers": "localhost:9092",
        "confluent.topic.replication.factor": "1"
  6. Load the HTTP Sink Connector.


    You must include a double dash (--) between the connector name and your flag.

    confluent local load httpsink -- -d http-sink.json


    Don't use the confluent local commands in production environments.

  7. Confirm that the connector is in a RUNNING state.

    confluent local status httpsink
  8. Confirm that the data was sent to the HTTP endpoint.

    curl localhost:8080/api/messages


Before running other examples, kill the demo app (CTRL + C) to avoid port conflicts.


This connector is available under a Confluent enterprise license. You can use this connector for a 30-day trial period without a license key. If you are a subscriber, please contact Confluent Support at support@confluent.io for more information.

For more information, see License Topic Configuration.