Kafka Connectors
Use connectors to copy data between Apache Kafka® and other systems that you want to pull data from or push data to. You can download connectors from Confluent Hub.
To learn more about Kafka Connect, see the free Kafka Connect 101 course.
Note
For managed connectors available on Confluent Cloud, see Connect External Systems to Confluent Cloud.
JDBC Source and Sink
The Kafka Connect JDBC Source connector imports data from any relational database with a JDBC driver into a Kafka topic. The Kafka Connect JDBC Sink connector exports data from Kafka topics to any relational database with a JDBC driver.
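As a minimal sketch, a source instance might be configured along the following lines in standalone `.properties` form. The connection URL, credentials, table name, and topic prefix are placeholders, and exact property names can vary by connector version.

```properties
# Illustrative JDBC Source configuration (placeholder values throughout).
name=jdbc-source-example
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
# Placeholder database connection details.
connection.url=jdbc:postgresql://db.example.com:5432/inventory
connection.user=connect
connection.password=secret
# Poll new rows using an auto-incrementing id column.
mode=incrementing
incrementing.column.name=id
table.whitelist=orders
# Rows from table "orders" land in topic "jdbc-orders".
topic.prefix=jdbc-
```

A sink configuration is similar but uses the `JdbcSinkConnector` class and a `topics` list naming the Kafka topics to export.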
JMS Source
The Kafka Connect JMS Source connector is used to move messages from any JMS-compliant broker into Kafka.
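A standalone configuration might look roughly like the sketch below. The JNDI factory, broker URL, destination, and target topic are placeholders for an ActiveMQ-style setup, and the required properties differ by JMS provider and connector version, so treat the property names as illustrative.

```properties
# Illustrative JMS Source configuration (placeholder broker and destination).
name=jms-source-example
connector.class=io.confluent.connect.jms.JmsSourceConnector
tasks.max=1
# JNDI settings for the JMS provider (ActiveMQ shown as an example).
java.naming.factory.initial=org.apache.activemq.jndi.ActiveMQInitialContextFactory
java.naming.provider.url=tcp://broker.example.com:61616
connection.factory.name=ConnectionFactory
# Read from this queue and write to the Kafka topic below.
jms.destination.name=orders.queue
jms.destination.type=queue
kafka.topic=orders
```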
Elasticsearch Service Sink
The Kafka Connect Elasticsearch Service Sink connector moves data from Kafka to Elasticsearch. It writes data from a topic in Kafka to an index in Elasticsearch.
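A minimal sink configuration might resemble the following sketch; the Elasticsearch URL and topic are placeholders, and the `key.ignore`/`schema.ignore` options are shown only as common settings for schemaless JSON data.

```properties
# Illustrative Elasticsearch Sink configuration (placeholder URL and topic).
name=elasticsearch-sink-example
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
# Kafka topic to index; by default the index is named after the topic.
topics=orders
connection.url=http://elasticsearch.example.com:9200
# Common options when records have no keys or schemas.
key.ignore=true
schema.ignore=true
```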
Amazon S3 Sink
The Kafka Connect Amazon S3 Sink connector exports data from Kafka topics to S3 objects in Avro, JSON, or Bytes format.
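A sketch of a sink configuration follows, assuming placeholder bucket, region, and topic names; the format class here selects JSON output.

```properties
# Illustrative S3 Sink configuration (placeholder bucket, region, topic).
name=s3-sink-example
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=1
topics=orders
s3.bucket.name=my-connect-bucket
s3.region=us-west-2
storage.class=io.confluent.connect.s3.storage.S3Storage
# Write records as JSON; Avro and raw bytes formats are also available.
format.class=io.confluent.connect.s3.format.json.JsonFormat
# Number of records to buffer before writing an S3 object.
flush.size=1000
```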
HDFS 2 Sink
The Kafka Connect HDFS 2 Sink connector allows you to export data from Apache Kafka® topics to HDFS 2.x files in a variety of formats. The connector integrates with Hive to make data immediately available for querying with HiveQL.
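The following is a rough sketch with Hive integration enabled, using placeholder HDFS and metastore URLs; property names and required settings may differ across connector versions.

```properties
# Illustrative HDFS 2 Sink configuration (placeholder cluster addresses).
name=hdfs2-sink-example
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=orders
hdfs.url=hdfs://namenode.example.com:8020
# Number of records to buffer before writing a file to HDFS.
flush.size=1000
# Register written files in the Hive metastore so they are queryable.
hive.integration=true
hive.metastore.uris=thrift://hive-metastore.example.com:9083
hive.database=kafka_data
schema.compatibility=BACKWARD
```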
Replicator
Replicator allows you to easily and reliably replicate topics from one Kafka cluster to another.
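A rough example of a Replicator configuration is sketched below, with placeholder source and destination cluster addresses and a rename pattern for the replicated topics.

```properties
# Illustrative Replicator configuration (placeholder cluster addresses).
name=replicator-example
connector.class=io.confluent.connect.replicator.ReplicatorSourceConnector
tasks.max=2
# Source cluster to read from and destination cluster to write to.
src.kafka.bootstrap.servers=source-cluster.example.com:9092
dest.kafka.bootstrap.servers=dest-cluster.example.com:9092
# Topics to replicate and how to name them on the destination.
topic.whitelist=orders
topic.rename.format=${topic}.replica
# Copy record bytes unchanged.
key.converter=io.confluent.connect.replicator.util.ByteArrayConverter
value.converter=io.confluent.connect.replicator.util.ByteArrayConverter
```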