Kafka Connect FileStream Connectors

The Kafka Connect FileStream connector examples demonstrate how a simple connector runs, for users getting started with Apache Kafka®.


  • Starting with version 6.2.1, the FileStream Sink and Source connector artifacts have been moved out of Kafka Connect. To run the FileStream connectors, you must add their new location to the plugin.path configuration property.

  • Confluent does not recommend the FileStream Connector for production use. If you want a production connector to read from files, use a Spool Dir connector.
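The plugin.path change mentioned above might look like the following worker configuration fragment. This is a sketch only: the /usr/share/filestream-connectors path is an assumption, and the actual location depends on your installation layout.

```
# Comma-separated list of directories the worker scans for connector plugins.
# The second entry (assumed path) is where the FileStream artifacts were moved.
plugin.path=/usr/share/java,/usr/share/filestream-connectors
```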

The following examples include both a file source and a file sink to demonstrate end-to-end data flow through Kafka Connect in a local environment. You can find additional information about this connector in the developer guide, where it serves as a demonstration of how a custom connector can be implemented.

Quick start

For a simple demo of how to use the FileStream connectors, refer to the Kafka Connect quick start guide.

FileSource connector

The FileSource connector reads data from a file and sends it to Apache Kafka®. Beyond the configurations common to all connectors, it takes only an input file and an output topic as properties. Here is an example configuration:
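A minimal standalone-mode properties sketch is shown below; the file and topic names are assumptions chosen for illustration.

```
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
# Path of the file to read from (assumed name)
file=test.txt
# Kafka topic to send the file's lines to (assumed name)
topic=connect-test
```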


This connector reads a single file and sends the data within that file to Kafka. It then watches the file for appended updates only; it does not reprocess modifications to lines it has already sent to Kafka.
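The append-only behavior can be seen with a quick sketch, assuming the source connector is running against a file named test.txt (an assumed name):

```shell
# Create the input file; the connector sends its lines to Kafka.
echo "first line" > test.txt

# Appending a new line is picked up and sent to the topic.
echo "second line" >> test.txt

# Editing a line already sent (e.g. with sed -i) is NOT reprocessed;
# the connector watches only for appended data.
```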

FileSink connector

The FileSink connector reads data from Kafka and outputs it to a local file. Multiple topics may be specified, as with any other sink connector. The FileSink connector supports only the file property and the configurations common to all connectors. Here is an example key-value mapping:
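A minimal sketch of the sink configuration as key-value pairs; the file and topic names are assumptions chosen for illustration.

```
name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
# Local file to write consumed records to (assumed name)
file=test.sink.txt
# Comma-separated list of topics to consume from (assumed name)
topics=connect-test
```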


As messages are added to the topics specified in the configuration, they are written to the local file named by the file property.