Kafka Connect FileStream Connectors¶
The Kafka Connect FileStream Connector examples are intended to show how a simple connector runs for users getting started with Apache Kafka®.
Confluent does not recommend the FileStream Connector for production use. If you want a production connector to read from and write to files, use a Spool Dir connector.
The following examples include both a file source and a file sink to demonstrate end-to-end data flow through Kafka Connect in a local environment. The developer guide uses this connector as a demonstration of how a custom connector can be implemented.
A simple demo of how to use the FileStreamSourceConnector and FileStreamSinkConnector is available in the Kafka Connect quick start guide.
The FileSource Connector reads data from a file and sends it to Apache Kafka®. Beyond the configurations common to all connectors, it takes only an input file and an output topic as properties.
Here is an example configuration:
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/test.txt
topic=connect-test
This connector reads only one file and sends the data in that file to Kafka. It then watches the file for appended updates only. Modifications to lines already sent to Kafka are not reprocessed.
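The append-only behavior described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the connector's actual Java implementation: the idea is that the source tracks a byte offset into the file and each poll reads only data appended past that offset, which is why lines already sent are never re-read. The path `/tmp/demo.txt` is a hypothetical example file.

```python
def tail_new_lines(path, offset):
    """Read lines appended after byte `offset`; return (lines, new_offset)."""
    with open(path, "rb") as f:
        f.seek(offset)          # skip everything already sent to Kafka
        data = f.read()
    return data.decode().splitlines(), offset + len(data)

# Usage: simulate two polls of the same file.
with open("/tmp/demo.txt", "w") as f:
    f.write("first\n")

lines, pos = tail_new_lines("/tmp/demo.txt", 0)    # -> ["first"]

with open("/tmp/demo.txt", "a") as f:
    f.write("second\n")

more, pos = tail_new_lines("/tmp/demo.txt", pos)   # -> ["second"] only
```

Because only the offset advances, an in-place edit to a line before `pos` would simply go unnoticed, matching the behavior noted above.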
The FileSink Connector reads data from Kafka and writes it to a local file. Multiple topics may be specified, as with any other sink connector. The FileSink Connector takes only a file property in addition to the configurations common to all connectors. Here is an example configuration:
name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=/tmp/test.sink.txt
topics=connect-test
As messages are added to the topics specified in the configuration, they are written to the local file specified in the configuration.
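The sink side can be sketched the same way. This is an illustrative Python stand-in for the connector's Java implementation, assuming one record value per output line; the list of strings stands in for records consumed from the `connect-test` topic, and `/tmp/test.sink.txt` matches the example configuration above.

```python
def write_records(records, path):
    """Append each consumed record value to `path`, one line per record."""
    with open(path, "a") as f:   # append, so restarts don't clobber earlier output
        for value in records:
            f.write(value + "\n")

# Usage: values that would have been consumed from the `connect-test` topic.
write_records(["hello", "world"], "/tmp/test.sink.txt")
# /tmp/test.sink.txt now ends with the lines "hello" and "world"
```

Opening the file in append mode mirrors the sink's behavior of continuously adding new records as they arrive, rather than rewriting the file on each batch.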