.. _schemaregistry_kafka_connect:

Kafka Connect
=============

Kafka Connect and Schema Registry integrate to capture schema information from
connectors. :ref:`Kafka Connect converters` provide a mechanism for converting data
from the internal data types used by Kafka Connect to data types represented as Avro.
The AvroConverter automatically registers schemas generated by source connectors.
Sink connectors receive schema information in addition to the data for each message.
This allows sink connectors to know the structure of the data and provide additional
capabilities such as maintaining a database table structure or creating a search
index. The AvroConverter converts existing Avro data to the internal data types used
by Kafka Connect.

Example Configuration
---------------------

Configuring Kafka Connect to use Schema Registry requires changing the
``key.converter`` or ``value.converter`` properties in the
:ref:`Connect worker configuration`. The ``key.converter`` and ``value.converter``
properties can be configured independently of each other.

.. sourcecode:: properties

    key.converter=io.confluent.connect.avro.AvroConverter
    key.converter.schema.registry.url=http://localhost:8081
    value.converter=io.confluent.connect.avro.AvroConverter
    value.converter.schema.registry.url=http://localhost:8081

Configuration Options
---------------------

``schema.registry.url``
  Comma-separated list of URLs for Schema Registry instances that can be used to
  register or look up schemas.

  * Type: list
  * Default: ""
  * Importance: high

``auto.register.schemas``
  Specify whether the serializer should attempt to register the schema with Schema
  Registry.

  * Type: boolean
  * Default: true
  * Importance: medium

``max.schemas.per.subject``
  Maximum number of schemas to create or cache locally.

  * Type: int
  * Default: 1000
  * Importance: low

``key.subject.name.strategy``
  Determines how to construct the subject name under which the key schema is
  registered with Schema Registry. Any implementation of
  ``io.confluent.kafka.serializers.subject.SubjectNameStrategy`` can be specified.
  By default, ``<topic>-key`` is used as the subject.

  * Type: class
  * Default: class io.confluent.kafka.serializers.subject.TopicNameStrategy
  * Importance: medium

``value.subject.name.strategy``
  Determines how to construct the subject name under which the value schema is
  registered with Schema Registry. Any implementation of
  ``io.confluent.kafka.serializers.subject.SubjectNameStrategy`` can be specified.
  By default, ``<topic>-value`` is used as the subject.

  * Type: class
  * Default: class io.confluent.kafka.serializers.subject.TopicNameStrategy
  * Importance: medium

``basic.auth.credentials.source``
  Specify how to pick the credentials for the Basic Auth header. The supported
  values are URL, USER_INFO, and SASL_INHERIT.

  * Type: string
  * Default: "URL"
  * Importance: medium

``schema.registry.basic.auth.user.info``
  Specify the user info for Basic Auth in the form of ``{username}:{password}``.

  * Type: password
  * Default: ""
  * Importance: medium
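
As the worker example above shows, converter-level options are passed to the
AvroConverter by prefixing them with ``key.converter.`` or ``value.converter.``.
The sketch below assumes the same prefixing applies to the Basic Auth options in
this table; the endpoint and the ``connect-user:connect-password`` credentials are
placeholders.

.. sourcecode:: properties

    value.converter=io.confluent.connect.avro.AvroConverter
    # Placeholder Schema Registry endpoint secured with Basic Auth
    value.converter.schema.registry.url=https://localhost:8081
    # Read credentials from the user info setting instead of the URL
    value.converter.basic.auth.credentials.source=USER_INFO
    # Placeholder credentials in {username}:{password} form
    value.converter.schema.registry.basic.auth.user.info=connect-user:connect-password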
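
The subject name strategies can be overridden the same way. The sketch below
assumes the ``RecordNameStrategy`` implementation of
``io.confluent.kafka.serializers.subject.SubjectNameStrategy`` bundled with the
serializers is available on the worker; any other implementation can be
substituted.

.. sourcecode:: properties

    value.converter=io.confluent.connect.avro.AvroConverter
    value.converter.schema.registry.url=http://localhost:8081
    # Register value schemas under the Avro record's fully-qualified name
    # rather than the default <topic>-value subject
    value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy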