Schema Registry Maven Plugin¶
A Maven plugin for Confluent Schema Registry is available to help throughout the development process, with configuration options as listed below.
Tip
There is no official out-of-the-box Gradle plugin available for Schema Registry. However, you can use one of the following community Gradle plugins, or reference the Maven plugin in your pom.xml (Project Object Model file):
- Imflog Kafka Schema Registry Gradle Plugin (see the plugin's GitHub repository)
- com.commercehub.gradle.plugin.avro (see the plugin's GitHub repository)
- Maven plugin, used in the pom.xml example file by Avro clients for the Schema Registry Tutorials
configs for all goals¶
Starting with Confluent Platform 7.0.0, the configs option is available for the Schema Registry Maven plugin for all goals. You can use configs to add any valid configuration to the CachedSchemaRegistryClient.
The syntax details are:
configs
- Type: Map<String, String>
- Required: false
For example, to set up SSL for the Maven plugin, specify the keystore/truststore location by adding the following configuration to the plugin:
<configuration>
  <configs>
    <schema.registry.ssl.keystore.location>path-to-keystore.jks</schema.registry.ssl.keystore.location>
    <schema.registry.ssl.keystore.password>password</schema.registry.ssl.keystore.password>
    <schema.registry.ssl.truststore.location>path-to-truststore.jks</schema.registry.ssl.truststore.location>
    <schema.registry.ssl.truststore.password>password</schema.registry.ssl.truststore.password>
  </configs>
</configuration>
Note that the schema.registry. prefix is required, just as with other Schema Registry client configurations for HTTPS.
schema-registry:download¶
The download goal is used to pull down schemas from a Schema Registry server. This plugin is used to download schemas for the requested subjects and write them to a folder on the local file system. If the versions array is empty, the latest schemas are downloaded for all subjects. If the array is populated, its length must match the length of the subjectPatterns array. This allows you to compare a new schema against a specific schema and to retrieve older versions.
Tip
In Confluent Platform version 7.2.0-0 and later, you can specify a version to download, which better supports schema-registry:test-local-compatibility. Prior to Confluent Platform 7.2.0-0, you could only download the latest version.
schemaRegistryUrls
Schema Registry URLs to connect to.
- Type: String[]
- Required: true
userInfoConfig
User credentials for connecting to Schema Registry, of the form user:password. This is required if connecting to Confluent Cloud Schema Registry.
- Type: String
- Required: false
- Default: null
outputDirectory
Output directory to write the schemas to.
- Type: File
- Required: true
schemaExtension
The file extension to use for the output file name. This must begin with a . character.
- Type: String
- Required: false
- Default: .avsc
subjectPatterns
The subject patterns to download. This is a list of regular expressions. Patterns must match the entire subject name.
- Type: String[]
- Required: true
versions
The schema versions to download, one entry per pattern in subjectPatterns. An entry can be a version number or latest. If the array is empty, the latest version of each matching subject is downloaded.
- Type: String[]
- Required: false
Example 1: Download the latest version of a schema¶
The following code uses the plugin to download the latest version of schemas with the subject pattern ^TestSubject000-(key|value)$:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://192.168.99.100:8081</param>
    </schemaRegistryUrls>
    <outputDirectory>src/main/avro</outputDirectory>
    <subjectPatterns>
      <param>^TestSubject000-(key|value)$</param>
    </subjectPatterns>
  </configuration>
</plugin>
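With the plugin declared in your pom.xml, you can then invoke the goal directly from the command line. For example (depending on your Maven settings and plugin prefix resolution, you may need the fully qualified form mvn io.confluent:kafka-schema-registry-maven-plugin:download):
mvn schema-registry:download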
Example 2: Specify a schema version to download¶
The following code uses the plugin to download version 1 of schemas with the subject pattern ^topic1-(key|value)$, and the latest version of schemas with the subject pattern ^topic2-(key|value)$:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://127.0.0.1:8081</param>
    </schemaRegistryUrls>
    <outputDirectory>outputDir/target/</outputDirectory>
    <subjectPatterns>
      <param>^topic1-(key|value)$</param>
      <param>^topic2-(key|value)$</param>
    </subjectPatterns>
    <versions>
      <param>1</param>
      <param>latest</param>
    </versions>
  </configuration>
</plugin>
schema-registry:set-compatibility¶
The schema-registry:set-compatibility goal is available in Confluent Platform version 7.2.0-0 and later. This goal is used to update the compatibility configuration of a subject, or the global configuration, directly from the plugin. This enables you to change compatibility levels as schemas evolve, and to centralize your subject and schema management.
schemaRegistryUrls
Schema Registry URLs to connect to.
- Type: String[]
- Required: true
compatibilityLevels
Map of subjects and the compatibility types to be set on them, respectively.
- Type: Map<String,String> (Subject, CompatibilityLevel)
- Required: true
If the subject is NULL or __GLOBAL, the global compatibility configuration is updated. If the CompatibilityLevel is NULL, the configuration is deleted.
The following example uses the plugin to set the compatibility of the subject order to BACKWARD and product to FORWARD_TRANSITIVE, to delete the compatibility of customer, and to change the global compatibility to BACKWARD_TRANSITIVE:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://192.168.99.100:8081</param>
    </schemaRegistryUrls>
    <compatibilityLevels>
      <order>BACKWARD</order>
      <product>FORWARD_TRANSITIVE</product>
      <customer>null</customer>
      <__GLOBAL>BACKWARD_TRANSITIVE</__GLOBAL>
    </compatibilityLevels>
  </configuration>
</plugin>
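Once the compatibility levels are configured as above, the goal can be run directly from the command line. For example:
mvn schema-registry:set-compatibility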
schema-registry:test-local-compatibility¶
This goal, available in Confluent Platform version 7.2.0-0 and later, tests compatibility of a local schema with other existing local schemas during development and testing phases.
Before the addition of schema-registry:test-local-compatibility
, if you
wanted to check compatibility of a new schema you had to connect to the Schema Registry.
This meant registering all the schemas for which you want to perform
compatibility checks, resulting in a reduction in available free schemas.
This new goal solves that problem and supports quick, efficient
compatibility testing of local schemas as appropriate for development phases.
Tip
For examples of testing schema compatibility using GitHub Actions, see the Workflows and Examples provided below, and the kafka-github-actions demo repo.
schemas
Map of schema names to the locations of the schemas for which the compatibility test is performed.
- Type: Map<String, File>
- Required: true
previousSchemaPaths
Map of schema names to the locations of previous schemas. The compatibility test is performed for each schema against the schemas in previousSchemaPaths. The location can be a directory or a file name. If it is a directory, all files inside the directory are added; subdirectories are ignored.
- Type: Map<String, File>
- Required: true
compatibilityLevels
Map of schema names to the compatibility type for which the check is performed. CompatibilityLevel is an enum (one of NONE, BACKWARD, BACKWARD_TRANSITIVE, FORWARD, FORWARD_TRANSITIVE, FULL, or FULL_TRANSITIVE). For compatibility level BACKWARD, FORWARD, or FULL, exactly one previousSchema is expected per schema.
- Type: Map<String, CompatibilityLevel>
- Required: true
schemaTypes
Map of schema names to the schema type of each schema.
- Type: Map<String, String> (one of AVRO (default), JSON, PROTOBUF)
- Required: false
- Default: AVRO
The following example uses the plugin to configure three subjects (order, product, and customer) using schema type AVRO:
<configuration>
  <schemas>
    <order>src/main/avro/order.avsc</order>
    <product>src/main/avro/product.avsc</product>
    <customer>src/main/avro/customer.avsc</customer>
  </schemas>
  <schemaTypes>
    <order>AVRO</order>
    <product>AVRO</product>
    <customer>AVRO</customer>
  </schemaTypes>
  <compatibilityLevels>
    <order>BACKWARD</order>
    <product>FORWARD</product>
    <customer>NONE</customer>
  </compatibilityLevels>
  <previousSchemaPaths>
    <order>src/main/avro/order.avsc</order>
    <product>src/main/avro/products/</product>
    <customer>src/main/avro/customer.avsc</customer>
  </previousSchemaPaths>
</configuration>
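Because this goal checks schemas entirely on the local file system, it can be run without a connection to a Schema Registry server. For example:
mvn schema-registry:test-local-compatibility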
schema-registry:test-compatibility¶
This goal is used to read schemas from the local file system and test them for compatibility against the Schema Registry server(s). This goal can be used in a continuous integration pipeline to ensure that schemas in the project are compatible with the schemas in another environment.
Tip
For examples of testing schema compatibility using GitHub Actions, see the Workflows and Examples provided below, and the kafka-github-actions demo repo.
schemaRegistryUrls
Schema Registry URLs to connect to.
- Type: String[]
- Required: true
userInfoConfig
User credentials for connecting to Schema Registry, of the form user:password. This is required if connecting to Confluent Cloud Schema Registry.
- Type: String
- Required: false
- Default: null
subjects
Map of subject names to schema paths for the subjects to be tested.
- Type: Map<String, File>
- Required: true
Tip
Starting with Confluent Platform 5.5.5, you can specify a slash (/), and other special characters, in a subject name. To do so, first URL-encode the subject name, and then replace the characters that are still not valid in an XML name. For example, if you have a subject name such as /path/to/my.proto, URL encoding produces %2Fpath%2Fto%2Fmy.proto, which you then revise by replacing % with _x (because % is not valid in an XML name), as follows: _x2Fpath_x2Fto_x2Fmy.proto. The reasoning behind this is that the Maven plugin requires subject names to be specified as XML elements, but some characters, like slashes, are not valid in an XML name. You might want to use slashes to register a Protobuf schema that is referenced by another schema, using /path/to/my.proto for the subject. This workaround enables you to do that.
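For example, the subjects map for such a schema might look like the following (the schema file path is illustrative):
<subjects>
  <_x2Fpath_x2Fto_x2Fmy.proto>src/main/proto/my.proto</_x2Fpath_x2Fto_x2Fmy.proto>
</subjects>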
schemaTypes
String that specifies the schema type.
- Type: String (one of AVRO (default), JSON, PROTOBUF)
- Required: false
- Default: AVRO
references
Map containing a reference name and a subject.
- Type: Map<String, Reference[]>
- Required: false
verbose
Include in the output the reason the schema fails the compatibility test, in cases where it fails.
- Type: Boolean
- Required: false
- Default: true
The following example uses the plugin to configure three subjects (order, product, and customer) using schema type AVRO:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://192.168.99.100:8081</param>
    </schemaRegistryUrls>
    <subjects>
      <order>src/main/avro/order.avsc</order>
      <product>src/main/avro/product.avsc</product>
      <customer>src/main/avro/customer.avsc</customer>
    </subjects>
    <schemaTypes>
      <order>AVRO</order>
      <product>AVRO</product>
      <customer>AVRO</customer>
    </schemaTypes>
    <references>
      <order>
        <reference>
          <name>com.acme.Product</name>
          <subject>product</subject>
        </reference>
        <reference>
          <name>com.acme.Customer</name>
          <subject>customer</subject>
        </reference>
      </order>
    </references>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>test-compatibility</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Example Usage¶
Example usage of schema-registry:test-compatibility:
- Against a local Schema Registry: see the Schema Registry Tutorials.
- Against Confluent Cloud Schema Registry: see the GitHub example.
schema-registry:validate¶
This goal is used to read schemas from the local file system and validate them locally, before registering them. If you find syntax errors, you can examine and correct them before submitting schemas to Schema Registry with schema-registry:register.
schemaRegistryUrls
Schema Registry URLs to connect to.
- Type: String[]
- Required: true
userInfoConfig
User credentials for connecting to Schema Registry, of the form user:password. This is required if connecting to Confluent Cloud Schema Registry.
- Type: String
- Required: false
- Default: null
subjects
Map of subject names to schema paths for the subjects to be validated.
- Type: Map<String, File>
- Required: true
schemaTypes
String that specifies the schema type.
- Type: String (one of AVRO (default), JSON, PROTOBUF)
- Required: false
- Default: AVRO
references
Map containing a reference name and a subject.
- Type: Map<String, Reference[]>
- Required: false
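As a minimal sketch (subject names and schema paths are illustrative), a validate configuration could mirror the register example below, swapping in the validate goal:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://192.168.99.100:8081</param>
    </schemaRegistryUrls>
    <subjects>
      <order>src/main/avro/order.avsc</order>
      <product>src/main/avro/product.avsc</product>
    </subjects>
    <schemaTypes>
      <order>AVRO</order>
      <product>AVRO</product>
    </schemaTypes>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>validate</goal>
      </goals>
    </execution>
  </executions>
</plugin>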
schema-registry:register¶
This goal is used to read schemas from the local file system and register them on the target Schema Registry server(s). This goal can be used in a continuous deployment pipeline to push schemas to a new environment.
schemaRegistryUrls
Schema Registry URLs to connect to.
- Type: String[]
- Required: true
userInfoConfig
User credentials for connecting to Schema Registry, of the form user:password. This is required if connecting to Confluent Cloud Schema Registry.
- Type: String
- Required: false
- Default: null
subjects
Map of subject names to schema paths for the subjects to be registered.
- Type: Map<String, File>
- Required: true
schemaTypes
String that specifies the schema type.
- Type: String (one of AVRO (default), JSON, PROTOBUF)
- Required: false
- Default: AVRO
normalizeSchemas
Normalizes schemas based on semantic equivalence during registration or lookup. To learn more, see Schema Normalization in Formats, Serializers, and Deserializers.
- Type: Boolean
- Required: false
- Default: false
references
Map containing a reference name and a subject.
- Type: Map<String, Reference[]>
- Required: false
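For instance, to enable normalization during registration, you could add the following element to the plugin configuration shown in the example below:
<normalizeSchemas>true</normalizeSchemas>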
The following example uses the plugin to configure three subjects (order, product, and customer) using schema type AVRO:
<plugin>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-schema-registry-maven-plugin</artifactId>
  <version>7.2.11</version>
  <configuration>
    <schemaRegistryUrls>
      <param>http://192.168.99.100:8081</param>
    </schemaRegistryUrls>
    <subjects>
      <order>src/main/avro/order.avsc</order>
      <product>src/main/avro/product.avsc</product>
      <customer>src/main/avro/customer.avsc</customer>
    </subjects>
    <schemaTypes>
      <order>AVRO</order>
      <product>AVRO</product>
      <customer>AVRO</customer>
    </schemaTypes>
    <references>
      <order>
        <reference>
          <name>com.acme.Product</name>
          <subject>product</subject>
        </reference>
        <reference>
          <name>com.acme.Customer</name>
          <subject>customer</subject>
        </reference>
      </order>
    </references>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>register</goal>
      </goals>
    </execution>
  </executions>
</plugin>
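With the execution configured, the goal can be run directly, for example:
mvn schema-registry:register
Alternatively, use the fully qualified form, mvn io.confluent:kafka-schema-registry-maven-plugin:register, as shown in the Workflows and Examples section below.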
Workflows and Examples¶
You can integrate Maven plugin goals with GitHub Actions into a continuous integration/continuous deployment (CI/CD) pipeline to manage schemas on Schema Registry. A general example for developing and validating an Apache Kafka® client application with a Python producer and consumer is provided in the kafka-github-actions demo repo.
Here is an alternative sample pom.xml with project configurations for more detailed validate and register steps.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>io.confluent</groupId>
  <artifactId>GitHub-Actions-Demo</artifactId>
  <version>1.0</version>

  <pluginRepositories>
    <pluginRepository>
      <id>confluent</id>
      <url>https://packages.confluent.io/maven/</url>
    </pluginRepository>
  </pluginRepositories>

  <properties>
    <schemaRegistryUrl><$CONFLUENT_SCHEMA_REGISTRY_URL></schemaRegistryUrl>
    <schemaRegistryBasicAuthUserInfo><$CONFLUENT_BASIC_AUTH_USER_INFO></schemaRegistryBasicAuthUserInfo>
    <confluent.version>7.2.11</confluent.version>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-schema-registry-maven-plugin</artifactId>
        <version>${confluent.version}</version>
        <configuration>
          <schemaRegistryUrls>
            <param>${schemaRegistryUrl}</param>
          </schemaRegistryUrls>
          <userInfoConfig>${schemaRegistryBasicAuthUserInfo}</userInfoConfig>
        </configuration>
        <executions>
          <execution>
            <id>validate</id>
            <phase>validate</phase>
            <goals>
              <goal>validate</goal>
            </goals>
            <configuration>
              <subjects>
                <Orders-value>src/main/resources/order.avsc</Orders-value>
                <Flights-value>src/main/resources/flight.proto</Flights-value>
              </subjects>
              <schemaTypes>
                <Flights-value>PROTOBUF</Flights-value>
              </schemaTypes>
            </configuration>
          </execution>
          <execution>
            <id>set-compatibility</id>
            <phase>validate</phase>
            <goals>
              <goal>set-compatibility</goal>
            </goals>
            <configuration>
              <compatibilityLevels>
                <Orders-value>FORWARD_TRANSITIVE</Orders-value>
                <Flights-value>FORWARD_TRANSITIVE</Flights-value>
              </compatibilityLevels>
            </configuration>
          </execution>
          <execution>
            <id>test-local</id>
            <phase>validate</phase>
            <goals>
              <goal>test-local-compatibility</goal>
            </goals>
            <configuration>
              <schemas>
                <order>src/main/resources/order.avsc</order>
                <flight>src/main/resources/flight.proto</flight>
              </schemas>
              <schemaTypes>
                <flight>PROTOBUF</flight>
              </schemaTypes>
              <previousSchemaPaths>
                <flight>src/main/resources/flightSchemas</flight>
                <order>src/main/resources/orderSchemas</order>
              </previousSchemaPaths>
              <compatibilityLevels>
                <order>FORWARD_TRANSITIVE</order>
                <flight>FORWARD_TRANSITIVE</flight>
              </compatibilityLevels>
            </configuration>
          </execution>
          <execution>
            <id>test-compatibility</id>
            <phase>validate</phase>
            <goals>
              <goal>test-compatibility</goal>
            </goals>
            <configuration>
              <subjects>
                <Orders-value>src/main/resources/order.avsc</Orders-value>
                <Flights-value>src/main/resources/flight.proto</Flights-value>
              </subjects>
              <schemaTypes>
                <Flights-value>PROTOBUF</Flights-value>
              </schemaTypes>
            </configuration>
          </execution>
          <execution>
            <id>register</id>
            <goals>
              <goal>register</goal>
            </goals>
            <configuration>
              <subjects>
                <Orders-value>src/main/resources/order.avsc</Orders-value>
                <Flights-value>src/main/resources/flight.proto</Flights-value>
              </subjects>
              <schemaTypes>
                <Flights-value>PROTOBUF</Flights-value>
              </schemaTypes>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
The following workflows can be coded as GitHub Actions to accomplish CI/CD for schema management.
When a pull request is created to merge a new schema to master, validate the schema, check local schema compatibility, set the compatibility of the subject, and test schema compatibility with the subject.
run: mvn validate
The validate step would include:
mvn schema-registry:validate@validate
mvn schema-registry:test-local-compatibility@test-local
mvn schema-registry:set-compatibility@set-compatibility
mvn schema-registry:test-compatibility@test-compatibility
Integrated with GitHub Actions, the pull-request.yaml for this step might look like this:
name: Testing branch for compatibility before merging

on:
  pull_request:
    branches: [ master ]
    paths: [src/main/resources/*]

jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'temurin'
          cache: maven
      - name: Validate if schema is valid
        run: mvn schema-registry:validate@validate

  test-local-compatibility:
    needs: validate
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'temurin'
          cache: maven
      - name: Test schema with locally present schema
        run: mvn schema-registry:test-local-compatibility@test-local

  set-compatibility:
    needs: test-local-compatibility
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'temurin'
          cache: maven
      - name: Set compatibility of subject
        run: mvn schema-registry:set-compatibility@set-compatibility

  test-compatibility:
    needs: set-compatibility
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'temurin'
          cache: maven
      - name: Test schema with subject
        run: mvn schema-registry:test-compatibility@test-compatibility
If compatibility checking passes, a new pull request is created for approval.
Register schema when a pull request is approved and merged to master.
Run the action to register the new schema on the Schema Registry:
run: mvn schema-registry:register@register
The push.yaml for this step would look like this:
name: Registering Schema on merge of pull request

on:
  push:
    branches: [ master ]
    paths: [src/main/resources/*]

jobs:
  register-schema:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'temurin'
          cache: maven
      - name: Register Schema
        run: mvn io.confluent:kafka-schema-registry-maven-plugin:register@register