Schema Registry

Schema Registry provides a serving layer for your metadata. It exposes a RESTful interface for storing and retrieving Avro schemas, keeps a versioned history of all schemas, and supports multiple compatibility settings that govern how schemas may evolve. It also provides serializers that plug into Kafka clients to handle schema storage and retrieval for Kafka messages sent in the Avro format.
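For example, a Kafka producer can be pointed at these serializers through its client configuration. The sketch below is illustrative only; the serializer class and the `schema.registry.url` property are based on the Confluent Avro serializers, and the URL is the registry's default local address:

```properties
# Illustrative producer settings (a sketch, not a complete config):
# route keys and values through the Confluent Avro serializers, which
# register and fetch schemas from the Schema Registry at the given URL.
key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081
```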


Start by running the Schema Registry and the services it depends on: ZooKeeper and Kafka:

$ ./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties &
$ ./bin/kafka-server-start ./etc/kafka/server.properties &
$ ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties &

See the Confluent Platform quickstart for a more detailed explanation of how to get these services up and running.

# Register a new version of a schema under the subject "Kafka-key"
$ curl -X POST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key/versions

# Register a new version of a schema under the subject "Kafka-value"
$ curl -X POST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-value/versions

# List all subjects
$ curl -X GET -i http://localhost:8081/subjects

# List all schema versions registered under the subject "Kafka-value"
$ curl -X GET -i http://localhost:8081/subjects/Kafka-value/versions

# Fetch a schema by globally unique id 1
$ curl -X GET -i http://localhost:8081/schemas/ids/1

# Fetch version 1 of the schema registered under subject "Kafka-value"
$ curl -X GET -i http://localhost:8081/subjects/Kafka-value/versions/1

# Fetch the most recently registered schema under subject "Kafka-value"
$ curl -X GET -i http://localhost:8081/subjects/Kafka-value/versions/latest
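Each POST body in these examples embeds an Avro schema as an escaped JSON string, which is easy to get wrong by hand. One way to build the payload programmatically is to serialize the schema twice — once for the schema itself, once for the request body. A minimal sketch in Python:

```python
import json

# The Avro schema we want to register, as a plain Python structure.
avro_schema = {"type": "string"}

# The registry expects {"schema": "<schema as a JSON *string*>"}, so the
# schema is JSON-encoded first, then embedded in the request body.
payload = json.dumps({"schema": json.dumps(avro_schema)})

print(payload)  # {"schema": "{\"type\": \"string\"}"}
```

The resulting string can be passed directly as curl's `--data` argument, matching the escaped form shown above.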

# Check whether a schema has been registered under subject "Kafka-key"
$ curl -X POST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key

# Test compatibility of a schema with the latest schema under subject "Kafka-value"
$ curl -X POST -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/compatibility/subjects/Kafka-value/versions/latest

# Get top level config
$ curl -X GET -i http://localhost:8081/config
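The compatibility setting determines which new schema versions the registry will accept for a subject. As a rough illustration (not the registry's actual algorithm): under BACKWARD compatibility, a reader using the new schema must still be able to decode data written with the old one, so, for instance, a field added to a record needs a default value. A hypothetical, heavily simplified check in Python:

```python
# Two versions of a record schema (hypothetical example): the new version
# adds a field, but gives it a default so old data can still be read.
old_schema = {
    "type": "record",
    "name": "User",
    "fields": [{"name": "name", "type": "string"}],
}
new_schema = {
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "age", "type": "int", "default": 0},
    ],
}

def added_fields_have_defaults(old, new):
    """Grossly simplified backward-compatibility check: every field that
    exists only in the new schema must carry a default value."""
    old_names = {f["name"] for f in old["fields"]}
    return all(
        "default" in f for f in new["fields"] if f["name"] not in old_names
    )

print(added_fields_have_defaults(old_schema, new_schema))  # True
```

The registry applies the full Avro schema-resolution rules, of which this default-value rule is only one piece.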

# Update compatibility requirements globally
$ curl -X PUT -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' \
    http://localhost:8081/config

# Update compatibility requirements under the subject "Kafka-value"
$ curl -X PUT -i -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "BACKWARD"}' \
    http://localhost:8081/config/Kafka-value


See the installation instructions for the Confluent Platform. Before starting the Schema Registry, you must start ZooKeeper and Kafka. The Confluent Platform quickstart explains how to start these services locally for testing.

Starting the Schema Registry service is simple once its dependencies are running:

$ cd confluent-1.0/

# The default settings in schema-registry.properties work automatically with
# the default settings for local ZooKeeper and Kafka nodes.
$ bin/schema-registry-start etc/schema-registry/schema-registry.properties

If you installed Debian or RPM packages, you can simply run schema-registry-start as it will be on your PATH. If you started the service in the background, you can use the following command to stop it:

$ bin/schema-registry-stop


The REST interface to the Schema Registry includes a built-in Jetty server. The wrapper scripts bin/schema-registry-start and bin/schema-registry-stop are the recommended method of starting and stopping the service. However, you can also start the server directly yourself:

$ bin/schema-registry-start [schema-registry.properties]

where schema-registry.properties contains configuration settings as specified by the SchemaRegistryConfig class. Although the properties file is not required, the default configuration is not intended for production. Production deployments should specify a properties file. By default the server starts bound to port 8081, expects ZooKeeper to be available at localhost:2181, and a Kafka broker at localhost:9092.
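As a sketch, a minimal properties file matching those defaults might look like the following. The exact keys are defined by SchemaRegistryConfig; treat the names below as assumptions to verify against your version:

```properties
# HTTP port for the REST interface (default noted above).
port=8081
# ZooKeeper ensemble for the Kafka cluster that backs the registry.
kafkastore.connection.url=localhost:2181
# Kafka topic used as the registry's durable log of schemas.
kafkastore.topic=_schemas
```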


To build a development version, you may need development versions of common and rest-utils. After installing these, you can build the Schema Registry with Maven. All the standard lifecycle phases work. During development, use

$ mvn compile

to build,

$ mvn test

to run the unit and integration tests, and

$ mvn exec:java

to run an instance of the Schema Registry against a local Kafka cluster (using the default configuration included with Kafka).

To create a packaged version, optionally skipping the tests:

$ mvn package [-DskipTests]

This will produce a version ready for production in package/target/kafka-schema-registry-package-$VERSION-package containing a directory layout similar to the packaged binary versions. You can also produce a standalone fat jar using the standalone profile:

$ mvn package -P standalone [-DskipTests]

generating package/target/kafka-schema-registry-package-$VERSION-standalone.jar, which includes all the dependencies as well.


The Schema Registry is licensed under the Apache 2 license.