
Schema Registry

Schema Registry provides a serving layer for your metadata. It provides a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, provides multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in the Avro format.


Start by running the Schema Registry and the services it depends on: ZooKeeper and Kafka:

$ ./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties &
$ ./bin/kafka-server-start ./etc/kafka/server.properties &
$ ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties &

See the Confluent Platform quickstart for a more detailed explanation of how to get these services up and running.

# Register a new version of a schema under the subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key/versions

# Register a new version of a schema under the subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-value/versions
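The --data payload wraps the Avro schema itself as an escaped JSON string. When scripting, you can generate that wrapper from a schema file instead of escaping quotes by hand; a sketch using jq (jq and the schema.avsc filename are assumptions here, not part of the quickstart):

```shell
# Wrap the contents of a raw Avro schema file (a hypothetical schema.avsc)
# in the {"schema": "..."} envelope the REST API expects, letting jq
# handle the inner-JSON escaping.
printf '{"type": "string"}' > schema.avsc
jq -n -c --arg schema "$(cat schema.avsc)" '{schema: $schema}'
# → {"schema":"{\"type\": \"string\"}"}
```

The resulting string can be passed directly to curl via --data.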

# List all subjects
$ curl -X GET http://localhost:8081/subjects

# List all schema versions registered under the subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions

# Fetch a schema by globally unique id 1
$ curl -X GET http://localhost:8081/schemas/ids/1

# Fetch version 1 of the schema registered under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/1

# Fetch the most recently registered schema under subject "Kafka-value"
$ curl -X GET http://localhost:8081/subjects/Kafka-value/versions/latest
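These endpoints return JSON documents, so in scripts it is handy to pull out individual fields. A sketch against a sample response (the response literal below is illustrative, and jq is an assumption, not part of the quickstart):

```shell
# A response of the shape returned by
# /subjects/Kafka-value/versions/latest (sample values).
response='{"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}'

# Extract the schema's globally unique id...
echo "$response" | jq '.id'          # → 1
# ...and the raw Avro schema string.
echo "$response" | jq -r '.schema'   # → "string"
```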

# Check whether a schema has been registered under subject "Kafka-key"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/subjects/Kafka-key

# Test compatibility of a schema with the latest schema under subject "Kafka-value"
$ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"string\"}"}' \
    http://localhost:8081/compatibility/subjects/Kafka-value/versions/latest

# Get top level config
$ curl -X GET http://localhost:8081/config

# Update compatibility requirements globally
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' \
    http://localhost:8081/config

# Update compatibility requirements under the subject "Kafka-value"
$ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "BACKWARD"}' \
    http://localhost:8081/config/Kafka-value


See the installation instructions for the Confluent Platform. Before starting the Schema Registry you must start Kafka. The Confluent Platform quickstart explains how to start these services locally for testing.


Starting the Schema Registry service is simple once its dependencies are running.

Note: The Schema Registry version must not exceed the CP/Kafka version. That is to say, Schema Registry 3.0 is not compatible with Kafka 0.9.x. See the Requirements section below for version compatibility.

$ cd confluent-3.0.0/

# The default settings in schema-registry.properties work automatically with
# the default settings for local ZooKeeper and Kafka nodes.
$ bin/schema-registry-start etc/schema-registry/schema-registry.properties

If you installed Debian or RPM packages, you can simply run schema-registry-start as it will be on your PATH. The schema-registry.properties file contains the configuration settings. The default configuration included with the Schema Registry provides convenient defaults for a local testing setup and should be modified for a production deployment. By default the server starts bound to port 8081, expects ZooKeeper to be available at localhost:2181, and a Kafka broker at localhost:9092.
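For reference, the local-testing defaults described above correspond to entries like the following in etc/schema-registry/schema-registry.properties (a sketch; exact key names can vary between versions):

```
# Where the REST interface listens (default port 8081)
listeners=http://0.0.0.0:8081
# ZooKeeper ensemble used to bootstrap the Kafka-backed store
kafkastore.connection.url=localhost:2181
# Kafka topic that backs the registry's schema storage
kafkastore.topic=_schemas
debug=false
```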

If you started the service in the background, you can use the following command to stop it:

$ bin/schema-registry-stop


To build a development version, you may need development versions of common and rest-utils. After installing these, you can build the Schema Registry with Maven. All the standard lifecycle phases work. During development, use

$ mvn compile

to build,

$ mvn test

to run the unit and integration tests, and

$ mvn exec:java

to run an instance of the Schema Registry against a local Kafka cluster (using the default configuration included with Kafka).

To create a packaged version, optionally skipping the tests:

$ mvn package [-DskipTests]

This will produce a version ready for production in package/target/kafka-schema-registry-package-$VERSION-package containing a directory layout similar to the packaged binary versions. You can also produce a standalone fat jar using the standalone profile:

$ mvn package -P standalone [-DskipTests]

generating package/target/kafka-schema-registry-package-$VERSION-standalone.jar, which includes all the dependencies as well.


Requirements

  • Kafka:


The Schema Registry is licensed under the Apache 2 license.