Schema Registry API Usage Examples for Confluent Platform

This section provides examples of calls to the Schema Registry API using curl commands.

You can also see a few more examples of using curl to interact with these APIs in the Schema Registry Tutorial.

When testing the API, you may want to pipe curl output through jq (along with curl's --silent flag) to get nicely formatted results. For example, the command curl -X GET http://localhost:8081/subjects returns:

["my-cool-topic-value","my-other-cool-topic-value"]

The same command run in silent mode and piped through jq (curl --silent -X GET http://localhost:8081/subjects | jq) returns:

[
  "my-cool-topic-value",
  "my-other-cool-topic-value"
]

Starting Schema Registry

Start Schema Registry and its dependent services ZooKeeper and Kafka. Each service reads its configuration from its property files under etc.

Development or Test Environment

You can use the Confluent CLI to start Schema Registry and its dependent services with a single command:

confluent local services schema-registry start

Important

The Confluent CLI confluent local commands are intended for a single-node development environment and are not suitable for a production environment. The data produced are transient and intended to be temporary. For production-ready workflows, see Install and Upgrade Confluent Platform.

Production Environment

Start each Confluent Platform service in its own terminal using this order of operations:

  1. Start ZooKeeper. Run this command in its own terminal.

    bin/zookeeper-server-start ./etc/kafka/zookeeper.properties
    
  2. Start Kafka. Run this command in its own terminal.

    bin/kafka-server-start ./etc/kafka/server.properties
    
  3. Start Schema Registry. Run this command in its own terminal.

    bin/schema-registry-start ./etc/schema-registry/schema-registry.properties
    

See Install Confluent Platform On-Premises for a more detailed explanation of how to get these services up and running.

Common Schema Registry API Usage Examples

These examples use curl commands to interact with the Schema Registry API.

Commands and results are shown separately to make it easy to copy-paste the commands into a shell.

For Schema Registry on Confluent Cloud, pass the API key and secret with the --user (or -u) flag on the curl command. For example, to view all subjects in the registry:

curl --user <schema-registry-api-key>:<schema-registry-api-secret> \
<schema-registry-url>/subjects

Register a new version of a schema under the subject “Kafka-key”

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\": \"string\"}"}' \
  http://localhost:8081/subjects/Kafka-key/versions

Example result:

{"id":1}
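Note that the --data payload embeds the schema itself as a JSON-escaped string inside a JSON body, which is why the inner quotes are backslash-escaped. If you build these requests programmatically, a double json.dumps produces the same payload. A minimal Python sketch (the helper name is illustrative, not part of any Confluent client):

```python
import json

def build_register_payload(schema: dict) -> str:
    """Build the body for POST /subjects/<subject>/versions.

    The registry expects the schema as a JSON string embedded inside
    the JSON body, hence the double json.dumps.
    """
    return json.dumps({"schema": json.dumps(schema)})

# The Avro schema {"type": "string"} becomes the escaped payload
# used in the curl example above.
payload = build_register_payload({"type": "string"})
print(payload)  # {"schema": "{\"type\": \"string\"}"}
```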

Register a new version of a schema under the subject “Kafka-value”

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data '{"schema": "{\"type\": \"string\"}"}' \
http://localhost:8081/subjects/Kafka-value/versions

Example result:

{"id":1}

Register an existing schema to a new subject name

Use case: there is an existing schema registered to a subject called Kafka1, and this same schema needs to be available to another subject called Kafka2. The following one-line command reads the existing schema from Kafka1-value and registers it to Kafka2-value. It assumes the tool jq is installed on your machine.

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
--data "{\"schema\": $(curl -s http://localhost:8081/subjects/Kafka1-value/versions/latest | jq '.schema')}" \
http://localhost:8081/subjects/Kafka2-value/versions

Example result:

{"id":1}

Tip

You do not need to use the AvroConverter for topic replication or schema management, even if the topic is in Avro format. The ByteArrayConverter passes message bytes through unchanged, preserving the leading "magic byte" and the schema ID that follows it. When a replicator is created, messages are replicated with the schema ID intact, so you do not need to create a schema subject. A best practice is to avoid the AvroConverter in this context, because it adds overhead without real value.

To learn more, see Tutorial: Replicate Data Across Kafka Clusters in Confluent Platform, including the use of topic.rename.format=${topic}.replica in the subsection on Configure and run Replicator, and Configure Replicator for Cross-Cluster Failover in Confluent Platform.
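The "magic byte" mentioned above refers to the Confluent wire format: a serialized message starts with one magic byte (0), followed by a 4-byte big-endian schema ID, then the payload. A minimal Python sketch of reading the ID out of such a message (the function name is illustrative):

```python
import struct

MAGIC_BYTE = 0

def extract_schema_id(message: bytes) -> int:
    """Read the schema ID from a Confluent wire-format message:
    1 magic byte (always 0), then a 4-byte big-endian schema ID,
    then the serialized payload."""
    if len(message) < 5 or message[0] != MAGIC_BYTE:
        raise ValueError("not a Confluent wire-format message")
    (schema_id,) = struct.unpack(">I", message[1:5])
    return schema_id

# A message carrying schema ID 1 (the payload itself is irrelevant here):
msg = bytes([MAGIC_BYTE]) + struct.pack(">I", 1) + b"serialized-payload"
print(extract_schema_id(msg))  # 1
```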

List all subjects

The following API call lists all schema subjects.

curl -X GET http://localhost:8081/subjects

Example result:

["Kafka-value","Kafka-key"]

You can append the deleted query parameter (?deleted=true) to the request to list all subjects, including subjects that have been soft-deleted.

curl -X GET http://localhost:8081/subjects?deleted=true

Example result, assuming you had a schema subject called “my-cool-topic-value” that was previously soft-deleted:

["Kafka-value","Kafka-key","my-cool-topic-value"]

List all subjects associated with a given ID

To find subjects associated with a given ID, use GET /schemas/ids/{int: id}/versions.

Fetch a schema by globally unique ID 1

curl -X GET http://localhost:8081/schemas/ids/1

Example result:

{"schema":"\"string\""}

List all schema versions registered under the subject “Kafka-value”

curl -X GET http://localhost:8081/subjects/Kafka-value/versions

Example result:

[1]

Fetch Version 1 of the schema registered under subject “Kafka-value”

curl -X GET http://localhost:8081/subjects/Kafka-value/versions/1

Example result:

{"subject":"Kafka-value","version":1,"id":1,"schema":"\"string\""}

Tip

If the schema type is JSON Schema or Protobuf, the response also includes the schema type. If the schema type is Avro (the default), the schema type is not included in the response, as in the example above.

Delete Version 1 of the schema registered under subject “Kafka-value”

curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/1

Example result:

1

Delete the most recently registered schema under subject “Kafka-value”

curl -X DELETE http://localhost:8081/subjects/Kafka-value/versions/latest

Example result:

2

Register the same schema under the subject “Kafka-value”

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\": \"string\"}"}' \
   http://localhost:8081/subjects/Kafka-value/versions

Example result:

{"id":1}

Fetch the schema again by globally unique ID 1

curl -X GET http://localhost:8081/schemas/ids/1

Example result:

{"schema":"\"string\""}

Check if a schema is registered under subject “Kafka-key”

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\": \"string\"}"}' \
  http://localhost:8081/subjects/Kafka-key

Example result:

{"subject":"Kafka-key","version":3,"id":1,"schema":"\"string\""}

Test compatibility of a schema with the latest schema under subject “Kafka-value”

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"schema": "{\"type\": \"string\"}"}' \
  http://localhost:8081/compatibility/subjects/Kafka-value/versions/latest

Example result:

{"is_compatible":true}

Tip

Starting with Confluent Platform 6.1.0, you can add ?verbose=true at the end of the request to output the reason a schema fails the compatibility test, in cases where it fails. To learn more, see Compatibility in the Schema Registry API Reference.
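To illustrate one rule this check enforces: under BACKWARD compatibility, a field added in a new Avro schema must carry a default, so that consumers using the new schema can still read data written with the old one. A toy Python sketch of that single rule (the field names are illustrative; the registry's real checker covers many more cases):

```python
old_schema = {
    "type": "record",
    "name": "Payment",
    "fields": [{"name": "id", "type": "string"}],
}

# Backward compatible: the added field has a default, so a consumer on
# the new schema can still decode records written with old_schema.
new_compatible = {
    "type": "record",
    "name": "Payment",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "region", "type": "string", "default": "unknown"},
    ],
}

# Not backward compatible: the added field has no default.
new_incompatible = {
    "type": "record",
    "name": "Payment",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "region", "type": "string"},
    ],
}

def added_fields_have_defaults(old, new):
    """Toy check for one Avro BACKWARD rule: every field added in the
    new schema must carry a default. The real checker covers far more."""
    old_names = {f["name"] for f in old["fields"]}
    return all("default" in f
               for f in new["fields"] if f["name"] not in old_names)

print(added_fields_have_defaults(old_schema, new_compatible))    # True
print(added_fields_have_defaults(old_schema, new_incompatible))  # False
```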

Get the top level config

curl -X GET http://localhost:8081/config

Example result:

{"compatibility":"BACKWARD"}

Update compatibility requirements globally

curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "NONE"}' \
  http://localhost:8081/config

Example result:

{"compatibility":"NONE"}

Register a schema for a new topic

Tip

This example and the next few refer to a new topic called my-kafka, which is used to demonstrate subject-level compatibility configuration. These examples assume you have already created this topic, either in Confluent Control Center or at the Kafka command line. If you would like to stick with the command line and create the topic now to follow along, use commands similar to the following to create the topic, then check for its existence:

kafka-topics --create --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1 --topic my-kafka
kafka-topics --list --bootstrap-server localhost:9092

Use the Schema Registry API to add a schema for the topic my-kafka.

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"Payment\",\"namespace\":\"my.examples\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}"}' http://localhost:8081/subjects/my-kafka-value/versions

Example result:

{"id":1}

Update compatibility requirements on a subject

curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"compatibility": "FULL"}' http://localhost:8081/config/my-kafka-value

Example result:

{"compatibility":"FULL"}

Get compatibility requirements on a subject

curl -X GET http://localhost:8081/config/my-kafka-value

Example result:

{"compatibilityLevel":"FULL"}

Tip

If the subject you ask about does not have a subject-specific compatibility level set, this command returns an error code. For example, if you run the same command for the subject Kafka-value, for which you have not set subject-specific compatibility, you get:

{"error_code":40401,"message":"Subject 'Kafka-value' not found."}

Show compatibility requirements in effect for a subject

You can use the defaultToGlobal query parameter to determine what compatibility requirements, if any, are set at the subject level and which requirements will be used for compatibility checks. These are often, but not always, the same if a subject has subject-level compatibility set.

  • For the subject my-kafka-value, which has a subject-specific compatibility set to “FULL”, defaultToGlobal=true and defaultToGlobal=false both return {"compatibilityLevel":"FULL"}.

    curl -X GET http://localhost:8081/config/my-kafka-value/?defaultToGlobal=true
    

    Example result:

    {"compatibilityLevel":"FULL"}
    
    curl -X GET http://localhost:8081/config/my-kafka-value/?defaultToGlobal=false
    

    Example result:

    {"compatibilityLevel":"FULL"}
    
  • For the subject Kafka-value, for which you have not set subject-specific compatibility, defaultToGlobal=true returns the current global default, for example: {"compatibilityLevel":"NONE"}.

    curl -X GET http://localhost:8081/config/Kafka-value/?defaultToGlobal=true
    

    Example result:

    {"compatibilityLevel":"NONE"}
    
  • Whereas, defaultToGlobal=false on the subject Kafka-value returns an error code:

    curl -X GET http://localhost:8081/config/Kafka-value/?defaultToGlobal=false
    

    Example result:

    {"error_code":40401,"message":"Subject 'Kafka-value' not found."}
    
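The fallback behavior of defaultToGlobal=true can be sketched as a small resolver. This is a hypothetical helper, not part of any Confluent client; the get_config callable is injected so the logic can be shown without a live registry. Note that, as in the examples above, the global endpoint returns a compatibility key while the subject-level endpoint returns compatibilityLevel:

```python
def effective_compatibility(subject, get_config):
    """Resolve the compatibility level in effect for a subject,
    mirroring defaultToGlobal=true: use the subject-level setting
    when one exists, otherwise fall back to the global default.

    get_config is any callable mapping a registry path to its parsed
    JSON response (injected so the logic runs without a live server).
    """
    subject_cfg = get_config("/config/" + subject)
    if "error_code" not in subject_cfg:  # subject-level override is set
        return subject_cfg["compatibilityLevel"]
    return get_config("/config")["compatibility"]

# Stub responses matching the examples above:
responses = {
    "/config/my-kafka-value": {"compatibilityLevel": "FULL"},
    "/config/Kafka-value": {"error_code": 40401,
                            "message": "Subject 'Kafka-value' not found."},
    "/config": {"compatibility": "NONE"},
}

print(effective_compatibility("my-kafka-value", responses.get))  # FULL
print(effective_compatibility("Kafka-value", responses.get))     # NONE
```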

Delete all schema versions registered under the subject “Kafka-value”

curl -X DELETE http://localhost:8081/subjects/Kafka-value

Example result:

[3]

List schema types currently registered in Schema Registry

curl -X GET http://localhost:8081/schemas/types

Example result:

["JSON", "PROTOBUF", "AVRO"]

List all subject-version pairs where a given ID is used

curl -X GET http://localhost:8081/schemas/ids/2/versions

Example result:

[{"subject":"testproto-value","version":1}]

List IDs of schemas that reference a given schema

curl -X GET http://localhost:8081/subjects/other.proto/versions/1/referencedby

Example result:

[2]

Using Schema Registry over HTTPS

The curl command examples provided above show how to communicate with Schema Registry over HTTP.

These examples show how to communicate with Schema Registry over HTTPS. When HTTPS is enabled, apply the patterns shown here (specifying a certificate, key, and so forth) to run any of the other usage examples above. For more about configuring and using Schema Registry with security enabled, see Secure Schema Registry for Confluent Platform.

Verify HTTPS on Schema Registry

openssl s_client -connect schemaregistry:8082 -cert client.certificate.pem -key client.key -tls1_2

Register a new version of a schema under the subject “Kafka-key”

curl -v -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"string\"}"}' --cert /etc/kafka/secrets/client.certificate.pem --key /etc/kafka/secrets/client.key --tlsv1.2 --cacert /etc/kafka/secrets/snakeoil-ca-1.crt https://schemaregistry:8082/subjects/Kafka-key/versions

List all subjects

curl -v -X GET --cert /etc/kafka/secrets/client.certificate.pem --key /etc/kafka/secrets/client.key --tlsv1.2 --cacert /etc/kafka/secrets/snakeoil-ca-1.crt https://schemaregistry:8082/subjects/

Use curl to access Schema Registry in Confluent Cloud

You can also use curl commands to view and manage schemas on Confluent Cloud.

Schema Registry on Confluent Cloud requires that you pass the API key and secret with the --user (or -u) flag. For example, to view all subjects in the registry:

curl --user <schema-registry-api-key>:<schema-registry-api-secret> \
<schema-registry-url>/subjects

For more about using Schema Registry on Confluent Cloud, see Quick Start for Schema Management on Confluent Cloud.