Confluent’s stream quality tools help teams deliver trusted event streams across the business at scale, supporting reliable mission-critical applications, confident decision making, and a simpler approach to data standards. These tools define and enforce the data rules and definitions by which the entire system operates, determining what data is admitted and what is rejected, all in the spirit of maintaining high data integrity.
- Schema Registry allows teams to define and enforce universal data standards, enabling scalable data compatibility while reducing operational complexity. It supports the Avro, JSON Schema, and Protobuf serialization formats.
- Schema Validation, enabled at the topic level, coordinates the broker and Schema Registry: before a message is accepted for publishing, the broker verifies that the schema attached to the message is valid and registered for the destination topic.
- Schema Linking keeps schemas in sync across Schema Registry clusters. Optionally, use it in combination with Cluster Linking on Confluent Cloud to keep both schemas and topic data in sync across Schema Registry and Kafka clusters.
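To make the Schema Registry bullet concrete, here is a minimal sketch of registering an Avro schema through Schema Registry's REST API. The `Order` schema, the `orders-value` subject, and the `localhost:8081` address are illustrative assumptions; the endpoint shape (`POST /subjects/<subject>/versions` with a `{"schema": ...}` envelope) follows the documented REST API. The sketch only builds the request, so it runs without a live registry:

```python
import json

# Hypothetical Avro schema for an "orders" stream (illustrative only).
order_schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

# The REST API expects the schema as an escaped JSON string inside a
# {"schema": ...} envelope, POSTed to /subjects/<subject>/versions.
subject = "orders-value"  # default TopicNameStrategy: <topic>-value
payload = json.dumps({"schema": json.dumps(order_schema), "schemaType": "AVRO"})

def register_request(base_url: str, subject: str, payload: str):
    """Build (url, headers, body) for a schema-registration POST.
    A caller would send this with urllib.request or an HTTP client."""
    return (
        f"{base_url}/subjects/{subject}/versions",
        {"Content-Type": "application/vnd.schemaregistry.v1+json"},
        payload.encode("utf-8"),
    )

url, headers, body = register_request("http://localhost:8081", subject, payload)
print(url)
```

Once registered, producers serialize against this schema and the registry rejects later versions that break the subject's compatibility rules.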
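Schema Validation above is switched on per topic via broker-side topic configs (this requires Confluent Server, where validation is a broker feature). A sketch, with a hypothetical topic name and bootstrap address:

```shell
# Create a topic with broker-side schema validation enabled for both
# keys and values (topic name and address are illustrative).
kafka-topics --create \
  --bootstrap-server localhost:9092 \
  --topic orders \
  --config confluent.value.schema.validation=true \
  --config confluent.key.schema.validation=true
```

With these configs set, the broker rejects produce requests whose messages do not carry a schema ID registered for the topic.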
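Schema Linking is driven by schema exporters, which continuously copy subjects from a source Schema Registry to a destination. A hedged sketch using the Confluent CLI; the exporter name, subject, and config file are illustrative, and exact flags vary by CLI version (check `confluent schema-registry exporter create --help`):

```shell
# Create a schema exporter that syncs the "orders-value" subject to a
# destination Schema Registry described in a local properties file.
confluent schema-registry exporter create my-exporter \
  --subjects "orders-value" \
  --context-type AUTO \
  --config ./dest-sr.properties
```

Pairing such an exporter with Cluster Linking keeps both the schemas and the topic data they describe in sync across environments.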