Data Type Mappings in Confluent Cloud for Apache Flink

Confluent Cloud for Apache Flink® supports records in the Avro Schema Registry, JSON Schema Registry (JSON_SR), and Protobuf Schema Registry formats.

Avro schemas

Known limitations

  • Avro enums have limited support. Flink can read and write enums but treats them as the STRING type, so from Flink’s perspective an enum is indistinguishable from a string. You can’t create an Avro schema from Flink that has an enum field.
  • Flink doesn’t support reading Avro time-micros as a TIME type, because Flink’s TIME supports a precision of at most 3 (milliseconds), while time-micros has microsecond precision. Instead, time-micros values are read and written as BIGINT.
  • Field names must match Avro criteria. Avro expects field names to start with [A-Za-z_] and subsequently contain only [A-Za-z0-9_].
  • These Flink types are not supported:

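The naming rule and the time-micros limitation above can be sketched in a few lines of Python. This is an illustrative sketch, not part of any Flink or Confluent API: `is_valid_avro_field_name` is a hypothetical helper that applies the Avro naming pattern described above, and the final lines show how a time-micros value (microseconds since midnight, carried as BIGINT) can be decoded, assuming that representation.

```python
import re

# Hypothetical helper illustrating the Avro naming rule: a field name must
# start with [A-Za-z_] and subsequently contain only [A-Za-z0-9_].
AVRO_NAME_RE = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def is_valid_avro_field_name(name: str) -> bool:
    return bool(AVRO_NAME_RE.match(name))

print(is_valid_avro_field_name("user_id"))     # True
print(is_valid_avro_field_name("2fast"))       # False: starts with a digit
print(is_valid_avro_field_name("first-name"))  # False: '-' is not allowed

# time-micros counts microseconds since midnight. Flink surfaces it as
# BIGINT because TIME supports at most millisecond (precision 3) values.
micros = 45_296_789_123  # a BIGINT value as Flink would read it
seconds, us = divmod(micros, 1_000_000)
h, rem = divmod(seconds, 3600)
m, s = divmod(rem, 60)
print(f"{h:02}:{m:02}:{s:02}.{us:06}")  # 12:34:56.789123
```

Truncating the microsecond component to milliseconds would lose precision, which is why the value is kept as BIGINT rather than converted to TIME.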
JSON schemas

Protobuf schemas