Kafka Connect Single Message Transform Reference for Confluent Cloud

Single Message Transformations (SMTs) are applied to messages as they flow through Connect. SMTs transform inbound messages after a source connector has produced them, but before they are written to Kafka. SMTs transform outbound messages before they are sent to a sink connector. The following SMTs are available for use with Kafka Connect.
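To show how transforms are attached to a connector, here is a minimal configuration sketch. It assumes a hypothetical source connector; the transform aliases (addTopic, maskSsn) and the origin_topic and ssn field names are illustrative only, not part of any specific connector. Transforms are declared by listing aliases in the transforms property and configuring each alias through transforms.<alias>.* properties:

    "transforms": "addTopic,maskSsn",
    "transforms.addTopic.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "transforms.addTopic.topic.field": "origin_topic",
    "transforms.maskSsn.type": "org.apache.kafka.connect.transforms.MaskField$Value",
    "transforms.maskSsn.fields": "ssn"

The aliases are applied in the order listed, so in this sketch each record first gets an origin_topic field holding the topic name and then has its ssn field masked before the record is written to Kafka.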

Tip

For a tutorial and a deep dive into this topic, see How to Use Single Message Transforms in Kafka Connect.

If none of the available SMTs provide the necessary transformation, you can create your own for Confluent Cloud. For more information, see Configure Custom SMT for Kafka Connectors in Confluent Cloud.

Caution

This document contains SMT information that applies to all connectors, both self-managed and fully-managed. However, some links on this page lead to platform-specific docs that do not apply to fully-managed connectors. For example, RegexRouter and custom SMTs are not currently available for fully-managed connectors.

Each transform is listed below with a description of what it does.

AdjustPrecisionAndScale: Adjust the precision and scale of decimal fields within a record’s key or value.

BytesToString: Convert binary data (bytes field) in a Kafka Connect structure into a string format.

Cast: Cast fields or the entire key or value to a specific type (for example, to force an integer field to a smaller width).

ChangeTopicCase: Change the case of the record topic name (for example, to uppercase or lowercase).

Drop: Drop either the key or the value from a record and set it to null.

DropHeaders: Drop one or more headers from each record.

EventRouter: Only available for Oracle XStream CDC source and managed Debezium connectors. Route Debezium outbox events using a regex configuration option.

ExtractField: Extract the specified field from a Struct when a schema is present, or from a Map in the case of schemaless data. Any null values are passed through unmodified.

ExtractNestedField: Promote a nested field within the record key or value to a top-level field to simplify data access for downstream systems.

ExtractTimestamp: Extract the timestamp from a field within the record key or value and use it to overwrite the record’s existing timestamp.

ExtractTopic: Replace the record topic with a new topic derived from its key or value.

ExtractXPath: Extract specific elements or attributes from XML data and place them into the record’s key or value.

Filter (Apache Kafka): Drop all records. Designed to be used in conjunction with a Predicate.

Filter (Confluent): Include or drop records that match a configurable filter.condition.

Flatten (Kafka): Flatten a nested data structure. This generates names for each field by concatenating the field names at each level with a configurable delimiter character.

Flatten (Confluent): Flatten a nested data structure, including nested maps. This generates names for each field by concatenating the field names at each level with a configurable delimiter character.

FromXML: Convert XML data (stored as bytes or a string) into a strongly typed Connect structure, allowing conversion to formats like Avro.

GzipDecompress: Not currently available for managed connectors. Gzip-decompress the entire byteArray key or value input.

HeaderFrom: Move or copy fields in the record key or value into the record’s headers.

HeaderToValue: Move or copy fields in the record header into the record key or value.

HoistField: Wrap data using the specified field name in a Struct when a schema is present, or in a Map in the case of schemaless data.

InsertField: Insert a field using attributes from the record metadata or a configured static value.

InsertHeader: Insert a literal value as a record header.

KeyToValue: Convert the record key into a field within the record value (by default, a string field named _key) to store key data directly within the value.

MaskField: Mask specified fields with a valid null value for the field type.

MessageTimeStampRouter: Update the record’s topic field as a function of the original topic value and the record’s timestamp field.

RegexRouter: Not currently available for managed connectors. Update the record topic using the configured regular expression and replacement string.

ReplaceField (Apache Kafka): Filter or rename fields.

ReplaceField (Confluent): Filter or rename nested fields.

SetMaximumPrecision: Only available for managed source connectors. Ensure that all decimal fields in a Kafka Connect structure do not exceed a specified maximum precision.

SetSchemaMetadata: Set the schema name, version, or both on the record’s key or value schema.

TimestampConverter: Convert timestamps between different formats such as Unix epoch, strings, and Connect Date and Timestamp types.

TimestampNow: Replace the existing timestamp of an incoming record with the precise time the record is processed by the connector.

TimestampNowField: Add a new field to the record key or value, populated with the system timestamp at the moment of processing.

TimestampRouter: Update the record’s topic field as a function of the original topic value and the record timestamp.

TimezoneConverter: Only available for Oracle XStream CDC source and managed Debezium connectors. Convert the timezone of timestamp fields in Debezium change event records to a specified target timezone.

TombstoneHandler: Manage tombstone records. A tombstone record is defined as a record whose entire value is null, whether or not it has a value schema.

TopicRegexRouter: Only available for managed source connectors. Update the record topic using the configured regular expression and replacement string.

ValueToKey: Replace the record key with a new key formed from a subset of fields in the record value. A configuration sketch that chains ValueToKey with ExtractField follows this list.
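For example, a common pattern is to chain ValueToKey with ExtractField to set the record key from a single field in the value: ValueToKey copies the chosen field into the key as a Struct, and ExtractField (applied to the key) then promotes that field so the key becomes a primitive value. The sketch below is illustrative only; the customer_id field and the alias names are assumptions, not part of any particular connector:

    "transforms": "copyIdToKey,extractKeyField",
    "transforms.copyIdToKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.copyIdToKey.fields": "customer_id",
    "transforms.extractKeyField.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractKeyField.field": "customer_id"

Because ExtractField is configured with the $Key variant here, only the record key is modified; the value is left unchanged.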