Quick Start – Use Client-Side Field Level Encryption to Protect Sensitive Data

Follow the steps below to use client-side field level encryption (CSFLE) to protect sensitive data in Confluent Cloud on AWS, Azure, and Google Cloud.

Requirements

CSFLE requires the Advanced Stream Governance package.

To run the examples below on Confluent Cloud, add the following properties to the producer and consumer commands, where config.properties contains the properties for connecting to the Kafka cluster on Confluent Cloud (a sample config.properties sketch follows this list):

--producer.config config.properties (for the producer)
--consumer.config config.properties (for the consumer)
--property basic.auth.credentials.source=USER_INFO
--property basic.auth.user.info=${SR_API_KEY}:${SR_API_SECRET}
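
For reference, a minimal config.properties for a Confluent Cloud cluster typically looks like the sketch below; the bootstrap server, API key, and API secret values are placeholders for your own cluster. When running against Confluent Cloud, schema.registry.url should also point at your Confluent Cloud Schema Registry endpoint rather than localhost.

# Connection to the Kafka cluster on Confluent Cloud (placeholder values)
bootstrap.servers=pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='<KAFKA_API_KEY>' password='<KAFKA_API_SECRET>';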

The following example shows CSFLE with the Avro data format, using AWS KMS as the KMS provider.

Step 1 - Configure the KMS provider

  1. Configure AWS KMS. For more information, see AWS KMS.
  2. Add the environment variables for your AWS credentials to both the producer and consumer:

     export AWS_ACCESS_KEY_ID=XXXX
     export AWS_SECRET_ACCESS_KEY=XXXX

Step 2 - Start the producer

./bin/kafka-avro-console-producer --broker-list localhost:9092 --property schema.registry.url=http://localhost:8081 --topic test  \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "aws-kek1", "encrypt.kms.key.id": "arn:aws:kms:us-east-1:xxxx:key/xxxx", "encrypt.kms.type": "aws-kms" }, "onFailure": "ERROR,NONE"}]}'
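
The value.rule.set defines an encryption rule named encryptPII: it references a key encryption key (KEK) called aws-kek1 backed by the AWS KMS key ARN in encrypt.kms.key.id, and encrypts every field tagged PII. Once the producer starts, type records as JSON on standard input; for example, the following record (the value is just an illustration) has its f1 field encrypted before it is sent:

{"f1": "my-sensitive-data"}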

Step 3 - Start the consumer with decryption

./bin/kafka-avro-console-consumer --topic test \
--bootstrap-server localhost:9092 \
--property schema.registry.url=http://localhost:8081
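
Because the AWS credential environment variables from Step 1 are also set for the consumer, the deserializer transparently decrypts the f1 field and prints the original value, for example:

{"f1":"my-sensitive-data"}

A consumer without access to the KMS key would instead see ciphertext in the f1 field, since the rule's onFailure value of ERROR,NONE fails the produce on an encryption error but leaves the field untouched when decryption fails on read.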