# Quick Start – Use Client-Side Field Level Encryption to Protect Sensitive Data
Follow the steps below to use client-side field level encryption (CSFLE) to protect sensitive data in Confluent Cloud on AWS, Azure, and Google Cloud.
## Requirements
CSFLE requires the Advanced Stream Governance package.
To run the examples below on Confluent Cloud, add the following properties, where `config.properties` contains the properties for connecting to the Kafka cluster on Confluent Cloud:

```shell
--producer.config config.properties   # for the producer
--consumer.config config.properties   # for the consumer
--property basic.auth.credentials.source=USER_INFO
--property basic.auth.user.info=${SR_API_KEY}:${SR_API_SECRET}
```
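For reference, a `config.properties` file for a Confluent Cloud cluster typically looks roughly like the sketch below; the bootstrap endpoint and API key placeholders are illustrative and must be replaced with your own values:

```properties
# Connection to the Kafka cluster in Confluent Cloud (placeholders, not real values)
bootstrap.servers=<BOOTSTRAP_SERVER>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='<CLUSTER_API_KEY>' password='<CLUSTER_API_SECRET>';
```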
The following examples show CSFLE with the Avro data format for each of the supported KMS providers.
## Step 1 - Configure the KMS provider
### AWS KMS

1. Configure AWS KMS. For more information, see AWS KMS.
2. Add the environment variables for your AWS credentials to both the producer and consumer.

```shell
export AWS_ACCESS_KEY_ID=XXXX
export AWS_SECRET_ACCESS_KEY=XXXX
```
### Azure Key Vault

1. Configure Azure Key Vault. For more information, see Azure Key Vault.
2. Add the environment variables for your Azure credentials to both the producer and consumer.

```shell
export AZURE_TENANT_ID=XXXX
export AZURE_CLIENT_ID=XXXX
export AZURE_CLIENT_SECRET=XXXX
```
### Google Cloud KMS

1. Configure Google Cloud KMS. For more information, see Google Cloud KMS.
2. Add the environment variable for your Google Cloud credentials to both the producer and consumer.

```shell
export GOOGLE_APPLICATION_CREDENTIALS=PATH_TO_CREDS.json
```
### Hashicorp Vault

1. Configure Hashicorp Vault. For more information, see Hashicorp Vault - Getting Started.
2. Add the environment variable for your Hashicorp Vault credentials to both the producer and consumer.

```shell
export VAULT_TOKEN=dev-only-token
```
### Local

1. Create a local key.
2. Add the environment variable for your local key credentials.

```shell
export LOCAL_SECRET=<output of "openssl rand -base64 16">
```
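The local secret can also be generated and exported in one step by wrapping the `openssl` command in a shell substitution:

```shell
# Generate a 16-byte random key, base64-encoded (24 characters), and export it
LOCAL_SECRET=$(openssl rand -base64 16)
export LOCAL_SECRET
echo "$LOCAL_SECRET"
```

Note that a fresh key is generated on each run, so the same `LOCAL_SECRET` value must be set for both the producer and the consumer.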
## Step 2 - Start the producer
### AWS KMS

```shell
./bin/kafka-avro-console-producer --topic test \
  --broker-list localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "aws-kek1", "encrypt.kms.key.id": "arn:aws:kms:us-east-1:xxxx:key/xxxx", "encrypt.kms.type": "aws-kms" }, "onFailure": "ERROR,NONE"}]}'
```
### Azure Key Vault

```shell
./bin/kafka-avro-console-producer --topic test \
  --broker-list localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "azure-kek1", "encrypt.kms.key.id": "https://xxxx.vault.azure.net/keys/key1/xxxx", "encrypt.kms.type": "azure-kms" }, "onFailure": "ERROR,NONE"}]}'
```
### Google Cloud KMS

```shell
./bin/kafka-avro-console-producer --topic test \
  --broker-list localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "gcp-kek1", "encrypt.kms.key.id": "projects/xxxx/locations/us-east1/keyRings/xxxx/cryptoKeys/key1", "encrypt.kms.type": "gcp-kms" }, "onFailure": "ERROR,NONE"}]}'
```
### Hashicorp Vault

```shell
./bin/kafka-avro-console-producer --topic test \
  --broker-list localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "hcvault-kek1", "encrypt.kms.key.id": "http://127.0.0.1:8200/transit/keys/xxxx", "encrypt.kms.type": "hcvault" }, "onFailure": "ERROR,NONE"}]}'
```
### Local

```shell
./bin/kafka-avro-console-producer --topic test \
  --broker-list localhost:9092 \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string","confluent:tags":["PII"]}]}' \
  --property value.rule.set='{ "domainRules": [ { "name": "encryptPII", "type": "ENCRYPT", "tags":["PII"], "params": { "encrypt.kek.name": "local-kek1", "encrypt.kms.key.id": "mykey", "encrypt.kms.type": "local-kms" }, "onFailure": "ERROR,NONE"}]}'
```
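After the producer starts, each line typed at its prompt is serialized as an Avro record. For the schema above, a record might look like the following; the field value is illustrative:

```json
{"f1": "my-sensitive-data"}
```

Because `f1` is tagged `PII`, the `encryptPII` rule encrypts that field before the record is sent to the topic.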
## Step 3 - Start the consumer with decryption

The consumer command is the same for every KMS provider; as long as the matching KMS credentials from Step 1 are set in the consumer's environment, tagged fields are decrypted automatically:

```shell
./bin/kafka-avro-console-consumer --topic test \
  --bootstrap-server localhost:9092 \
  --property schema.registry.url=http://localhost:8081
```