Configuring Confluent Cloud Audit Logging

Audit logging is enabled by default on Standard and Dedicated clusters, so the only configuration task required is to consume from the audit log topic.

Prerequisites

Kafka client
You can use any Kafka client (for example, the Confluent Cloud CLI, C/C++, or Java) to consume from the Confluent Cloud audit log topic, as long as it supports SASL authentication. Any prerequisites are therefore specific to the Kafka client you use. For details about configuring Confluent Cloud clients, see Configure Confluent Cloud Clients.
Consume configuration
Confluent Cloud provides a configuration that you can copy and paste into the Kafka client of your choice. This configuration lets you connect to the audit log cluster and consume from the audit log topic; an example of what it typically looks like appears at the end of this section.
Cluster type
Both Standard and Dedicated cluster types support audit logs.
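
For reference, the configuration provided for a Java client generally takes a shape like the following sketch. The bootstrap server, API key, and secret shown here are placeholders; copy the exact values shown on the Audit log page for your organization.

bootstrap.servers=<audit-log-bootstrap-server>:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API-key>" password="<API-secret>";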

Access the audit log user interface

To access the audit log user interface, log in to Confluent Cloud and click the Administration menu (☰) in the upper-right corner. If your role gives you permission to manage your organization, and your organization has at least one Standard or Dedicated cluster, an Audit log menu entry is displayed. Enablement is automatic and typically takes place within five minutes of successfully provisioning your first Standard or Dedicated cluster.

Consume with CLI

  1. Log in to your Confluent Cloud organization using the Confluent Cloud CLI.

    ccloud login
    
  2. Run audit-log describe to identify which resources to use.

    ccloud audit-log describe
     +-----------------+----------------------------+
     | Cluster         | lkc-yokxv6                 |
     | Environment     | env-x11xz                  |
     | Service Account |                     292163 |
     | Topic Name      | confluent-audit-log-events |
     +-----------------+----------------------------+
    
  3. Specify the environment and cluster to use (using the data that you retrieved in the previous step).

    ccloud environment use env-x11xz
    ccloud kafka cluster use lkc-yokxv6
    
  4. If you have an existing API key and secret for audit logs, you can store them locally as shown here:

    ccloud api-key store <API-key> --resource lkc-yokxv6
    

    To view the existing API keys for your audit log cluster:

    ccloud api-key list --resource lkc-yokxv6
    

    Note

    You must have an API key and secret to consume from the audit log topic.

  5. If you need to create a new API key and secret for your audit log cluster, run the following Confluent Cloud CLI command:

    ccloud api-key create --service-account 292163 --resource lkc-yokxv6
    

    Note

    Be sure to save the API key and secret. The secret is not retrievable later.

    Important

    There is a limit of 2 API keys per audit log cluster. If you need to delete an existing API key for your audit log cluster, run the following command:

    ccloud api-key delete <API-key>
    
  6. After creating your API key pair, copy the API key and paste it into the following command:

    ccloud api-key use <API-key> --resource lkc-yokxv6
    
  7. Consume audit log events from the audit log topic.

    ccloud kafka topic consume -b confluent-audit-log-events
    

    Refer to ccloud kafka topic consume for details about the flags you can use with this command.

New connections using a deleted API key are not allowed. You can rotate keys by creating a new key, configuring your clients to use the new key, and then deleting the old one, as sketched below.
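
A sketch of that rotation sequence follows, reusing the cluster and service account IDs returned by ccloud audit-log describe in the previous steps; the old and new key values are placeholders.

# Create a replacement API key for the audit log cluster.
ccloud api-key create --service-account 292163 --resource lkc-yokxv6
# Point this CLI session at the new key; update any other consumers' configuration as well.
ccloud api-key use <new-API-key> --resource lkc-yokxv6
# After every consumer has switched to the new key, retire the old one.
ccloud api-key delete <old-API-key>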

To watch authentication events “live” as they arrive, and to see which Kafka cluster API keys are being used to authenticate:

ccloud kafka topic consume confluent-audit-log-events > audit-events.json &

tail -f audit-events.json \
| grep 'kafka.Authentication' \
| jq .data.authenticationInfo.metadata.identifier \
> recently-used-api-keys.txt &

tail -f recently-used-api-keys.txt

Note

The jq processor is third-party software that is not included or installed in Confluent Cloud. If you wish to use it, you must download and install it yourself.

Consume with Java

  1. Sign in to Confluent Cloud at https://confluent.cloud.

  2. Navigate to ADMINISTRATION -> Audit log, using the Administration menu in the upper-right corner of the Confluent Cloud user interface.

  3. On the Audit log page, click the Consume with Java tab.

  4. Copy and paste the provided configuration into your client.

  5. If necessary, click Create Kafka cluster API key & secret to create an API key and secret for the audit log cluster.

  6. Start the Java client and connect to the audit log cluster. A minimal consumer sketch follows.
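
The following is a minimal sketch of such a client, not the exact code Confluent Cloud provides. It assumes the configuration copied in step 4 was saved locally as audit-log.properties; the file name and the consumer group ID are arbitrary choices you can change.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AuditLogConsumer {
    public static void main(String[] args) throws IOException {
        // Load the configuration copied from the Consume with Java tab,
        // saved locally as audit-log.properties (the file name is arbitrary).
        Properties props = new Properties();
        try (InputStream in = new FileInputStream("audit-log.properties")) {
            props.load(in);
        }
        // Consumer-specific settings the copied configuration may not include.
        props.put("group.id", "audit-log-reader");   // any unused consumer group name
        props.put("auto.offset.reset", "earliest");  // start from the oldest retained events
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // The audit log topic name is fixed: confluent-audit-log-events.
            consumer.subscribe(Collections.singletonList("confluent-audit-log-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Each record value is a JSON-encoded audit log event.
                    System.out.println(record.value());
                }
            }
        }
    }
}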

Consume with C/C++

  1. Sign in to Confluent Cloud at https://confluent.cloud.

  2. Navigate to ADMINISTRATION -> Audit log.

  3. On the Audit log page, click the Consume with C/C++ tab.

  4. Copy and paste the provided configuration into your client.

  5. If necessary, click Create Kafka cluster API key & secret to create an API key and secret for the audit log cluster.

  6. Start the C/C++ client and connect to the audit log cluster.

Securing Confluent Cloud audit logs

The Confluent Cloud audit log destination is a public cluster in AWS us-west-2 with encrypted storage and a default KMS-managed key rotation schedule of every three years. API keys specific to the audit log cluster are the only means of gaining read-only access.

Secure your organization’s audit logs by protecting the API keys used to read them.

As mentioned previously, Confluent Cloud supports up to two active audit log cluster API keys, and allows live key rotation.

To delete an existing API key for your audit log cluster, run the following command:

ccloud api-key delete <API-key>

Accessing audit log messages from behind a firewall

Like other Confluent Cloud public clusters, audit log clusters do not have a static IP address. To access your audit log messages, you must move your consumer outside the firewall. For details about configuring access to the Confluent Cloud web UI, see Configure Confluent Cloud UI access for AWS PrivateLink and Azure Private Link.