Access and Consume Audit Logs

Confluent Cloud audit logs are enabled by default in Standard, Enterprise, and Dedicated clusters. Basic clusters do not include audit logs.

Live audit log records are deleted after seven days, but you can retain audit logs by replicating them to another Kafka cluster or external system. For details, see Retain Audit Log Records.

Tip

You can also use Cluster Linking to sync your audit logs to your own Dedicated Confluent Cloud clusters, which enables you to consume audit logs with your existing consumer configurations or use audit logs with fully managed Connect and ksqlDB. For a step-by-step guide, see Use Cluster Linking to Manage Audit Logs.

Prerequisites

Kafka client
You can use any Kafka client (for example, the Confluent CLI, C/C++, or Java) to consume data from the Confluent Cloud audit log topic, as long as the client supports SASL authentication. Any prerequisites are therefore specific to the Kafka client you use. For details about configuring Confluent Cloud clients, see Kafka Client Quick Start for Confluent Cloud.
Consume configuration
The Confluent Cloud Console provides a configuration you can copy and paste into your Kafka client of choice. This configuration allows you to connect to the audit log cluster and consume from the audit log topic.
Cluster type
Standard, Enterprise, and Dedicated Kafka clusters support audit logs, which are enabled by default. Basic clusters do not include audit logs.
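The configuration you copy from the Confluent Cloud Console resolves to standard Kafka consumer settings. As an illustrative sketch, here is roughly what those settings look like expressed as a Python client configuration dictionary; the bootstrap server, group ID, and credential placeholders are assumptions, not values from your organization:

```python
# Sketch of the consumer settings the Confluent Cloud Console provides.
# All values below are illustrative placeholders; substitute the values
# from the configuration copied out of the Console.
audit_log_consumer_config = {
    "bootstrap.servers": "pkc-xxxxx.us-west-2.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",    # audit log clusters require SASL authentication
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API-key>",       # API key for the audit log cluster
    "sasl.password": "<API-secret>",
    "group.id": "audit-log-consumer",
    "auto.offset.reset": "earliest",    # read retained events from the start
}

print(audit_log_consumer_config["security.protocol"])  # prints SASL_SSL
```

The same key/value pairs apply regardless of client language; only the file format around them changes.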

Access the audit log user interface

To access audit log information in the Confluent Cloud Console, click the Administration menu. If your role grants you permission to manage your organization, and your organization has at least one Standard, Enterprise, or Dedicated Kafka cluster, click Audit log to access your audit log cluster information. Enablement is automatic and typically takes place within five minutes of successfully provisioning your first Standard, Enterprise, or Dedicated cluster.

Consume with Confluent CLI

Access to Confluent Cloud audit logs requires the OrganizationAdmin role.

  1. Open a terminal and log in to your Confluent Cloud organization.

    confluent login
    
  2. Run the confluent audit-log describe command to identify which resources to use.

    The following example shows the audit log information provided for a sample cluster.

    confluent audit-log describe
     +-----------------+----------------------------+
     | Cluster         | lkc-yokxv6                 |
     | Environment     | env-abc123                 |
     | Service Account | sa-ymnkzp                  |
     | Topic Name      | confluent-audit-log-events |
     +-----------------+----------------------------+
    

    Note

    The topic confluent-audit-log-events cannot be viewed in the Confluent Cloud Console, but you can consume it using the Confluent CLI or any Kafka client.

  3. Specify the environment and cluster to use by running the confluent environment use and confluent kafka cluster use commands. The environment and cluster names are available from the data retrieved in the previous step.

    confluent environment use env-abc123
    confluent kafka cluster use lkc-yokxv6
    
  4. If you have an existing API key and secret for your audit log cluster, you can store it locally using the confluent api-key store command.

    confluent api-key store <API-key> --resource lkc-yokxv6
    

    To view the existing API keys for your audit log cluster, run the confluent api-key list command using the --resource option.

    confluent api-key list --resource lkc-yokxv6
    

    Note

    To consume data from the audit log topic, you must have an API key and secret.

  5. If you need to create a new API key and secret for your audit log cluster, run the following confluent api-key create command with the --service-account and --resource options.

    confluent api-key create --service-account sa-ymnkzp --resource lkc-yokxv6
    

    Note

    Be sure to save the API key and secret. The secret is not retrievable later.

    Important

    API keys for the audit log cluster must be created using the predefined service account. To get the identifiers for the predefined service account and the audit log cluster, run the Confluent CLI confluent audit-log describe command.

    There is a limit of two API keys per audit log cluster. If you need to delete an existing API key for your audit log cluster, run the following command:

    confluent api-key delete <API-key>
    
  6. After creating your API key and secret, copy the API key and paste it into the following command:

    confluent api-key use <API-key> --resource lkc-yokxv6
    
  7. Consume audit log event messages from the audit log topic.

    You can use the Confluent CLI to consume audit log events from the audit log cluster by running the following confluent kafka topic consume command:

    confluent kafka topic consume -b confluent-audit-log-events
    

    For details about the options you can use with this command, see confluent kafka topic consume.

New connections using a deleted API key are not allowed. You can rotate keys by creating a new key, configuring your clients to use the new key, and then deleting the old one.

To watch authentication events live as they arrive, and see which Kafka cluster API keys are being used to authenticate:

confluent kafka topic consume confluent-audit-log-events > audit-events.json &

tail -f audit-events.json \
| grep 'kafka.Authentication' \
| jq .data.authenticationInfo.metadata.identifier \
> recently-used-api-keys.txt &

tail -f recently-used-api-keys.txt

Note

The jq command-line JSON processor is third-party software that is not included or installed in Confluent Cloud. If you wish to use it, you must download and install it yourself.
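If you prefer not to install jq, the same extraction can be sketched in a few lines of Python. The snippet below assumes events arrive one JSON object per line, as in the pipeline above; the sample record is a simplified illustration, not a complete Confluent Cloud payload, and the methodName field value is an assumption inferred from the grep filter above:

```python
import json

# Illustrative sketch: collect the API keys seen in Kafka authentication
# events, following the jq path .data.authenticationInfo.metadata.identifier
# used above. The sample line below is a simplified, hypothetical record.
sample_lines = [
    '{"data": {"methodName": "kafka.Authentication", '
    '"authenticationInfo": {"metadata": {"identifier": "ABCDEF123456"}}}}',
]

def recently_used_api_keys(lines):
    keys = []
    for line in lines:
        event = json.loads(line)
        data = event.get("data", {})
        # Keep only authentication events, mirroring grep 'kafka.Authentication'
        if "Authentication" in data.get("methodName", ""):
            identifier = (
                data.get("authenticationInfo", {})
                    .get("metadata", {})
                    .get("identifier")
            )
            if identifier:
                keys.append(identifier)
    return keys

print(recently_used_api_keys(sample_lines))  # prints ['ABCDEF123456']
```

In practice you would feed the lines of audit-events.json (from the pipeline above) into recently_used_api_keys instead of the sample list.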

Consume with Java

Access to Confluent Cloud audit logs requires the OrganizationAdmin role.

  1. Sign in to Confluent Cloud Console at https://confluent.cloud.

  2. Go to ADMINISTRATION -> Audit log in the top-right Administration menu in the Confluent Cloud Console.

  3. On the Audit log page, click the Consume with Java tab.

  4. Copy and paste the provided configuration into your client.

  5. If necessary, click Create Kafka cluster API key & secret to create a key/secret pair for your Kafka cluster.

  6. Start and connect to the Java client.

Consume with C/C++

  1. Sign in to Confluent Cloud Console at https://confluent.cloud.

  2. Go to ADMINISTRATION -> Audit log.

  3. On the Audit log page, click the Consume with C/C++ tab.

  4. Copy and paste the provided configuration into your client.

  5. If necessary, click Create Kafka cluster API key & secret to create an API key and secret for your Kafka cluster.

  6. Start and connect to the C/C++ client.

Securing Confluent Cloud audit logs

The Confluent Cloud audit log destination cluster is a public cluster located in AWS us-west-2 with encrypted storage and a default KMS-managed key rotation schedule of every three years. Access is read-only and is granted only through API keys specific to the audit log cluster.

Secure your organization’s audit logs by protecting the API keys used to read them.

As mentioned previously, Confluent Cloud supports up to two active audit log cluster API keys, and allows live key rotation. For API key best practices, see Best Practices for Using API Keys in Confluent Cloud.

Accessing audit log messages from behind a firewall

Like other Confluent Cloud public clusters, audit log clusters do not have a static IP address. To access your audit log messages, you must move your consumer outside the firewall. For details about configuring access to the Confluent Cloud Console, see Use Confluent Cloud with Private Networking.