Service Accounts

All Confluent Cloud connectors require credentials to operate and to access Kafka. You can either create and use a Kafka API key and secret or use a service account.

When you create a service account, you configure access control list (ACL) DESCRIBE, CREATE, READ, and WRITE access to topics and create the API key and secret. Once the service account is created, a user creating a connector can select the service account ID when configuring the connector.

Important

A connector configuration must include either an API key and secret or a service account ID. For additional Confluent Cloud service account information, see Service Accounts.
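
A connector configuration carries these credentials in its properties. The following sketch writes a minimal sink configuration that authenticates with a service account ID; the connector class, connector name, and the sa-abc123 ID are placeholder values, so check your connector's documentation for the exact fields it expects.

```shell
# Write a minimal connector configuration that authenticates with a
# service account ID instead of an API key and secret.
# "S3_SINK", "my-s3-sink", and "sa-abc123" are placeholder values.
cat > sink-connector.json <<'EOF'
{
  "connector.class": "S3_SINK",
  "name": "my-s3-sink",
  "topics": "pageviews",
  "kafka.auth.mode": "SERVICE_ACCOUNT",
  "kafka.service.account.id": "sa-abc123"
}
EOF
```

With an API key instead, the configuration would set "kafka.auth.mode": "KAFKA_API_KEY" along with the key and secret properties.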

Example: Configuring a service account

The following examples show how to set up a service account using the Confluent CLI. These steps apply to a cluster running on any cloud provider.

Sink connector service account

This example assumes the following:

  • You have a Kafka cluster with cluster ID lkc-gqgvx.
  • You want the sink connector to read from a topic named pageviews.

Use the following example steps to create a service account, set ACLs, and add the API key and secret.

Note

The following steps:

  • show basic ACL entries for sink connector service accounts. Be sure to review the Sink connector success and error topics section for additional ACL entries that may be required for certain connectors.
  • show Confluent CLI version 2 example commands. For more information, see Confluent CLI v2.

  1. Create a service account named myserviceaccount:

    confluent iam service-account create myserviceaccount --description "test service account"
    
  2. Find the service account ID for myserviceaccount:

    confluent iam service-account list
    
  3. Set a DESCRIBE ACL to the cluster:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "DESCRIBE" --cluster-scope
    
  4. Set a READ ACL to pageviews:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "READ" --topic "pageviews"
    
  5. Set a CREATE ACL to the following topic prefix:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "dlq-lcc"
    
  6. Set a WRITE ACL to the following topic prefix:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "dlq-lcc"
    
  7. Set a READ ACL to a consumer group with the following prefix:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "READ" --prefix --consumer-group "connect-lcc"
    
  8. Create a Kafka API key and secret for <service-account-id>:

    confluent api-key create --resource "lkc-gqgvx" --service-account "<service-account-id>"
    
  9. Save the API key and secret.

The connector configuration must include either an API key and secret or a service account ID. For additional service account information, see Service Accounts.
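
The sink-connector steps above can be collected into one script. This is a sketch, not an official tool: sa-abc123 is a placeholder service account ID, and while DRY_RUN is set the script only echoes each command instead of running it.

```shell
# Sketch: the sink-connector ACL setup as a single script.
# SA is a placeholder service account ID; unset DRY_RUN to actually
# run the confluent CLI commands.
SA="sa-abc123"
DRY_RUN=1

# Echo the command when DRY_RUN is set, otherwise execute it.
run() { ${DRY_RUN:+echo} confluent kafka acl create --allow --service-account "$SA" "$@"; }

run --operation "DESCRIBE" --cluster-scope
run --operation "READ" --topic "pageviews"
run --operation "CREATE" --prefix --topic "dlq-lcc"
run --operation "WRITE" --prefix --topic "dlq-lcc"
run --operation "READ" --prefix --consumer-group "connect-lcc"
```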

Source connector service account

This example assumes the following:

  • You have a Kafka cluster with cluster ID lkc-gqgvx.
  • You want the source connector to write to a topic named passengers.

Use the following example steps to create a service account, set ACLs, and add the API key and secret.

Note

The following steps show basic ACL entries for source connector service accounts. Be sure to review the Debezium Source Connectors section and the JDBC-based Source Connectors and the MongoDB Atlas Source Connector section for additional ACL entries that may be required for certain connectors.

  1. Create a service account named myserviceaccount:

    confluent iam service-account create myserviceaccount --description "test service account"
    
  2. Find the service account ID for myserviceaccount:

    confluent iam service-account list
    
  3. Set a DESCRIBE ACL to the cluster:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "DESCRIBE" --cluster-scope
    
  4. Set a WRITE ACL to passengers:

    confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --topic "passengers"
    
  5. Create a Kafka API key and secret for <service-account-id>:

    confluent api-key create --resource "lkc-gqgvx" --service-account "<service-account-id>"
    
  6. Save the API key and secret.

The connector configuration must include either an API key and secret or a service account ID. For additional service account information, see Service Accounts.

Additional ACL entries

Certain connectors require additional ACL entries.

Debezium Source Connectors

The Source connector service account section provides basic ACL entries for source connector service accounts. Debezium Source connectors require additional ACL entries. Add the following ACL entries for Debezium Source connectors:

  • ACLs to create and write to table-related topics prefixed with <database.server.name>. Use the following commands to set these ACLs:

     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --operation "CREATE" --prefix --topic "<database.server.name>"
    
     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --operation "WRITE" --prefix --topic "<database.server.name>"
    
  • ACLs to describe configurations at the cluster scope level. Use the following commands to set these ACLs:

     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --cluster-scope --operation "DESCRIBE"
    
     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --cluster-scope --operation "DESCRIBE-CONFIGS"
    

The Debezium MySQL CDC Source and the Debezium Microsoft SQL Source connectors require the following additional ACL entries:

  • ACLs to create and write to database history topics prefixed with dbhistory.<database.server.name>.lcc-. For example, if the configuration property is "database.server.name": "cdc", the prefix is dbhistory.cdc.lcc-. Use the following commands to set these ACLs:

     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --operation "CREATE" --prefix --topic "dbhistory.<database.server.name>.lcc-"
    
     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --operation "WRITE" --prefix --topic "dbhistory.<database.server.name>.lcc-"
    
  • ACLs to read the database history consumer group named <database.server.name>-dbhistory. For example, if the configuration property is "database.server.name": "cdc", the consumer group is named cdc-dbhistory. Use the following command to set this ACL:

     confluent kafka acl create --allow --service-account "<service-account-id>" \
    --operation "READ" --consumer-group "<database.server.name>-dbhistory"
    
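The derived names in the steps above follow from simple string substitution. The sketch below uses the example server name cdc from the configuration property shown earlier:

```shell
# How the Debezium history topic prefix and consumer group name are
# derived from database.server.name (example value "cdc").
database_server_name="cdc"

history_topic_prefix="dbhistory.${database_server_name}.lcc-"
history_consumer_group="${database_server_name}-dbhistory"

echo "$history_topic_prefix"    # dbhistory.cdc.lcc-
echo "$history_consumer_group"  # cdc-dbhistory
```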

JDBC-based Source Connectors and the MongoDB Atlas Source Connector

The Source connector service account section provides basic ACL entries for source connector service accounts. Several source connectors allow a topic prefix; when a prefix is used, these connectors require additional ACL entries.

Add the following ACL entries:

confluent kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "<topic.prefix>"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "<topic.prefix>"
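
To see why a prefix ACL is needed rather than a single-topic ACL: the connector derives one topic name per source table by prepending the prefix. A sketch with hypothetical values:

```shell
# With a topic prefix, the connector writes one topic per source table,
# so a single-topic ACL cannot cover them all.
# "mysql-01-" and the table names are hypothetical values.
topic_prefix="mysql-01-"
for table in customers orders; do
  echo "${topic_prefix}${table}"
done
# Prints mysql-01-customers, then mysql-01-orders
```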

Oracle CDC Source connector

To access redo log topics, you must grant the connector the corresponding operation (CREATE, READ, or WRITE) in an ACL. The default redo log topic for the Oracle CDC Source connector is ${connectorName}-${databaseName}-redo-log. Because the connector name in Confluent Cloud is its lcc- ID, the topic name begins with the lcc- prefix.
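
As a sketch, the default topic name resolves as follows; the connector and database names here are placeholder values:

```shell
# Default Oracle CDC redo log topic name. In Confluent Cloud the
# connector name is its lcc- ID, so the topic begins with lcc-.
# "lcc-abc123" and "ORCL" are placeholder values.
connector_name="lcc-abc123"
database_name="ORCL"
redo_log_topic="${connector_name}-${database_name}-redo-log"
echo "$redo_log_topic"   # lcc-abc123-ORCL-redo-log
```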

Add the following ACL entries:

confluent kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "lcc-"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "READ" --prefix --topic "lcc-"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "lcc-"

If you set the following configuration properties, you need to set ACLs for the resulting output topics:

  • table.topic.name.template for table-specific topics.
  • lob.topic.name.template for LOB objects.
  • redo.log-corruption.topic for corrupted redo log records.

For these output topics, you must grant the connector either CREATE or WRITE. When granted READ, WRITE, or DELETE, the connector implicitly derives the DESCRIBE operation.

Sink connector success and error topics

The Sink connector service account section provides basic ACL entries for sink connector service accounts. Several sink connectors create additional success-lcc and error-lcc topics when the connector launches; these connectors require additional ACL entries.

Add the following ACL entries for these sink connectors:

confluent kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "success-lcc"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "success-lcc"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "error-lcc"
confluent kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "error-lcc"
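
The prefix ACLs above match the per-connector topic names, which include the connector's lcc- ID. A sketch with a placeholder ID:

```shell
# Success and error topic names include the connector's lcc- ID,
# which the success-lcc and error-lcc prefix ACLs cover.
# "lcc-abc123" is a placeholder connector ID.
connector_id="lcc-abc123"
echo "success-${connector_id}"   # success-lcc-abc123
echo "error-${connector_id}"     # error-lcc-abc123
```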