Connector Service Accounts

All Confluent Cloud connectors require credentials to operate and to access resources in Confluent Cloud. You can use either a Confluent Cloud API key and secret or a service account API key and secret. This section provides the steps to create a service account and its API key and secret.

You create and manage Confluent Cloud service accounts using the Confluent Cloud CLI. See Service Accounts for detailed information about Confluent Cloud service accounts.

Note

The Confluent Cloud service account is separate from any cloud provider service account that your connector may require to access cloud provider resources. For example, a Confluent Cloud sink connector that sends data to a GCS bucket requires both a Confluent Cloud service account and a GCP service account with access to the GCS bucket.
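
To illustrate the GCS case, granting such a GCP service account access to the bucket might look like the following sketch. The service account email, project, bucket name, and role are placeholders; the exact role a connector needs is listed in that connector's documentation.

    gsutil iam ch serviceAccount:my-connector-sa@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin gs://my-gcs-bucket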

Example: Configuring a service account for the Amazon S3 Sink Connector

The following example shows how to set up a service account for the Amazon S3 Sink Connector for Confluent Cloud. The example assumes the following:

  • You have a Kafka cluster with cluster ID lkc-gqgvx in the AWS us-west-2 region.
  • You want an S3 sink connector to read from a topic named pageviews.
  • You want the connector to send the pageviews data to an S3 bucket named pageviews-avro in the AWS us-west-2 region.

Note

The S3 sink connector automatically creates a dead letter queue (DLQ) topic, named with the dlq-lcc prefix used in the steps below. The consumer group for the connector should have read access to the DLQ topic.

Use the following CLI commands to create and configure the service account for this example.

  1. Create a service account named s3Pageviews:

    ccloud service-account create s3Pageviews --description "This is a demo service account."
    
  2. Find the service account ID for s3Pageviews. The ID that is returned is used in place of s3PageviewsId in the commands that follow:

    ccloud service-account list
    
  3. Set a READ ACL on the pageviews topic:

    ccloud kafka acl create --allow --service-account-id "s3PageviewsId" --operation "READ" --topic "pageviews"
    
  4. Set a CREATE ACL on the DLQ topic using the following prefix:

    ccloud kafka acl create --allow --service-account-id "s3PageviewsId" --operation "CREATE" --prefix --topic "dlq-lcc"
    
  5. Set a WRITE ACL on the DLQ topic using the following prefix:

    ccloud kafka acl create --allow --service-account-id "s3PageviewsId" --operation "WRITE" --prefix --topic "dlq-lcc"
    
  6. Set a READ ACL on the connector's consumer group using the following prefix:

    ccloud kafka acl create --allow --service-account-id "s3PageviewsId" --operation "READ" --prefix --consumer-group "connect-lcc"
    
  7. Create a Kafka API key and secret for s3Pageviews:

    ccloud api-key create --cluster "lkc-gqgvx" --service-account-id "s3PageviewsId"
    
  8. Save the API key and secret. You need these values to configure your client applications, and this is the only time you can retrieve them.

    Client applications that connect to the cluster must have at least the following three parameters configured:

    • API key: available when you create the API key/secret pair the first time
    • API secret: available when you create the API key/secret pair the first time
    • bootstrap.servers: set to the Endpoint in the output of ccloud kafka cluster describe
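
    As an illustration only, a Kafka client configured with these values might use properties along the following lines (librdkafka-style property names are shown; Java clients use sasl.mechanism and sasl.jaas.config instead). The endpoint, key, and secret shown are placeholders, not real values:

      # Placeholder values for illustration only
      bootstrap.servers=pkc-xxxxx.us-west-2.aws.confluent.cloud:9092
      security.protocol=SASL_SSL
      sasl.mechanisms=PLAIN
      sasl.username=<API key>
      sasl.password=<API secret>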