Service Accounts for Confluent Cloud¶
Each service account represents an application programmatically accessing Confluent Cloud.
You can manage application access to Confluent Cloud by using service accounts. Permissions can be specified using ACLs and role bindings tied to a specific service account. ACLs and role bindings for service accounts are set by an administrator or another user with a similar role within the organization.
Service accounts are an organization-level resource, and there is a limit on the number of service accounts in an organization.
Service accounts span the entire organization, so they can own API keys for different resources: dev and prod clusters, and/or Kafka and Schema Registry. A typical use case has one team administering the Confluent Cloud streaming platform and issuing service accounts (with ACLs applied) to the various application teams that use it. While service accounts cannot log in to the Confluent Cloud Console, they can own any type of API key, which can be used for CLI or API access. Because users can leave or change roles within a company while applications continue to operate independently, service accounts are especially useful in organizations that require application (or service) identifiers that are not tied to a specific user.
Note
The number of API keys that can belong to a specific service account is limited. For details, see Service Quotas for Confluent Cloud.
You can create and manage Confluent Cloud service accounts using any of the following methods:
- Confluent Cloud Console
- The Confluent CLI command confluent iam service-account create
- REST API
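For the REST API option, the following is a minimal sketch of creating a service account with the IAM v2 Service Accounts endpoint. The account name, description, and the Cloud API key placeholders are illustrative; check the Confluent Cloud API reference for the authoritative request shape.

# Sketch: create a service account via the REST API.
# <CLOUD_API_KEY>:<CLOUD_API_SECRET> is a Confluent Cloud API key used as HTTP basic auth.
curl -sS -u "<CLOUD_API_KEY>:<CLOUD_API_SECRET>" \
  -H "Content-Type: application/json" \
  -d '{"display_name": "analytics", "description": "Service account for the analytics pipeline"}' \
  https://api.confluent.cloud/iam/v2/service-accounts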
Note that:
You must be assigned the OrganizationAdmin role to create a service account. For details, refer to Confluent Cloud RBAC roles.
When you create service accounts using the Confluent CLI, you can optionally create any type of API key and make the service account its owner.
Note
Kafka ACLs only apply to Kafka and are not supported for other resource types. Hence, ACLs that exist for a service account are not inherited when you create a Schema Registry API key from that service account.
When you create service accounts using the Confluent Cloud Console, you can create Kafka and Confluent Cloud API keys and make the service account their owner.
You cannot use the Cloud Console to create Schema Registry or ksqlDB API keys owned by a service account.
| API key type | Managed in Confluent Cloud Console? | Managed in Confluent CLI? |
|---|---|---|
| Kafka | Yes | Yes |
| Schema Registry | No | Yes |
| ksqlDB | No | Yes |
| Confluent Cloud | Yes | Yes |
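For example, to create a Schema Registry API key owned by a service account, use the CLI and pass the Schema Registry cluster ID (the lsrc-… value) as the resource. The IDs below are placeholders:

confluent schema-registry cluster describe
confluent api-key create --service-account sa-1a2b3c --resource lsrc-a1b2c3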
Caution
When you delete a user account or service account, all associated API keys will also be deleted. Any client applications using a deleted API key will lose access, which may cause an outage for your streaming application. Always confirm that none of the API keys owned by an account are in active use before deleting a user or service account.
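For example, you can list the API keys owned by an account before deleting it; the service account ID shown here is a placeholder:

confluent api-key list --service-account sa-1a2b3c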
Create a service account using the Confluent Cloud Console¶
You can use the Cloud Console to create service accounts. A service account lets you control access to resources (in this case, the Kafka cluster) and is recommended for production use. Note that this workflow also includes creating and pairing an API key.
To create a service account and API key using the Confluent Cloud Console:
In the Confluent Cloud Console, go to the cluster for which you want to create the API key, click Cluster Overview, and then click API keys. The API keys page appears.
Click + Add key. The Create key page appears.
Under Select scope for API key, select Granular access, and then click Next.
Click Create a new one to create an API key that will be owned by a new service account. Enter the new service account name and a description, then click Save and Next.
If you are creating an API key that will be owned by an existing service account, click Use existing account instead. Select the existing service account and proceed.
Add ACLs to the service account for clusters, consumer groups, topics, and transactional IDs.
Cluster ACLs
For cluster ACLs, specify the cluster name and select the operation and corresponding permissions. To add additional operations and permissions, click +Add ACLs.
Consumer group, Topic, and Transactional ID ACLs
Select the consumer group ID, topic name, or transactional ID to which the ACL applies. If you want the ACL to apply to a specific pattern type (for example, LITERAL or PREFIXED), specify a new pattern or select from existing ones. Specify the operation and its corresponding permissions. To add additional operations and permissions, click +Add ACLs.
Important
You can specify an ACL for a consumer group, topic, or transactional ID even if that resource does not yet exist. For example, if you create an ACL for a consumer group that does not exist, you can subsequently create that consumer group and configure it to use the ACLs you create here.
Click Save and Next.
The API key and secret are generated for the service account. You will need this API key and secret to connect to the cluster.
Click Save. The new service account with the API key and associated ACLs is created. When you return to the API access tab, you can search for the newly-created API key to confirm.
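If you also use the Confluent CLI, you can confirm the new service account and its ACLs from the command line; the service account ID below is a placeholder:

confluent iam service-account list
confluent kafka acl list --service-account sa-1a2b3c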
Create a service account using the ksqlDB Console¶
When you create a new ksqlDB application, you can provision a new service account.
In the navigation bar, click ksqlDB. On the All ksqlDB Applications page, click Add application.
On the New application page, click Granular access to enable the Choose service account option.
Click Create a new one and enter the name of the new service account.
Click Add all required ACLs when your ksqlDB app is created. To review the ACLs that apply to the new application, click View required ACLs.
Click Continue to configure provisioning for the new ksqlDB application.
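Because ksqlDB API keys cannot be created in the Cloud Console (see the table above), one way to create a ksqlDB API key owned by the service account is from the CLI, passing the ksqlDB cluster ID as the resource. The IDs below are placeholders:

confluent ksql cluster list
confluent api-key create --service-account sa-1a2b3c --resource lksqlc-ab12c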
Create and manage service accounts using the Confluent CLI¶
The following example shows a typical end-to-end workflow that works for Confluent Cloud running on any cloud provider. Specifically, the example shows how to use the Confluent CLI to:
- Create a Kafka cluster and make it active
- Add topics in the cluster
- Set up a service account and ACLs
- Create a Kafka API key and secret
Create a Confluent Cloud Kafka cluster (sales092020):

confluent kafka cluster create sales092020 --cloud aws --region us-west-2 --type basic

It may take up to 5 minutes for the Kafka cluster to be ready.
+--------------+----------------------------------------------------------+
| ID           | lkc-abc123                                               |
| Name         | sales092020                                              |
| Type         | BASIC                                                    |
| Ingress      | 100                                                      |
| Egress       | 100                                                      |
| Storage      | 5000                                                     |
| Provider     | aws                                                      |
| Availability | single-zone                                              |
| Region       | us-west-2                                                |
| Status       | UP                                                       |
| Endpoint     | SASL_SSL://pkc-v8wpn.us-west-2.aws.confluent.cloud:9092  |
| ApiEndpoint  | https://pkac-95yx5.us-west-2.aws.confluent.cloud         |
+--------------+----------------------------------------------------------+
Note
Make note of your cluster ID. You will need to specify it in subsequent steps. If at any time you are unsure of the ID, run the confluent kafka cluster list command to view all your Kafka clusters and corresponding cluster IDs.

Make the newly-created Kafka cluster the active cluster:

confluent kafka cluster use lkc-abc123

Set Kafka cluster "lkc-abc123" as the active cluster for environment "env-123ab"
Create topics (raw_pageview_data and analytics_enriched_events) in the Kafka cluster:

confluent kafka topic create raw_pageview_data
confluent kafka topic create analytics_enriched_events
Create a service account named analytics. You must include a description:

confluent iam service-account create analytics --description "My API analytics and secrets service account."

+-------------+---------------------------------------+
| ID          | sa-1a2b3c                             |
| Name        | analytics                             |
| Description | My API analytics and secrets service  |
|             | account.                              |
+-------------+---------------------------------------+
Tip
Name requirements:
- 64-character maximum
- Allowed characters:
  - Unicode classes of letter/mark/number
  - Special characters: .,'’&_+|[]/-()
  - Whitespace
If you ever lose track of the service account ID, run confluent iam service-account list to retrieve it.

Create a READ ACL for the topic raw_pageview_data:

confluent kafka acl create --allow --service-account sa-1a2b3c --operations read --topic raw_pageview_data

  Principal       | Permission | Operation | ResourceType | ResourceName      | PatternType
------------------+------------+-----------+--------------+-------------------+-------------
  User:sa-1a2b3c  | ALLOW      | READ      | TOPIC        | raw_pageview_data | LITERAL
Optionally, you can create ACLs using the --prefix option, which Kafka uses to match all resource names that are prefixed with the specified value. This example shows how to create a READ ACL that applies to all consumer groups that use the prefix keyreaders:

confluent kafka acl create --allow --service-account sa-1a2b3c --operations read --prefix --consumer-group keyreaders

  Principal       | Permission | Operation | ResourceType | ResourceName | PatternType
------------------+------------+-----------+--------------+--------------+-------------
  User:sa-1a2b3c  | ALLOW      | READ      | GROUP        | keyreaders   | PREFIXED
Create ACLs for all topics that use a specific prefix. This example shows how to specify a CREATE ACL for topics with the prefix analytics_. Running this command grants CREATE access to any topic whose name starts with analytics_ (the WRITE ACL in the next step completes the access needed to produce to those topics):

confluent kafka acl create --allow --service-account sa-1a2b3c --operations create --prefix --topic analytics_

  Principal       | Permission | Operation | ResourceType | ResourceName | PatternType
------------------+------------+-----------+--------------+--------------+-------------
  User:sa-1a2b3c  | ALLOW      | CREATE    | TOPIC        | analytics_   | PREFIXED
Create a WRITE ACL for the same analytics_ prefix, which covers the analytics_enriched_events topic:

confluent kafka acl create --allow --service-account sa-1a2b3c --operations write --prefix --topic analytics_

  Principal       | Permission | Operation | ResourceType | ResourceName | PatternType
------------------+------------+-----------+--------------+--------------+-------------
  User:sa-1a2b3c  | ALLOW      | WRITE     | TOPIC        | analytics_   | PREFIXED
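At this point you can optionally verify the ACLs attached to the service account:

confluent kafka acl list --service-account sa-1a2b3c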
Create a Kafka API key and secret for service account sa-1a2b3c. Be sure to replace the service account ID and Kafka cluster ID values shown here with your own:

confluent api-key create --service-account sa-1a2b3c --resource lkc-abc123

It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+---------+------------------------------------------------------------------+
| API Key | 12A3BCDEFGHI4JKL                                                 |
| Secret  | aB+c12dEfghiJkLMNopqr3StUVWxyzabCdEFGHiJ4kL5mnop6QrS78TUVwxyzaB9 |
+---------+------------------------------------------------------------------+
Warning
Save the API key and secret. You require this information to configure your client applications. Be aware that this is the only time you can access and view the key and secret.
Optionally, if you are using the Confluent Cloud Metrics API or Health+ and you require a Confluent Cloud API key:

confluent api-key create --service-account sa-1a2b3c --resource cloud

It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+---------+------------------------------------------------------------------+
| API Key | AB1CDEF2GHI3J4KL                                                 |
| Secret  | j3Am6e+loCkCJUQ43iq9Es1z5KO7kKZQGmBvjg7jombv1PR0kxCvjsh6IDrz9LHY |
+---------+------------------------------------------------------------------+
Note that ACLs are not supported against Confluent Cloud API keys.
Important
Client applications that connect to the Confluent Cloud cluster must have at least the following three parameters configured:
- API key – available when you create the API key/secret pair the first time
- API secret – available when you create the API key/secret pair the first time
- bootstrap.servers – set to the Endpoint value in the output of confluent kafka cluster describe
For details about Confluent CLI service account commands, see Confluent CLI.
Use Confluent Cloud service accounts to produce and consume¶
After creating a service account, you can use it to control how applications produce to and consume from topics in Confluent Cloud:
# Produce to topic
confluent kafka topic produce <topic-service-account-writes-to>
# Consume from topic (-b consumes from beginning of topic)
confluent kafka topic consume -b <topic-service-account-reads-from>
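For the produce and consume commands above to authenticate as the service account, the CLI must use an API key owned by that account for the active cluster. A typical sequence looks like the following sketch; the key, secret, and cluster ID are the placeholder values from the earlier steps, and depending on your CLI version the use subcommand may or may not accept a --resource flag.

# Store the secret locally and make the key active for the cluster
confluent api-key store 12A3BCDEFGHI4JKL <API_SECRET> --resource lkc-abc123
confluent api-key use 12A3BCDEFGHI4JKL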