Get Started with Confluent CLI¶
The Confluent command-line interface (CLI), confluent, enables developers to manage both Confluent Cloud and Confluent Platform and is source-available under the Confluent Community License (for more details, check out the Announcing the Source Available Confluent CLI blog post). The Confluent CLI is feature-packed to help users go from learning Confluent to building automated workflows.
Prerequisites¶
Before you proceed with the Confluent CLI Quick start, review the following requirements.
Operating systems¶
The Confluent CLI is compatible with the following operating systems and architectures only:
- macOS with 64-bit Intel chips (Darwin AMD64)
- macOS with Apple chips (Darwin ARM64)
- Windows with 64-bit Intel or AMD chips (Microsoft Windows AMD64)
- Linux with 64-bit Intel or AMD chips (Linux AMD64)
- Linux with 64-bit ARM chips (Linux ARM64)
- Alpine with 64-bit Intel or AMD chips (Alpine AMD64)
- Alpine with 64-bit ARM chips (Alpine ARM64)
Note that on non-Alpine Linux systems, glibc is dynamically linked when the Confluent CLI executes. On all other systems, the dependencies are statically linked.
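If you want to confirm how the binary on your Linux system is linked, one optional check (not part of the official steps) is to inspect it with ldd after installation; a statically linked binary reports "not a dynamic executable":
ldd $(which confluent)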
Confluent Platform versions¶
For the compatible Confluent Platform versions for this version of Confluent CLI, see the compatibility table.
The Confluent CLI for Confluent Platform requires that you have the Confluent REST Proxy server for Apache Kafka running. The Confluent REST Proxy server uses the REST APIs to mediate between the Confluent CLI and your clusters.
This is not required for the Apache Kafka® tools, or "scripts", that come with Kafka and ship with Confluent Platform. These alternatives to the Confluent CLI for Confluent Platform do not require the Confluent REST Proxy service to be running. Therefore, the Confluent Platform tutorials in the documentation sometimes feature Kafka scripts rather than Confluent CLI commands to simplify setup for getting-started tasks. For example, the basic Cluster Linking tutorial for Confluent Platform that describes how to Share data across topics uses the Kafka scripts throughout (such as kafka-cluster-links --list to list mirror topics, rather than confluent kafka topic list or confluent kafka link list). In such scenarios, the Confluent CLI commands would fail if the REST Proxy server were not running. (This is not an issue for the Confluent CLI on Confluent Cloud, which is fully managed and integrates with the Confluent Cloud APIs under the hood.)
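For example, assuming a broker reachable at localhost:9092 (a placeholder address used only for illustration), the Kafka script talks to the brokers directly and needs no REST Proxy:
kafka-cluster-links --list --bootstrap-server localhost:9092
The equivalent confluent kafka link list command, by contrast, is served through the Confluent REST Proxy endpoint.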
Network access¶
When the Confluent CLI interacts with Confluent Cloud, it requires network access to the following domains:
- confluent.cloud
- .login.confluent.io when using SSO.
- api.stripe.com when using the confluent billing payment commands.
- s3-us-west-2.amazonaws.com/confluent.cloud when the update check is enabled.
To minimize access to these domains, you can:
- Disable update checks.
- Avoid using the confluent billing payment commands.
Quick start¶
To get started, install the latest version of the Confluent CLI by completing the following steps. For more installation options, see the Install Confluent CLI page.
Download and install the latest version in the default directory, ./bin:
curl -sL --http1.1 https://cnfl.io/cli | sh -s -- latest
Add the ./bin directory to your $PATH:
export PATH=$(pwd)/bin:$PATH
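To confirm that the installation succeeded and that the binary is on your $PATH, you can print the installed version:
confluent version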
Sign up for a free Confluent Cloud account by entering the following command in your terminal:
confluent cloud-signup
You should be redirected to the free Confluent Cloud account sign up page.
After you have signed up for a free account, start the confluent interactive shell, which provides autocompletion, by entering the following command in your terminal:
confluent shell
Using the confluent interactive shell, enter the following command to log in to your Confluent Cloud account:
login
If your credentials are not saved locally, you must enter your credentials as shown in the following output:
Enter your Confluent Cloud credentials:
Email:
Password:
Note
- If you signed up for a free Confluent Cloud account using your GitHub or Google credentials, you must provide your GitHub or Google username and password to sign in.
- Add the --save flag if you want to save your credentials locally. This prevents you from having to enter them again in the future.
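For example, if you prefer to log in from your regular terminal rather than the interactive shell, you can log in and save your credentials in one step (the --save flag behaves the same way in both cases):
confluent login --save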
Create your first Kafka cluster:
kafka cluster create <name> --cloud <cloud provider> --region <cloud region>
For example:
kafka cluster create dev0 --cloud aws --region us-east-1
You should see output similar to the following:
It may take up to 5 minutes for the Kafka cluster to be ready.
+-----------------------+----------------------------------------------------------+
| Current               | false                                                    |
| ID                    | lkc-dfgrt7                                               |
| Name                  | dev0                                                     |
| Type                  | BASIC                                                    |
| Ingress Limit (MB/s)  | 250                                                      |
| Egress Limit (MB/s)   | 750                                                      |
| Storage               | 5 TB                                                     |
| Provider              | aws                                                      |
| Region                | us-east-1                                                |
| Availability          | single-zone                                              |
| Status                | PROVISIONING                                             |
| Endpoint              | SASL_SSL://xxx-xxxx.us-east-1.aws.confluent.cloud:1234   |
| REST Endpoint         | https://yyy-y11yy.us-east-1.aws.confluent.cloud:345      |
+-----------------------+----------------------------------------------------------+
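While the cluster is provisioning, you can check its status by describing it, and optionally set it as the active cluster so later commands can omit the --cluster flag (both commands use the example cluster ID from the output above):
kafka cluster describe lkc-dfgrt7
kafka cluster use lkc-dfgrt7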
Create a topic in the cluster using the cluster ID from the output of the previous step:
kafka topic create <name> --cluster <cluster ID>
For example:
kafka topic create test_topic --cluster lkc-dfgrt7
You should see output confirming that the topic was created:
Created topic "test_topic".
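To double-check, you can list the topics in the cluster (again using the example cluster ID):
kafka topic list --cluster lkc-dfgrt7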
Create an API key for the cluster:
api-key create --resource lkc-dfgrt7
You should see output similar to the following:
It may take a couple of minutes for the API key to be ready.
Save the API key and secret. The secret is not retrievable later.
+-------------+-------------------------------------------------------------------+
| API Key     | <YOUR API KEY>                                                    |
| API Secret  | <YOUR API SECRET>                                                 |
+-------------+-------------------------------------------------------------------+
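If you prefer not to pass the key and secret on every subsequent command, you can set this key as the active API key for the cluster. The exact flags vary by CLI version (older versions also require a --resource flag), so check api-key use --help:
api-key use <YOUR API KEY>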
Produce messages to your topic:
kafka topic produce <topic name> --api-key <YOUR API KEY> --api-secret <YOUR API SECRET>
For example:
kafka topic produce test_topic --api-key <YOUR API KEY> --api-secret <YOUR API SECRET>
You should see output similar to:
Starting Kafka Producer. Use Ctrl-C or Ctrl-D to exit.
Once the producer is active, type messages, delimiting them with return. For example:
today
then
now
forever
When you’re finished producing, exit with Ctrl-C or Ctrl-D.
Read back your produced messages, from the beginning:
kafka topic consume <topic name> --api-key <YOUR API KEY> --api-secret <YOUR API SECRET> --from-beginning
For example:
kafka topic consume test_topic --api-key <YOUR API KEY> --api-secret <YOUR API SECRET> --from-beginning
Based on the previous messages entered, you should see output similar to:
Starting Kafka Consumer. Use Ctrl-C to exit.
forever
now
today
then
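If you want consumed offsets to be tracked across runs, you can also consume as part of a named consumer group; the group name quickstart_group below is arbitrary:
kafka topic consume test_topic --group quickstart_group --from-beginning --api-key <YOUR API KEY> --api-secret <YOUR API SECRET>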
Check for Confluent CLI updates¶
Important
On Alpine Linux, you can’t directly upgrade from the Confluent CLI versions v2.0.0 through v2.17.1 using the standard confluent update command. Running confluent update on those versions on Alpine Linux will result in an updated confluent client that is incompatible with the operating system.
To upgrade from the v2.0.0 through v2.17.1 versions on Alpine Linux, remove your existing confluent client and re-install using the following command:
curl -sL --http1.1 https://cnfl.io/cli | sh -s -- latest
confluent v2.17.2 and later can be updated directly with confluent update on Alpine Linux.
The Confluent CLI provides an option to check for a newer version. The check is controlled by the disable_update_check setting in the ~/.confluent/config.json file. The Confluent CLI checks for updates once a day when it is set to "disable_update_check": false.
The default setting depends on how the Confluent CLI was installed:
- The independently downloaded Confluent CLI has update checks enabled ("disable_update_check": false).
- The Confluent CLI packaged with Confluent Platform has update checks disabled ("disable_update_check": true).
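For reference, here is a minimal sketch of how this setting appears in ~/.confluent/config.json (all other fields in the file are omitted here):
{
  "disable_update_check": false
}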
When the update check feature is enabled, your Confluent CLI needs access to
s3-us-west-2.amazonaws.com/confluent.cloud
. For more information, see
Network access.
When a check returns with a message that a newer version of Confluent CLI is available, you can update the Confluent CLI to the new version using the confluent update command.
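For example, to see your current version and then apply an available update:
confluent version
confluent update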