Confluent Cloud Quick Start¶
This quick start gets you up and running with Confluent Cloud using a Basic cluster. It shows how to use Confluent Cloud to create topics and to produce and consume messages on an Apache Kafka® cluster. The quick start introduces both the web UI and the Confluent Cloud CLI for managing clusters and topics in Confluent Cloud, as the two can be used interchangeably for most tasks.
Follow these steps to set up a Kafka cluster on Confluent Cloud and produce data to Kafka topics on the cluster.
Confluent Cloud is a resilient, scalable streaming data service based on Apache Kafka®, delivered as a fully managed service. Confluent Cloud has a web interface and a local command line interface. You can manage cluster resources, settings, and billing with the web interface, and use the Confluent Cloud CLI to create and manage Kafka topics. Sign up for Confluent Cloud to get started.
For more information about Confluent Cloud, see the Confluent Cloud documentation.
Prerequisites
- Access to Confluent Cloud
- Supported Features for Confluent Cloud
- Internet connectivity
Step 1: Create a Kafka Cluster in Confluent Cloud¶
Important
This step is for Confluent Cloud users only. Confluent Cloud Enterprise users can skip to Step 2: Install the Confluent Cloud CLI.
Sign in to Confluent Cloud at https://confluent.cloud.
Click Create cluster.
Specify a cluster name, choose a cloud provider and region, select the Basic cluster type, and click Continue.
Important
This quick start creates a Basic cluster, which supports single-zone availability only. For information about other cluster types, including Standard and Dedicated, see Confluent Cloud Cluster Types.
Confirm your cluster subscription details and payment information, and click Launch.
Important
This step is only applicable to non-marketplace users.
Step 2: Install the Confluent Cloud CLI¶
After you have a working Kafka cluster in Confluent Cloud, you can use the Confluent Cloud CLI to interact with your cluster from your local computer. For example, you can produce messages to and consume messages from your topic using the Confluent Cloud CLI.
Scripted installation¶
Run this command to install the Confluent Cloud CLI. The command creates a bin directory in your designated location (<path-to-directory>/bin). On Windows, an appropriate Linux environment, such as the Windows Subsystem for Linux, may need to be installed so that the curl and sh commands are available.
Important
The CLI installation location must be in your PATH (for example, /usr/local/bin).
curl -L --http1.1 https://cnfl.io/ccloud-cli | sh -s -- -b /<path-to-directory>/bin
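To confirm that the CLI installed correctly and is on your PATH, you can print its version. This is a quick check, assuming the ccloud binary was placed in a directory on your PATH as described above:

ccloud version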
Tarball installation¶
Download and install the raw binaries by platform.
Step 3: Create a Topic¶
In this step, you log in to your Kafka cluster and create a Kafka topic using the Confluent Cloud CLI.
Tip
You can also create topics using the Confluent Cloud UI.
Log in to your Confluent Cloud cluster.
ccloud login
Your output should resemble:
Enter your Confluent credentials:
Email: jdoe@myemail.io
Password:

Logged in as jdoe@myemail.io
Using environment t118 ("default")
View your cluster.
ccloud kafka cluster list
Your output should resemble:
      Id      |       Name        | Provider |   Region    | Durability | Status
+-------------+-------------------+----------+-------------+------------+--------+
  lkc-emmox   | My first cluster  | gcp      | us-central1 | LOW        | UP
Set the active cluster.
ccloud kafka cluster use lkc-emmox
Create a topic named users.

ccloud kafka topic create users

The output from ccloud kafka topic list should resemble:

     Name
+------------+
  users
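The defaults are fine for this quick start, but you can also pass options when creating a topic and inspect a topic after it exists. The sketch below assumes your CLI version supports the describe subcommand and the --partitions flag; the pageviews topic name is only an illustration:

# Inspect the topic you just created.
ccloud kafka topic describe users

# Create another topic with an explicit partition count (pageviews is a hypothetical example).
ccloud kafka topic create pageviews --partitions 6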
For more information about Confluent Cloud CLI commands, see Confluent Cloud CLI Command Reference.
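If your organization uses more than one Confluent Cloud environment, note the environment shown when you logged in (t118 ("default") in the example above). Assuming your CLI version supports the environment subcommands, you can list environments and switch the active one:

ccloud environment list
ccloud environment use t118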
Step 4: Create an API Key¶
Tip
An API key and its associated secret apply to the active Kafka cluster; in this case, “My first cluster” in the “default” environment. If you add a new cluster, you must create a new API key for producers and consumers on that new Kafka cluster. In the following steps, provide the ID for the cluster in place of <resource-id>.
Create an API key and secret. The key and secret are required to produce to or consume from your topic.
You can generate the API key from the Confluent Cloud web UI or with the Confluent Cloud CLI. Either way, be sure to save both the API key and the secret.
On the web UI, click the Kafka API keys tab, and click + Add key. Save the key and secret, then click the checkbox next to I have saved my API key and secret and am ready to continue.
Or, from the Confluent Cloud CLI, type the following command:
ccloud api-key create --resource <resource-id>
Your output should resemble:
Save the API key and secret. The secret is not retrievable later.

+---------+------------------------------------------------------------------+
| API Key | LD35EM2YJTCTRQRM                                                 |
| Secret  | 67JImN+9vk+Hj3eaj2/UcwUlbDNlGGC3KAIOy5JNRVSnweumPBUpW31JWZSBeawz |
+---------+------------------------------------------------------------------+
Optional: Add the API secret with ccloud api-key store <key> <secret>. When you create an API key with the CLI, it is automatically stored locally. However, when you create an API key using the UI, API, or with the CLI on another machine, the secret is not available for CLI use until you store it. This is required because secrets cannot be retrieved after creation.

ccloud api-key store LD35EM2YJTCTRQRM 67JImN+9vk+Hj3eaj2/UcwUlbDNlGGC3KAIOy5JNRVSnweumPBUpW31JWZSBeawz \
    --resource <resource-id>
Set the API key to use for Confluent Cloud CLI commands with the command ccloud api-key use <key>.

ccloud api-key use --resource <resource-id> LD35EM2YJTCTRQRM
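To verify which API keys exist for the cluster and which one the CLI will use, you can list them. A quick check, assuming the api-key list subcommand accepts the --resource flag:

ccloud api-key list --resource <resource-id>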
Step 5: Create Sample Producer¶
In this step, you use the Confluent Cloud CLI from your local workstation to produce messages to, and consume messages from, your topic in Confluent Cloud.
Produce records to the topic named users.

ccloud kafka topic produce users

You can type messages in as standard input. By default, they are newline separated. Press Ctrl+C to exit.

Starting Kafka Producer. ^C to exit
foo
bar
baz
^C
Click the Messages tab in the topics page in the Confluent Cloud UI to view the messages being produced.
Consume items from the topic named users. Press Ctrl+C to exit.

ccloud kafka topic consume -b users

Your output should show the items that you entered in ccloud kafka topic produce users.

Starting Kafka Consumer. ^C to exit
foo
bar
baz
^C
Stopping Consumer.
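The producer above sends each line as a record value with no key. If you want to send keyed records instead, the CLI can split each input line into a key and a value on a delimiter; a sketch, assuming your CLI version supports the --parse-key and --delimiter flags:

# Each input line of the form <key>:<value> becomes a keyed record.
ccloud kafka topic produce users --parse-key --delimiter ":"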
For more information, see Confluent Cloud CLI Command Reference.
Step 6: Produce Sample Data to Confluent Cloud¶
In this step, you run a sample application that generates user records and sends them to your Confluent Cloud topic.
- Prerequisite: Node.js installed on your local machine (the generator script below runs with node).
Copy and paste the following into a file and save it as generate-users.js.

const userIdBase = 'User_'
const regionIdBase = 'Region_'
const genderFemale = 'Female'
const genderMale = 'Male'
const genderNotSpecified = 'Not Specified'
const maxUserId = 100
const maxRegionId = 1000

let interval = 1000;

function generateUserId() {
    return generateId(userIdBase, maxUserId);
}

function generateRegionId() {
    return generateId(regionIdBase, maxRegionId);
}

function generateId(idBase, maxIdValue) {
    var id = Math.floor(Math.random() * maxIdValue);
    var idString = idBase + id;
    return idString;
}

function generateGender() {
    var gender = genderNotSpecified;
    var genderValue = Math.random();
    if (genderValue <= 0.33) {
        gender = genderFemale;
    } else if (genderValue <= 0.66) {
        gender = genderMale;
    }
    return gender;
}

function sleep(ms) {
    return new Promise(resolve => {
        setTimeout(resolve, ms);
    });
}

async function begin() {
    while (true) {
        // Get a user ID.
        let userId = generateUserId();

        // Concatenate a delimited record of the form <key>:<value>.
        // - The record key is userId, delimited by a ':' character.
        // - The record value is a comma-delimited list of fields.
        //
        // <userId>:<register-time>,<regionId>,<gender>
        //
        // Example record:
        // User_29:1567546454224,Region_923,Female
        let record = userId + ":";
        record += new Date().valueOf() + ",";
        record += generateRegionId() + ",";
        record += generateGender();

        console.log(record);
        await sleep(interval);
    }
}

begin();
Navigate to the directory where you saved generate-users.js and run this command. This command pipes the output of generate-users.js to your topic in Confluent Cloud. Leave this running in your terminal. Press Ctrl+C to end.

node generate-users.js | ccloud kafka topic produce users
Your output should resemble:
Starting Kafka Producer. ^C to exit
Click the Messages tab on the topic page in the Confluent Cloud UI to view the messages being produced. The message viewer shows messages produced since the page was loaded; it does not show a historical view.
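Because the generator writes each record as <key>:<value>, you can optionally use the keyed-produce variant shown in the previous step so that the userId becomes the Kafka message key. This assumes the --parse-key and --delimiter flags are available in your CLI version; you can also run node generate-users.js on its own first to see the records it emits:

node generate-users.js | ccloud kafka topic produce users --parse-key --delimiter ":"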
Next Steps¶
- Connecting Clients to Confluent Cloud
- Create streaming queries in Confluent Cloud ksqlDB
- Run an automated Confluent Cloud quickstart with Avro, Protobuf, and JSON formats
- Manage Schemas on Confluent Cloud
- Connect External Systems to Confluent Cloud
- Connect your components and data to Confluent Cloud
- Configure Multi-Node Environment
- Confluent Cloud documentation
- Confluent Cloud Demos Overview
- Confluent Cloud schemas limits and how to free up space