Confluent Cloud Quick Start

This quick start gets you up and running with Confluent Cloud. It shows how to use Confluent Cloud to create topics and to produce and consume messages on an Apache Kafka® cluster. The quick start introduces both the web UI and the Confluent Cloud CLI for managing clusters and topics in Confluent Cloud; for most tasks, the two can be used interchangeably.

Follow these steps to set up a Kafka cluster on Confluent Cloud and produce data to Kafka topics on the cluster.

Confluent Cloud is a resilient, scalable streaming data service based on Apache Kafka®, delivered as a fully managed service. It provides a web interface and a local command line interface (CLI). You can manage cluster resources, settings, and billing with the web interface, and you can use the Confluent Cloud CLI to create and manage Kafka topics.

For more information about Confluent Cloud, see the Confluent Cloud documentation.


Step 1: Create a Kafka Cluster in Confluent Cloud


This step is for Confluent Cloud users only. Confluent Cloud Enterprise users can skip to Step 2: Install the Confluent Cloud CLI.

  1. Sign in to Confluent Cloud at

  2. Click Create cluster.

  3. Specify a cluster name, choose a cloud provider and region, and click Continue.

  4. Confirm your cluster subscription details and payment information, then click Save and launch cluster.


Step 2: Install the Confluent Cloud CLI

After you have a working Kafka cluster in Confluent Cloud, you can use the Confluent Cloud CLI to interact with your cluster from your local computer. For example, you can produce messages to and consume messages from your topics using the Confluent Cloud CLI.

Scripted installation

Run this command to install the Confluent Cloud CLI. This command creates a bin directory in your designated location (<path-to-directory>/bin).


The CLI installation location must be in your PATH (e.g. /usr/local/bin).

curl -L | sh -s -- -b /<path-to-directory>/bin

Tarball installation

Download and install the raw binaries by platform.

Step 3: Create a Topic

In this step, you log in to your Kafka cluster and create a Kafka topic using the Confluent Cloud CLI.


You can also create topics using the Confluent Cloud UI.

  1. Log in to your Confluent Cloud cluster.

    ccloud login --url

    Your output should resemble:

    Enter your Confluent credentials:
    Logged in as
    Using environment t118 ("default")
  2. View your cluster.

    ccloud kafka cluster list

    Your output should resemble:

          Id      |       Name        | Provider |   Region    | Durability | Status
        lkc-emmox | My first cluster  | gcp      | us-central1 | LOW        | UP
  3. Set the active cluster.

    ccloud kafka cluster use lkc-emmox
  4. Create a topic named users.

    ccloud kafka topic create users

    The output from ccloud kafka topic list should resemble:


For more information about Confluent Cloud CLI commands, see Confluent Cloud CLI Command Reference.

Step 4: Create an API Key


An API key and associated secret applies to the active Kafka cluster; in this case, “My first cluster” in the “default” environment. If you add a new cluster, you must create a new API key for producers and consumers on that new Kafka cluster.

  1. Create an API key and secret, and save them. An API key and secret are required to produce to or consume from your topic.

    You can generate the API key from the Confluent Cloud web UI or on the Confluent Cloud CLI.

    • On the web UI, click Cluster settings, then API access tab, and click + Add key.


      Save the key and secret, then click the checkbox next to I have saved my API key and secret and am ready to continue.

    • Or, from the Confluent Cloud CLI, type the following command:

      ccloud api-key create

      Your output should resemble:

      Save the API key and secret. The secret is not retrievable later.
      | API Key | LD35EM2YJTCTRQRM                                                 |
      | Secret  | 67JImN+9vk+Hj3eaj2/UcwUlbDNlGGC3KAIOy5JNRVSnweumPBUpW31JWZSBeawz |

    Regardless of which method you use, be sure to save the API key and secret.

    The API key is listed on Cluster settings > API access on the Confluent Cloud web UI.

  2. Add the API secret with ccloud api-key store <key> <secret>.

    ccloud api-key store LD35EM2YJTCTRQRM 67JImN+9vk+Hj3eaj2/UcwUlbDNlGGC3KAIOy5JNRVSnweumPBUpW31JWZSBeawz
  3. Set the API key to use for Confluent Cloud CLI commands with the command ccloud api-key use <key>.

    ccloud api-key use LD35EM2YJTCTRQRM
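The API key and secret are also what client applications use to authenticate to the cluster: Confluent Cloud accepts SASL/PLAIN over TLS, with the API key as the username and the secret as the password. As an illustration only, a Node.js client configuration might look like the following sketch; the kafkajs package and the placeholder bootstrap server are assumptions, not part of this quick start.

```javascript
// Sketch only: connecting a Node.js client to Confluent Cloud with an API key.
// `kafkajs` and the <bootstrap-server> placeholder are assumptions, not part
// of this quick start. The API key is the SASL username; the secret is the password.
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
    clientId: 'quickstart-app',
    brokers: ['<bootstrap-server>:9092'],
    ssl: true,
    sasl: {
        mechanism: 'plain',
        username: '<api-key>',
        password: '<api-secret>',
    },
});
```

Whichever client library you use, the same key/secret pair created in this step fills the SASL username and password fields.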

Step 5: Create Sample Producer

In this step, you use the Confluent Cloud CLI from your local workstation to produce messages to and consume messages from your topic in Confluent Cloud.

  1. Produce records to the topic named users.

    ccloud kafka topic produce users

    Type messages at the prompt; each newline-separated line is sent as a message. Press Ctrl+C to exit.

    Starting Kafka Producer. ^C to exit
  2. Click the Messages tab in the topics page in the Confluent Cloud UI to view the messages being produced.

  3. Consume messages from the topic named users. Press Ctrl+C to exit.

    ccloud kafka topic consume -b users

    Your output should show the messages that you entered in ccloud kafka topic produce users.

    Starting Kafka Consumer. ^C to exit
    ^CStopping Consumer.
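Because ccloud kafka topic produce reads newline-separated messages from standard input, any program that writes lines to stdout can feed it. A minimal sketch (the file name emit-messages.js is an assumption, not part of this quick start):

```javascript
// Sketch: write newline-separated messages to stdout, the format that
// `ccloud kafka topic produce` reads from standard input.
const messages = ['alice', 'bob', 'carol'];
for (const message of messages) {
    console.log(message);
}
```

You would pipe it into the producer the same way as the sample application in Step 6: node emit-messages.js | ccloud kafka topic produce users.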

For more information, see Confluent Cloud CLI Command Reference.

Step 6: Produce Sample Data to Confluent Cloud

In this step, you run a sample producer application that generates user records and sends them to your topic in Confluent Cloud.

  1. Copy and paste the following into a file and save as generate-users.js.

    const userIdBase = 'User_'
    const regionIdBase = 'Region_'
    const genderFemale = 'Female'
    const genderMale = 'Male'
    const genderNotSpecified = 'Not Specified'
    const maxUserId = 100
    const maxRegionId = 1000
    let interval = 1000;

    function generateUserId() {
        return generateId(userIdBase, maxUserId);
    }

    function generateRegionId() {
        return generateId(regionIdBase, maxRegionId);
    }

    function generateId(idBase, maxIdValue) {
        var id = Math.floor(Math.random() * maxIdValue);
        var idString = idBase + id;
        return idString;
    }

    function generateGender() {
        var gender = genderNotSpecified;
        var genderValue = Math.random();
        if (genderValue <= 0.33) {
            gender = genderFemale;
        } else if (genderValue <= 0.66) {
            gender = genderMale;
        }
        return gender;
    }

    function sleep(ms) {
        return new Promise(resolve => {
            setTimeout(resolve, ms);
        });
    }

    async function begin() {
        while (true) {
            // Get a user ID.
            let userId = generateUserId();
            // Concatenate a delimited record of the form <key>:<value>.
            // - The record key is userId, delimited by a ':' character.
            // - The record value is a comma-delimited list of fields.
            //   <userId>:<register-time>,<userId>,<regionId>,<gender>
            // Example record:
            //   User_29:1567546454224,User_29,Region_923,Female
            let record = userId + ":";
            record += new Date().valueOf() + ",";
            record += userId + ",";
            record += generateRegionId() + ",";
            record += generateGender();
            // Write the record to stdout so it can be piped to the producer.
            console.log(record);
            await sleep(interval);
        }
    }

    begin();
  2. Run this command to pipe the output of generate-users.js to your topic in Confluent Cloud. Leave it running in your terminal; press Ctrl+C to stop.

    node generate-users.js | ccloud kafka topic produce users

    Your output should resemble:

    Starting Kafka Producer. ^C to exit
  3. Click the Messages tab on the topic page in the Confluent Cloud UI to view the messages being produced. The message viewer shows messages produced since the page was loaded; it does not show a historical view.
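The records written by generate-users.js follow the format described in its comments: <userId>:<register-time>,<userId>,<regionId>,<gender>. As an illustration, a downstream consumer could split such a record back into its key and fields. The helper below is hypothetical and not part of the quick start:

```javascript
// Hypothetical helper: parse one generated record back into its key and fields.
// Record format: <userId>:<register-time>,<userId>,<regionId>,<gender>
function parseUserRecord(record) {
    const sep = record.indexOf(':');
    const key = record.slice(0, sep);
    const [registerTime, userId, regionId, gender] = record.slice(sep + 1).split(',');
    return { key, registerTime: Number(registerTime), userId, regionId, gender };
}

const parsed = parseUserRecord('User_29:1567546454224,User_29,Region_923,Female');
console.log(parsed.key);     // User_29
console.log(parsed.gender);  // Female
```

The key before the ':' is what a Kafka client would use as the record key; the remaining comma-delimited fields form the value.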