Quick Start for Apache Kafka using Confluent Cloud

This quick start gets you up and running with Confluent Cloud using a basic cluster. It shows how to use Confluent Cloud to create topics, how to produce data to an Apache Kafka® cluster, and how to consume data from the cluster. The quick start also introduces the Confluent CLI for managing clusters and topics in Confluent Cloud.

Follow these steps to set up a Kafka cluster on Confluent Cloud and produce data to Kafka topics on the cluster.

Confluent Cloud is a resilient, scalable streaming data service based on Apache Kafka®, delivered as a fully managed service. Confluent Cloud provides a web interface and a local command-line interface. You can manage cluster resources, settings, and billing with the web interface, and you can use the Confluent CLI to create and manage Kafka topics. Sign up for Confluent Cloud to get started.

For more information about Confluent Cloud, see the Confluent Cloud documentation.

Note

Confluent Cloud Console includes an in-product tutorial that guides you through the basic steps for setting up your environment. This tutorial enables you to practice configuring Confluent Cloud components from directly within the console. Log in to Confluent Cloud and follow the tutorial link or click the LEARN button in the console to start the tutorial.

Prerequisites

Step 1: Create a Kafka cluster in Confluent Cloud

  1. Sign in to Confluent Cloud at https://confluent.cloud.

  2. Click Add cluster, and on the Create cluster page, click Basic.

    Screenshot of Confluent Cloud showing the Create Cluster page

    This example creates a Basic cluster, which supports single zone availability. For information about Standard and Dedicated cluster types, see Confluent Cloud Features and Limits by Cluster Type.

  3. Click Begin configuration. The Region/zones page opens. Choose a cloud provider, region, and availability zone. Click Continue.

    Screenshot of Confluent Cloud showing the Create Cluster workflow
  4. Specify a cluster name, review your settings, costs, and usage, and click Launch cluster.

    Screenshot of Confluent Cloud showing the Create Cluster workflow
  5. Depending on the chosen cloud provider and other settings, it may take a few minutes to provision your cluster, but once the cluster has been provisioned, the Cluster Overview page displays. Now you can get started configuring apps and data on your new cluster.

    Screenshot of Confluent Cloud showing Cluster Overview page
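The same cluster can also be created from the Confluent CLI. The sketch below assumes you have the CLI installed and are signed in to your Confluent Cloud account; the cluster name, cloud provider, and region are example values, so substitute the ones you chose above.

```shell
# Sign in to Confluent Cloud (prompts for your credentials).
confluent login

# Create a Basic cluster. The name, --cloud, and --region values here
# are examples; adjust them to match your selections in the console.
confluent kafka cluster create quickstart-cluster --cloud aws --region us-west-2 --type basic
```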

Step 2: Create a Kafka topic

In this step, you create the users topic by using the Cloud Console.

Tip

You can also create topics by using the Confluent CLI. See Create a Topic.

  1. From the navigation menu, click Topics, and in the Topics page, click Create topic.

    Create topic page Confluent Cloud
  2. In the Topic name field, type “users”. Click Create with defaults.

    Topic page in Confluent Cloud showing a newly created topic

The users topic is created on the Kafka cluster and is available for use by producers and consumers.
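As the tip above notes, the topic can also be created with the Confluent CLI. This sketch assumes you have already signed in and selected your environment and cluster with the CLI.

```shell
# Create the users topic with default settings, then confirm it exists.
confluent kafka topic create users
confluent kafka topic list
```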

Step 3: Create a sample producer

You can produce example data to your Kafka cluster by using the hosted Datagen Source Connector for Confluent Cloud.

  1. From the navigation menu, select Data integration > Connectors. The Connectors page opens.

  2. In the Search box, type “datagen”.

    Screenshot that shows searching for the datagen connector
  3. From the search results, select the Datagen Source connector.

  4. On the Configuration pane, select Users, and then Continue.

  5. On the Topic selection pane, the users topic you created in the previous section should display. Select it and click Continue.

  6. In the Kafka credentials pane, click Generate API key & download. This creates an API key and secret that allows the connector to access your cluster, and downloads the key and secret to your computer.

    An API key is required for the connector and also for the Confluent CLI and ksqlDB CLI to access your cluster.

    Note

    An API key and associated secret apply to the active Kafka cluster. If you add a new cluster, you must create a new API key for producers and consumers on the new Kafka cluster. For more information, see Use API Keys to Control Access.

  7. Enter an optional description for the key, and click Continue.

  8. For Connector sizing, leave the slider at the default of 1 task and click Continue.

  9. On the Review and launch page, select the text in the Connector name box and replace it with “DatagenSourceConnector_users”. Then click Launch to start the connector.

  10. Click See all connectors to navigate to the Connectors page. The status of your new connector should read Provisioning, which lasts for a few seconds. When the status changes to Running, your connector is producing data to the users topic.

    Screenshot of Confluent Cloud showing a running Datagen Source Connector
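API keys like the one generated in step 6 can also be created from the Confluent CLI. In this sketch, `lkc-abc123` is a placeholder cluster ID, not a real value; look up your own ID first.

```shell
# List clusters to find your cluster ID (it starts with "lkc-").
confluent kafka cluster list

# Create an API key and secret scoped to that cluster.
# "lkc-abc123" is a placeholder; substitute your actual cluster ID.
confluent api-key create --resource lkc-abc123
```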

Step 4: Consume messages

  1. In the navigation menu, click Topics to show the list of topics in your cluster.

    Screenshot of Confluent Cloud showing the Topics page
  2. Click the users topic name, and in the details page, click the Messages tab to view the messages being produced to the topic. The message viewer shows messages produced since the page was loaded, but it doesn’t show a historical view.

    Screenshot of Confluent Cloud showing the Messages page
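Messages can also be consumed from the command line rather than the console. This sketch assumes you are signed in with the Confluent CLI, have selected your cluster, and have an API key stored for it.

```shell
# Consume from the users topic, starting at the beginning of the log
# rather than only new messages. Press Ctrl+C to stop.
confluent kafka topic consume users --from-beginning
```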

Step 5: Inspect the data stream

Track the movement of data through your cluster by using Stream Lineage, where you can see sources, sinks, and topics and monitor messages as they move from one to another.

  1. In the users topic page, click See in Stream Lineage. The stream lineage for the users topic is shown.

    Screenshot of Confluent Cloud showing the data streams page
  2. Click the node labeled DatagenSourceConnector_users, which is the connector that you created in Step 3. The details view opens, showing graphs for total production and other data.

    Screenshot of Confluent Cloud showing details for a source connector
  3. Dismiss the details view and click the Topic labeled users. The details view opens, showing graphs for total throughput and other data.

    Screenshot of Confluent Cloud showing details for a topic

Step 6: Delete the connector and topic

If you don’t plan to complete any next steps and you’re ready to quit the Quick Start, delete the resources you created to avoid unexpected charges to your account.

Skip this step if you want to see how you can use SQL statements to query your data in the Confluent Cloud ksqlDB Quick Start.

  • Delete the connector:
    1. From the navigation menu, select Data integration > Connectors.
    2. Click DatagenSourceConnector_users and in the details page, click Delete.
    3. Enter the connector name (DatagenSourceConnector_users), and click Confirm.
  • Delete the topic:
    1. From the navigation menu, click Topics, select the users topic, and choose the Configuration tab.
    2. Click Delete topic, enter the topic name (users), and click Continue.
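The same cleanup can be done from the Confluent CLI. In this sketch, `lcc-123456` is a placeholder connector ID; list your connectors first to find the real one.

```shell
# Find the ID of the Datagen connector (it starts with "lcc-").
confluent connect cluster list

# Delete the connector, then the topic. "lcc-123456" is a placeholder;
# substitute the ID shown for DatagenSourceConnector_users.
confluent connect cluster delete lcc-123456
confluent kafka topic delete users
```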