Confluent Cloud Quick Start

This quick start gets you up and running with Confluent Cloud. It shows how to use Confluent Cloud to create topics, produce to and consume from an Apache Kafka® cluster, and query KSQL streams and tables.

Confluent Cloud is a resilient, scalable streaming data service based on Apache Kafka®, delivered as a fully managed service. Confluent Cloud has a web interface and a local command line interface (CLI). You can manage cluster resources, settings, and billing with the web interface, and you can use the Confluent Cloud CLI to create and manage Kafka topics.
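
For example, creating and listing topics from the command line looks like the following sketch. This assumes the ccloud CLI's topic subcommands and a cluster that's already configured (both are covered in the steps below); the topic name is illustrative.

    # Create a topic on your Confluent Cloud cluster (illustrative name)
    ccloud topic create page-visits

    # List the topics on the cluster
    ccloud topic list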

For more information about Confluent Cloud, see the Confluent Cloud documentation.

Follow these steps to set up a Kafka cluster on Confluent Cloud, produce data to Kafka topics on the cluster, and write streaming queries on cloud-hosted KSQL.

  1. Create a Kafka Cluster in Confluent Cloud
  2. Install and Configure the Confluent Cloud CLI
  3. Configure Confluent Cloud Schema Registry for Your Environment
  4. Install Confluent Platform Locally
  5. Inspect Topics By Using Confluent Cloud
  6. Create a KSQL Application in Confluent Cloud
  7. Inspect Topics By Using Confluent Cloud KSQL
  8. Create a KSQL Stream and KSQL Table
  9. Write Persistent Queries
  10. Monitor Persistent Queries
  11. Shut Down Your Local Confluent Platform Environment and Clean Up

Create a Kafka Cluster in Confluent Cloud

Important

This step is for Confluent Cloud users only. Confluent Cloud Enterprise users can skip to Install and Configure the Confluent Cloud CLI.

  1. Sign in to Confluent Cloud at https://confluent.cloud.

  2. Click Create cluster.

    Screenshot of the Create cluster page in Confluent Cloud
  3. Specify a cluster name, choose a cloud provider, and click Continue. Optionally, you can specify read and write throughput, storage, region, and durability.

    Screenshot of the cluster configuration and launch page in Confluent Cloud
  4. Confirm your cluster subscription details and payment information, and click Save and launch cluster.

    Screenshot of the payment information page in Confluent Cloud

Install and Configure the Confluent Cloud CLI

After you have a working Kafka cluster in Confluent Cloud, you can use the Confluent Cloud CLI to interact with your cluster from your local computer. For more information about installing the Confluent Cloud CLI, see Install the Confluent Cloud CLI.

  1. From the Environment overview page, click your cluster name.

    Screenshot of the cluster details page in Confluent Cloud
  2. In the navigation menu, click Data In/Out and select CLI. Follow the on-screen Confluent Cloud CLI installation instructions.

    Screenshot of the Confluent Cloud CLI installation instructions
  3. When you run the ccloud init command for the first time, the ccloud CLI creates a configuration file named ~/.ccloud/config.
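
    For example (a sketch of the flow; the CLI prompts for the connection values shown on the CLI page from the previous step):

    # First run: prompts for your cluster's connection details and writes
    # them to ~/.ccloud/config
    ccloud init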

Configure Confluent Cloud Schema Registry for Your Environment

You can use Confluent Cloud Schema Registry to manage schemas in Confluent Cloud. You can create and edit schemas in a schema editor and associate them with Kafka topics.

Enable Schema Registry for Your Environment

  1. From the Environment Overview page, click SCHEMA REGISTRY.

  2. Click Enable Schema Registry.

    Screenshot of the Enable Schema Registry page in Confluent Cloud
  3. Under Make requests to this Schema Registry endpoint, copy the Schema Registry endpoint to a safe location.

For more information, see Managing Schemas for Topics in Confluent Cloud.

Configure the Confluent Cloud CLI for Schema Registry

To access Confluent Cloud Schema Registry with the Confluent Cloud CLI, generate an API access key and assign the key in the Confluent Cloud CLI configuration file.

Note

The API key for Confluent Cloud Schema Registry is distinct from the Kafka API key that you use to access your Confluent Cloud cluster.

  1. Click Manage Keys to open the list of Schema Registry API keys.

    Screenshot of the Schema Registry page in Confluent Cloud
  2. Click Add key to create a new Schema Registry API key. In the Create a new Schema Registry API key dialog, copy the API Key and the API Secret to a safe location.

    Screenshot of the Create New Schema Registry API Key dialog in Confluent Cloud
  3. After you save the API key and secret, select the checkbox labeled I have saved my API key and secret and am ready to continue, and click Continue.

    Screenshot of the Registry API Key page in Confluent Cloud
  4. In the Confluent Cloud configuration file, add the Schema Registry authorization settings, substituting the API key, API secret, and endpoint that you saved in the previous steps. The default location for the file is ~/.ccloud/config.

    basic.auth.credentials.source=USER_INFO
    schema.registry.basic.auth.user.info=<schema-registry-api-key>:<schema-registry-api-secret>
    schema.registry.url=https://<schema-registry-url>
    
  5. In the Confluent Cloud configuration file, add the Avro serializer settings.

    # Enable Avro serializer with Schema Registry (optional but recommended)
    key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
    value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
    

    Your configuration file should resemble this.

    cat ~/.ccloud/config
    ssl.endpoint.identification.algorithm=https
    sasl.mechanism=PLAIN
    request.timeout.ms=20000
    bootstrap.servers=<bootstrap-server-url>
    retry.backoff.ms=500
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<kafka-api-key>" password="<kafka-api-secret>";
    security.protocol=SASL_SSL
    
    # Schema Registry specific settings
    basic.auth.credentials.source=USER_INFO
    schema.registry.basic.auth.user.info=<schema-registry-api-key>:<schema-registry-api-secret>
    schema.registry.url=https://<schema-registry-url>

    # Enable Avro serializer with Schema Registry (optional but recommended)
    key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
    value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
    
  6. Save your changes to the Confluent Cloud configuration file.

  7. Optional: Verify that your Schema Registry credentials are configured correctly. In the following command, substitute your Schema Registry API key (<schema-registry-api-key>), API secret (<schema-registry-api-secret>), and endpoint (<schema-registry-url>).

    Run this command to authenticate with Schema Registry and list the registered subjects.

    curl -u <schema-registry-api-key>:<schema-registry-api-secret> \
    <schema-registry-url>/subjects
    

    If no subjects have been created, the output is an empty list ([]). If you have subjects, your output should resemble:

    ["test2-value"]
    

    Here is an example command:

    curl -u GFS3CFEX5ELQTB6O:2Nc69uGg+oKOdajqGXfZzuUoN1mBZarD0ZLiE5I9sVIQQhuNti1NSBnzslpgNG5J \
    https://psrc-lq2dm.us-east-2.aws.confluent.cloud/subjects
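
    You can also fetch the latest schema registered under a subject by using the standard Schema Registry REST API. For example, with the test2-value subject from the sample output above:

    # Retrieve the newest schema version registered under a subject
    curl -u <schema-registry-api-key>:<schema-registry-api-secret> \
    https://<schema-registry-url>/subjects/test2-value/versions/latest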
    

Install Confluent Platform Locally

To stream data to your Kafka cluster in Confluent Cloud, set up Confluent Platform on your local computer and install the Replicator connector. The Replicator connector streams your local Kafka topics to your Kafka cluster in Confluent Cloud. For more information, see Confluent Replicator.

The following diagram shows how the local Kafka cluster communicates with your Confluent Cloud cluster. For this demo, the Schema Registry instance in the local "origin" cluster is turned off, and only Confluent Cloud Schema Registry is used.

Diagram showing how a local Kafka cluster extends to Confluent Cloud Schema Registry

The ccloud example starts Confluent Platform by using a script named start-docker.sh, or start.sh if you're not using a Docker deployment.

Note

The following steps assume that you're using Docker Compose to run your Confluent Platform deployment. If you prefer, you can use a local installation. For more information, see On-Premises Deployments.

  1. Clone the examples GitHub repository.

    git clone https://github.com/confluentinc/examples
    
  2. Change directory to the ccloud demo.

    cd examples/ccloud
    
  3. In start-docker.sh, change the value of USE_CONFLUENT_CLOUD_SCHEMA_REGISTRY to true. If you're not using Docker Compose, change this value in start.sh instead.
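
    For example, a one-line sketch with GNU sed. This assumes the script assigns the flag as USE_CONFLUENT_CLOUD_SCHEMA_REGISTRY=false, so check the script first; on macOS, use sed -i '' instead of sed -i.

    # Flip the Schema Registry flag in place (assumes a =false assignment)
    sed -i 's/USE_CONFLUENT_CLOUD_SCHEMA_REGISTRY=false/USE_CONFLUENT_CLOUD_SCHEMA_REGISTRY=true/' start-docker.sh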

  4. In schema_registry_docker.config, add the following settings. If you're not using Docker Compose, add them to schema_registry.config instead. These are the same settings you added to the ccloud configuration file.

    basic.auth.credentials.source=USER_INFO
    schema.registry.basic.auth.user.info=<schema-registry-api-key>:<schema-registry-api-secret>
    schema.registry.url=https://<schema-registry-url>
    
  5. Start the demo by running the Docker Compose script or the local install script. Typically, it takes less than five minutes for all Confluent Platform components to start.

    # For Docker Compose
    ./start-docker.sh
    
    # For Confluent Platform local install using Confluent CLI
    ./start.sh
    

    The script creates DatagenConnector instances that produce data to the pageviews and users topics on your local Kafka cluster. It also creates a Replicator connector, which streams those topics to your Kafka cluster in Confluent Cloud.

  6. If you're using Docker Compose, check that the Confluent Platform containers are running.

    docker-compose ps
    

    Your output should resemble:

           Name                      Command                State                     Ports
    ------------------------------------------------------------------------------------------------------------------
    connect-cloud         /etc/confluent/docker/run        Up         8083/tcp, 0.0.0.0:8087->8087/tcp, 9092/tcp
    connect-local         /etc/confluent/docker/run        Up         0.0.0.0:8083->8083/tcp, 9092/tcp
    control-center        /etc/confluent/docker/run        Up         0.0.0.0:9021->9021/tcp
    kafka                 /etc/confluent/docker/run        Up         0.0.0.0:29092->29092/tcp, 0.0.0.0:9092->9092/tcp
    kafka-create-topics   bash -c echo Waiting for K ...   Up         9092/tcp
    ksql-cli              /bin/sh                          Up
    ksql-server           /etc/confluent/docker/run        Up         8088/tcp, 0.0.0.0:8089->8089/tcp
    replicator            tail -f /dev/null                Up         8083/tcp, 9092/tcp
    schema-registry       /etc/confluent/docker/run        Exit 137
    zookeeper             /etc/confluent/docker/run        Up         0.0.0.0:2181->2181/tcp, 2888/tcp, 3888/tcp
    

Note

The schema-registry container reports the state Exit 137 because the start-docker.sh script deactivates it in favor of Confluent Cloud Schema Registry.
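
You can also confirm that the demo's connectors were created by querying the Kafka Connect REST API on the two Connect workers. The ports and connector names below match the docker-compose ps output above and the cleanup commands at the end of this guide.

    # Connectors on the local Connect worker (expect datagen-pageviews)
    curl -s http://localhost:8083/connectors

    # Connectors on the cloud-facing Connect worker (expect replicator and datagen-users)
    curl -s http://localhost:8087/connectors

    # Check the Replicator connector's status
    curl -s http://localhost:8087/connectors/replicator/status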

Inspect Topics By Using Confluent Cloud

  1. Open your browser to the URL of your Confluent Cloud environment, which is similar to https://confluent.cloud/environments/<your-environment>.

  2. In the navigation menu, click Topics to view the pageviews and users topics that the demo created previously.

    Screenshot of Confluent Cloud showing the Topics page
  3. On the pageviews topic, click ... and select Schema. The Schema Editor opens, showing the Avro schema for the pageviews topic.

    Screenshot of Confluent Cloud showing the Schema Editor
  4. Click Inspect to open the Inspect Topic page, which shows the most recent records as they stream to the pageviews topic.

    Screenshot of Confluent Cloud showing the Inspect Topic page

Create a KSQL Application in Confluent Cloud

To write queries against KSQL streams and tables, create a new KSQL application in Confluent Cloud.

Important

KSQL is currently available as a preview. For more information, see Confluent Cloud KSQL Preview.

  1. In the navigation menu, click KSQL.

    Screenshot of Confluent Cloud showing the KSQL Add Application page
  2. Click Add Application, and in the Application name textbox, enter ksql-app1. Click Continue.

    Screenshot of Confluent Cloud showing the KSQL Add Application wizard
  3. Confirm your payment details and click Launch cluster.

    Screenshot of Confluent Cloud showing the KSQL Add Application wizard
  4. The new KSQL application appears in the Application list.

    Screenshot of Confluent Cloud showing the KSQL Applications page

Note

It may take a few minutes to provision the KSQL cluster. When KSQL is ready, its Status changes from Provisioning to Up.

Inspect Topics By Using Confluent Cloud KSQL

  1. In the navigation menu, click KSQL to open the KSQL Applications page.

  2. On ksql-app1, click ... and select KSQL Editor.

    Screenshot of Confluent Cloud showing the KSQL Application context menu
  3. In the editing window, use the SHOW TOPICS statement to see the available topics on the Kafka cluster. Click Run to execute the statement.

    SHOW TOPICS;
    

    Your output should resemble:

    Screenshot of Confluent Cloud showing the KSQL Editor
  4. In the Query Results window, scroll down to view the pageviews topic that you created previously. Your output should resemble:

    {
      "name": "pageviews",
      "registered": false,
      "replicaInfo": [
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3
      ],
      "consumerCount": 12,
      "consumerGroupCount": 1
    },
    

    The "registered": false indicator means that you haven't created a stream or table on top of this topic, so you can't write streaming queries against it yet.
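
    Even before a topic is registered, you can watch its raw records with the PRINT statement, which, like a SELECT query, runs until you click Stop. A minimal sketch:

    PRINT 'pageviews' FROM BEGINNING;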

Create a KSQL Stream and KSQL Table

To write streaming queries against the pageviews and users topics, register the topics with KSQL as a stream and a table. Use the CREATE STREAM and CREATE TABLE statements in the KSQL Editor.

These examples query records from the pageviews and users topics using the following schema.

ER diagram showing a pageviews stream and a users table with a common userid column

Create a Stream in the KSQL Editor

You can create a stream or table by using the CREATE STREAM and CREATE TABLE statements in the KSQL Editor, just as you use them in the KSQL CLI.

  1. Copy the following code into the editing window and click Run.

    CREATE STREAM pageviews_original (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
      WITH (kafka_topic='pageviews', value_format='AVRO');
    

    Your output should resemble:

    Screenshot of the KSQL CREATE STREAM statement in Confluent Cloud
  2. In the editing window, use the SHOW TOPICS statement to inspect the status of the pageviews topic. Click Run to execute the statement.

    SHOW TOPICS;
    
  3. In the Query Results window, scroll to the bottom to view the pageviews topic. Your output should resemble:

    {
      "name": "pageviews",
      "registered": true,
      "replicaInfo": [
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3,
        3
      ],
      "consumerCount": 12,
      "consumerGroupCount": 1
    },
    

    The "registered": true indicator means that you have registered the topic and you can write streaming queries against it.

  4. In the editing window, use a SELECT query to inspect records in the pageviews_original stream.

    SELECT * FROM PAGEVIEWS_ORIGINAL;
    

    Your output should resemble:

    Screenshot of a KSQL SELECT query in Confluent Cloud
  5. The query continues until you end it explicitly. Click Stop to end the query.
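
    Alternatively, you can make a query end on its own by bounding it with a LIMIT clause:

    SELECT * FROM PAGEVIEWS_ORIGINAL LIMIT 5;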

Create a Table in Confluent Cloud KSQL

Confluent Cloud KSQL guides you through the process of registering a topic as a stream or a table.

  1. In the KSQL Editor, navigate to Tables and click Add a table. The Create a KSQL Table dialog opens.

    Screenshot of the Create a KSQL Table wizard in Confluent Cloud
  2. Click users to fill in the details for the table.

    • In the How are your messages encoded? dropdown, select AVRO.
    • Change the name of the first field to registertime, and set its type to BIGINT.
    • Add three more fields to complete the schema for the users topic.
      • gender VARCHAR
      • regionid VARCHAR
      • userid VARCHAR
    • KSQL tables require a key field. In the Key dropdown, select userid.
    Screenshot of the Create a KSQL Table wizard in Confluent Cloud
  3. Click Save Table to create a KSQL table on the users topic.
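
    If you prefer the editor over the dialog, a sketch of the equivalent statement follows; the key property mirrors the Key dropdown selection above.

    CREATE TABLE users (registertime BIGINT, gender VARCHAR, regionid VARCHAR, userid VARCHAR)
      WITH (kafka_topic='users', value_format='AVRO', key='userid');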

  4. In the Tables list, find the USERS table and click .... In the context menu, select Query to open the KSQL Editor with a suggested query.

    Screenshot of the KSQL Table context menu in Confluent Cloud
  5. Click Run to display the query results.

    Screenshot of a KSQL SELECT query in Confluent Cloud

    The Query Results pane displays query status information, like Messages/sec, and it shows the fields that the query returns.

  6. The query continues until you end it explicitly. Click Stop to end the query.

Write Persistent Queries

With the pageviews topic registered as a stream, and the users topic registered as a table, you can write streaming queries that run until you end them with the TERMINATE statement.

  1. Copy the following code into the editing window and click Run.

    CREATE STREAM pageviews_enriched AS
    SELECT users.userid AS userid, pageid, regionid, gender
    FROM pageviews_original
    LEFT JOIN users
      ON pageviews_original.userid = users.userid;
    

    Your output should resemble:

    Screenshot of the KSQL CREATE STREAM AS SELECT statement in Confluent Cloud
  2. To inspect your persistent queries, navigate to the Running Queries page, which shows details about the pageviews_enriched stream that you created in the previous query.

    Screenshot of the KSQL Running Queries page in Confluent Cloud
  3. Click Explain to see the schema and query properties for the persistent query.

    Screenshot of the KSQL Explain Query dialog in Confluent Cloud
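
When you're ready to end a persistent query, terminate it by its ID. A sketch, assuming the generated ID follows the CSAS_PAGEVIEWS_ENRICHED_0 naming that also appears in the consumer group name in the next section; confirm the actual ID with SHOW QUERIES.

    SHOW QUERIES;
    TERMINATE CSAS_PAGEVIEWS_ENRICHED_0;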

Monitor Persistent Queries

You can monitor your persistent queries visually by using Confluent Cloud.

In the navigation menu, click Consumer lag and find the consumer group that corresponds to your pageviews_enriched stream, which is named _confluent-ksql-<cluster-name>_CSAS_PAGEVIEWS_ENRICHED_0. This view shows how well your persistent query is keeping up with the incoming data.

Screenshot of the Consumer Lag page in Confluent Cloud

Shut Down Your Local Confluent Platform Environment and Clean Up

Use the following steps when you're ready to shut down your local Kafka cluster. Your Confluent Cloud environment isn't affected.

Remove the connectors.

curl -X DELETE http://localhost:8087/connectors/replicator
curl -X DELETE http://localhost:8087/connectors/datagen-users
curl -X DELETE http://localhost:8083/connectors/datagen-pageviews

Shut down the Confluent Platform stack.

./stop-docker.sh