Import and Export a Pipeline for Stream Designer on Confluent Cloud

Stream Designer enables you to export a pipeline to a SQL file and import it into another pipeline.

Step 1: Create a pipeline project

A Stream Designer pipeline project defines all the components that are deployed for an application. In this step, you create a pipeline project and a canvas for designing the component graph.

  1. Log in to the Confluent Cloud Console and open the Cluster Overview page for the cluster you want to use for creating pipelines.

  2. In the navigation menu, click Stream Designer.

  3. Click Create pipeline.

    The Create a new pipeline page opens.

    Stream Designer Create New Pipeline page in Confluent Cloud Console

Step 2: Create a table definition

In this step, you start creating your pipeline by using a CREATE TABLE statement. The statement includes the schema definition for a users table that has the following columns.

    ID STRING PRIMARY KEY
    REGISTERTIME BIGINT
    USERID STRING
    REGIONID STRING
    GENDER STRING

  1. Click Start with SQL and click Start building.

    The source code editor opens.

  2. Copy the following code into the source code editor.

    CREATE OR REPLACE TABLE "users_table" (ID STRING PRIMARY KEY, REGISTERTIME BIGINT, USERID STRING, REGIONID STRING, GENDER STRING)
      WITH (kafka_topic='users_topic', partitions=1, value_format='JSON_SR');
    

    Your output should resemble:

    Stream Designer source code editor in Confluent Cloud Console
  3. Click Apply changes.

    Topic and Table components appear on the canvas.

    Stream Designer topic component in Confluent Cloud Console
  4. In the source code editor, click X to dismiss the code view.
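
The CREATE TABLE statement registers a ksqlDB table named users_table over a Kafka topic named users_topic. After you activate the pipeline in Step 7, you can inspect the registered schema from the ksqlDB editor with statements like the following sketch, which assumes the names used in this tutorial:

    -- Show the columns registered for the table
    DESCRIBE "users_table";

    -- Include serialization and runtime details
    DESCRIBE "users_table" EXTENDED;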

Step 3: Create a connector for the topic

  1. In the Components menu, click Source Connector.

  2. Hover over the connector component, click +, and drag the arrow to the Topic component.

    Stream Designer drag-drop connector to topic components in Confluent Cloud Console

Step 4: Configure the connector

  1. In the Source Connector component, click Configure.

    The Source Connector page opens.

  2. In the search box, enter “datagen”.

    Stream Designer Datagen source connector search in Confluent Cloud Console
  3. Click the Datagen Source tile to open the Configuration page.

  4. In the Topic textbox, type “users_topic”.

    Stream Designer Datagen source connector configuration in Confluent Cloud Console
  5. Click Continue.

    The Kafka credentials page opens.

    Stream Designer Datagen source connector configuration in Confluent Cloud Console
  6. Ensure that the Global access tile is selected and click Generate API key & download to create the API key for the Datagen connector.

    A text file containing the newly generated API key and secret is downloaded to your local machine.

  7. Click Continue.

  8. In the Select output record value format section, click JSON, and in the Select a template section, click Users.

    Stream Designer Datagen Source Connector configuration for mock users data in Confluent Cloud Console
  9. Click Continue.

  10. In the Connector sizing section, leave the minimum number of tasks at 1 and click Continue.

  11. In the Connector name textbox, enter “Datagen_users” and click Continue.

    The Datagen source connector is configured and appears on the canvas with a corresponding topic component. The topic component is configured with the name you provided during connector configuration.

You have a Datagen Source connector ready to produce mock user data to a Kafka topic named users_topic with a table registered on the topic.

Stream Designer connector and topic components in Confluent Cloud Console

Step 5: Export the pipeline definition

In this step, you export the SQL code that defines your pipeline to a file on your local machine.

  1. Click the View source code icon (view-source-icon).

    Stream Designer canvas with the view source code button highlighted in Confluent Cloud Console
  2. Inspect the SQL code for the pipeline. Your output should resemble:

    Stream Designer source code editor in Confluent Cloud Console
  3. Click the download button.

    A .sql file is downloaded to your computer. Your pipeline definition is saved, and you can import it into another pipeline later.

  4. In the source code editor, click X to dismiss it.

Step 6: Import the pipeline code

In this step, you delete the pipeline components from the canvas and import the pipeline definition that you saved earlier.

  1. Right-click on the Connector component and click Remove.

  2. Right-click on the Topic component and click Remove.

    The canvas resets to the Create a new pipeline page.

  3. Click Start with SQL and click Start building.

  4. In the source code editor, paste the contents of the SQL file that you downloaded earlier.

  5. From the saved API key file, copy the API key and secret into the corresponding kafka.api.key and kafka.api.secret fields of the CREATE SOURCE CONNECTOR statement.

    Your output should resemble:

    CREATE SOURCE CONNECTOR "Datagen_users" WITH (
      "connector.class"='DatagenSource',
      "kafka.api.key"='<your-api-key>',
      "kafka.api.secret"='<your-api-secret>',
      "kafka.auth.mode"='KAFKA_API_KEY',
      "kafka.topic"='users_topic',
      "output.data.format"='JSON_SR',
      "quickstart"='USERS',
      "tasks.max"='1'
    );
    
    CREATE OR REPLACE TABLE "users_table" (GENDER STRING, ID STRING PRIMARY KEY, REGIONID STRING, REGISTERTIME BIGINT, USERID STRING)
      WITH (kafka_topic='users_topic', partitions=1, key_format='KAFKA', value_format='JSON_SR');
    
  6. Click Apply changes.

    The Datagen Source connector and Topic components are imported and appear on the canvas. Your pipeline is ready to activate.

    Note

    When you import code, the canvas resets and all existing components are deleted.

  7. In the source code editor, click X to dismiss it.

    The Connector and Topic components appear on the canvas.

    Stream Designer connector and topic components in Confluent Cloud Console
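
Because the imported definition is plain ksqlDB SQL, you can edit it before applying changes. As a hypothetical variation that isn't part of this tutorial, you could point the imported pipeline at a differently named topic by editing the topic properties in both statements, for example:

    -- Hypothetical edit: register the table on a topic named users_topic_copy.
    -- The connector's "kafka.topic" property must be changed to match.
    CREATE OR REPLACE TABLE "users_table" (GENDER STRING, ID STRING PRIMARY KEY, REGIONID STRING, REGISTERTIME BIGINT, USERID STRING)
      WITH (kafka_topic='users_topic_copy', partitions=1, key_format='KAFKA', value_format='JSON_SR');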

Step 7: Activate the pipeline

In this step, you enable security for the pipeline and activate it.

  1. Click Activate to deploy the pipeline components.

    The Pipeline activation dialog opens.

  2. In the ksqlDB Cluster dropdown, select the ksqlDB cluster to use for your pipeline logic.

    Note

    If you don’t have a ksqlDB cluster yet, click Create new ksqlDB cluster to open the ksqlDB Cluster page and then click Add cluster. When you’ve finished creating the ksqlDB cluster, return to the Pipeline activation dialog and click the refresh button to see your new ksqlDB cluster in the dropdown.

  3. In the Activation privileges section, click Grant privileges.

  4. Click Confirm to activate your pipeline.

    After a few seconds, the state of each component goes from Activating to Activated.

    Note

    If a component reports an error like Did not find any value schema for the topic, wait for the Datagen source connector to finish provisioning, and then activate the pipeline again.

  5. Click the Topic component, and in the details page, click Messages to confirm that the Datagen Source connector is producing messages to the topic.

    Your output should resemble the following.

    Stream Designer showing messages flowing in Confluent Cloud Console
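
If you prefer to check the data from the ksqlDB editor instead of the Messages viewer, a push query like the following sketch streams rows as the connector produces them. The table name comes from this tutorial; EMIT CHANGES and LIMIT are standard ksqlDB:

    -- Stream the first five rows from the table registered on users_topic.
    -- EMIT CHANGES makes this a push query; LIMIT ends it after five rows.
    SELECT ID, REGISTERTIME, USERID, REGIONID, GENDER
      FROM "users_table"
      EMIT CHANGES
      LIMIT 5;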

Step 8: Deactivate the pipeline

To avoid incurring costs, deactivate the pipeline when you're finished with it. Deactivating deletes all resources that the pipeline created.

When you deactivate a pipeline, you have the option of retaining or deleting topics in the pipeline.

  1. Click the settings icon (settings-icon).

    The Pipeline Settings dialog opens.

  2. Click Deactivate pipeline to delete all resources created by the pipeline.

    The Revert pipeline to draft? dialog appears. Use the dropdowns to choose whether to delete or retain the listed topics. For this example, keep the Delete settings.

    Stream Designer showing the Revert dialog in Confluent Cloud Console
  3. Click Confirm and revert to draft to deactivate the pipeline and delete topics.

Step 9: Delete the pipeline

When all components have completed deactivation, you can delete the pipeline safely.

  1. Click the settings icon.

    The Pipeline Settings dialog opens.

  2. Click Delete pipeline. In the Delete pipeline dialog, enter “confirm” and click Confirm.

    The pipeline and associated resources are deleted, and you are returned to the Pipelines list.