Import a Recipe Into a Pipeline for Stream Designer on Confluent Cloud

Stream Designer enables you to create a pipeline by importing a ksqlDB file that describes its components.

You can find example pipelines, known as recipes, at Confluent Developer.

Note

The ksqlDB recipes provide templates to get you started on production pipelines. Production data sources require additional configuration for your specific use cases.

Step 1: Create a pipeline project

A Stream Designer pipeline project defines all the components that are deployed for an application. In this step, you create a pipeline project and a canvas for designing the component graph.

  1. Log in to the Confluent Cloud Console and open the Cluster Overview page for the cluster you want to use for creating pipelines.

  2. In the navigation menu, click Stream Designer.

  3. Click Create pipeline.

    The Create a new pipeline page opens.

    Stream Designer Create New Pipeline page in Confluent Cloud Console

Step 2: Import a recipe

  1. Click Use ksqlDB recipe and click Start building.

    The Browse ksqlDB recipes page opens.

  2. Click the Understand user behavior with clickstream data tile and then click Start with recipe.

    The code editor opens and shows the SQL for the recipe.

    Stream Designer view of the source code editor in Confluent Cloud Console
  3. Click Apply changes to import the recipe.

    The Stream Designer canvas opens and is populated with components.

    Stream Designer view of the Clickstream recipe in Confluent Cloud Console
  4. Click X to dismiss the code editor.
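The code editor shows the recipe as a sequence of ksqlDB statements. As a rough sketch of what such a recipe contains (the stream name, columns, and aggregation below are illustrative, not the recipe's exact definitions — see the actual SQL in the editor):

```sql
-- Register the raw click events produced by the Datagen connector
-- (columns here are illustrative).
CREATE STREAM clickstream (
  ip VARCHAR,
  userid INT,
  request VARCHAR,
  status INT
) WITH (
  KAFKA_TOPIC = 'clickstream',
  VALUE_FORMAT = 'JSON'
);

-- Derive per-IP activity with a windowed aggregation.
CREATE TABLE user_ip_activity AS
  SELECT ip, COUNT(*) AS events
  FROM clickstream
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY ip;
```

Each statement in the file becomes a component on the canvas when you apply the changes.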

Step 3: Provision the Datagen source connectors

The pipeline starts with two Datagen source connectors that produce mock user data and clicks. Before you activate the pipeline, the connectors require API keys to complete their provisioning.

  1. Click the DATAGEN_CLICKSTREAM connector to open the configuration dialog.

  2. In the Kafka credentials section, click the edit icon.

    Stream Designer view of the Connector config in Confluent Cloud Console
  3. Click Global access and Generate API key & download to create the API key for the Datagen connector.

    A text file containing the newly generated API key and secret is downloaded to your local machine.

    Click Save changes, and then click Save.

  4. Repeat the previous steps for the DATAGEN_CLICKSTREAM_USERS connector.

  5. Delete the sink connector, named RECIPE_ELASTICSEARCH_ANALYZED_CLICKSTREAM, which isn’t used in this example. Also, delete the corresponding USER_IP_ACTIVITY and ERRORS_PER_MIN_ALERT topics.

    Stream Designer view of the Clickstream recipe in Confluent Cloud Console
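Under the hood, a connector component corresponds to a ksqlDB CREATE SOURCE CONNECTOR statement, with the API key and secret you generated supplied as connector properties. A hypothetical sketch, assuming the standard Confluent Cloud Datagen connector properties (Stream Designer generates the equivalent configuration for you):

```sql
-- Hypothetical sketch of a Datagen source connector definition.
CREATE SOURCE CONNECTOR "DATAGEN_CLICKSTREAM" WITH (
  'connector.class'    = 'DatagenSource',
  'kafka.api.key'      = '<api-key>',     -- the key you generated above
  'kafka.api.secret'   = '<api-secret>',
  'kafka.topic'        = 'clickstream',
  'output.data.format' = 'JSON',
  'quickstart'         = 'CLICKSTREAM',
  'tasks.max'          = '1'
);
```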

Step 4: Activate the pipeline

In this step, you enable security for the pipeline and activate it.

  1. Click Activate to deploy the pipeline components.

    The Pipeline activation dialog opens.

  2. In the ksqlDB Cluster dropdown, select the ksqlDB cluster to use for your pipeline logic.

    Note

    If you don’t have a ksqlDB cluster yet, click Create new ksqlDB cluster to open the ksqlDB Cluster page and then click Add cluster. When you’ve finished creating the ksqlDB cluster, return to the Create new pipeline dialog and click the refresh button to see your new ksqlDB cluster in the dropdown.

  3. In the Activation privileges section, click Grant privileges.

  4. Click Confirm to activate your pipeline.

    After a few seconds, the state of each component goes from Activating to Activated.

    Note

    If the filter component reports an error like Did not find any value schema for the topic, wait for the Datagen source connector to provision completely and activate the pipeline again.

  5. When all components show the Activated state, click the user_ip_activity topic, and in the details view, click Messages.

    Your output should resemble:

    Stream Designer view of user IP messages recipe in Confluent Cloud Console
  6. Click the other topics in the pipeline to see the output from different queries.
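Besides browsing messages in the Console, you can inspect the same data with a ksqlDB push query in the ksqlDB editor. A minimal sketch, assuming the table name shown on the canvas:

```sql
-- Stream new rows as they arrive; LIMIT ends the push query after five rows.
SELECT * FROM user_ip_activity EMIT CHANGES LIMIT 5;
```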

Step 5: Deactivate the pipeline

To avoid incurring costs, deactivate the pipeline, which deletes all resources the pipeline created.

When you deactivate a pipeline, you have the option of retaining or deleting topics in the pipeline.

  1. Click the settings icon (settings-icon).

    The Pipeline Settings dialog opens.

  2. Click Deactivate pipeline to delete all resources created by the pipeline.

    The Revert pipeline to draft? dialog appears. Click the dropdowns to delete or retain the listed topics. For this example, keep the Delete settings.

    Stream Designer showing the Revert dialog in Confluent Cloud Console
  3. Click Confirm and revert to draft to deactivate the pipeline and delete topics.

Step 6: Delete the pipeline

When all components have completed deactivation, you can delete the pipeline safely.

  1. Click the settings icon.

    The Pipeline Settings dialog opens.

    Stream Designer showing filtered messages flowing in Confluent Cloud Console
  2. Click Delete pipeline. In the Delete pipeline dialog, enter “confirm” and click Confirm.

  3. The pipeline and associated resources are deleted. You are returned to the Pipelines list.