Import a recipe into a pipeline

Stream Designer enables you to create a pipeline by importing a ksqlDB file that describes the pipeline's components.

You can find example pipelines, called recipes, at Confluent Developer. The following steps show how to import one of them: the Clickstream recipe.
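
A recipe is an ordinary ksqlDB SQL file, so an entire pipeline can be expressed in a few declarative statements. The following is a minimal, hypothetical sketch of the kind of SQL a recipe file contains; the connector name, topic, and columns are illustrative, not taken from any actual recipe.

    -- Hypothetical sketch of a recipe: a mock-data source plus a stream over its topic.
    -- All names here are illustrative.
    CREATE SOURCE CONNECTOR "DATAGEN_ORDERS" WITH (
      'connector.class'    = 'DatagenSource',  -- fully managed mock-data connector
      'kafka.topic'        = 'orders',         -- topic the connector writes to
      'output.data.format' = 'JSON',
      'quickstart'         = 'ORDERS',
      'tasks.max'          = '1'
    );

    -- Register a stream over the topic so downstream queries can read it.
    CREATE STREAM orders (order_id INT, total DOUBLE)
      WITH (KAFKA_TOPIC = 'orders', VALUE_FORMAT = 'JSON');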

Note

The ksqlDB recipes provide templates to get you started on production pipelines. Production data sources require additional configuration for your specific use cases.

Step 1: Create a pipeline project

A Stream Designer pipeline project defines all the components that are deployed for an application. In this step, you create a pipeline project and a canvas for designing the component graph.

  1. Log in to the Confluent Cloud Console and open the Cluster Overview page for the cluster you want to use for creating pipelines.

  2. In the navigation menu, click Stream Designer.

  3. Click Create pipeline, and in the Create new pipeline dialog, enter my-pipeline for the pipeline name.

    Stream Designer create new pipeline dialog in Confluent Cloud Console
  4. In the ksqlDB Cluster dropdown, select the ksqlDB cluster to use for your pipeline logic.

    Note

    If you don’t have a ksqlDB cluster yet, click Create new ksqlDB cluster to open the ksqlDB Cluster page and click Add cluster. When you’ve finished creating the ksqlDB cluster, return to the Create new pipeline dialog and click the refresh button to see your new ksqlDB cluster in the dropdown.

  5. Click Create pipeline.

    The Create a new pipeline page opens.

    Stream Designer create new pipeline page in Confluent Cloud Console
  6. Click Grant pipeline privileges.

  7. In the Grant activation privileges dialog, type “confirm” and click Confirm.

    The Activate pipeline and Deactivate pipeline buttons appear.

Step 2: Import the Clickstream recipe

The following steps show how to import the recipe for understanding user behavior with clickstream data.

  1. Click Use ksqlDB recipe and click Start building.

    The Browse ksqlDB recipes page opens.

    Stream Designer view of the ksqlDB recipes in Confluent Cloud Console
  2. In the CUSTOMER 360 section, click the Understand user behavior with clickstream data tile, and click Start with recipe.

    The code editor opens and shows the SQL for the recipe. An abridged sketch of the recipe's SQL appears after these steps.

    Stream Designer view of the source code editor in Confluent Cloud Console
  3. Click Apply changes to import the recipe.

    The canvas is populated with components.

    Stream Designer view of the Clickstream recipe in Confluent Cloud Console
  4. Click X to dismiss the code editor.
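
The imported SQL declares the full pipeline: the two Datagen sources, the intermediate streams and tables, and the Elasticsearch sink. The windowed aggregation that feeds the user_ip_activity topic looks roughly like the following abridged sketch; the source stream and column names are assumptions, so expect the actual recipe file to differ in detail.

    -- Abridged sketch of the aggregation behind the user_ip_activity topic.
    -- The source stream and column names are illustrative assumptions.
    CREATE TABLE user_ip_activity AS
      SELECT username, ip, city, COUNT(*) AS click_count
      FROM user_clickstream
      WINDOW TUMBLING (SIZE 60 SECONDS)
      GROUP BY username, ip, city
      HAVING COUNT(*) > 1;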

Step 3: Provision the Datagen source connectors

The pipeline starts with two Datagen source connectors that produce mock user and clickstream data. Before you activate the pipeline, the connectors require API keys to complete their provisioning.

  1. Click the DATAGEN_CLICKSTREAM connector to open the configuration dialog.

  2. In the Kafka credentials section, click the edit icon.

    Stream Designer view of the Connector config in Confluent Cloud Console
  3. Click Global access, and then click Generate API key & download to create the API key for the Datagen connector.

    A text file containing the newly generated API key and secret is downloaded to your local machine.

    Click Save changes, and then click Save. A sketch of the resulting connector configuration appears after these steps.

  4. Repeat the previous steps for the DATAGEN_CLICKSTREAM_USERS connector.

  5. Delete the sink connector, named RECIPE_ELASTICSEARCH_ANALYZED_CLICKSTREAM, which isn't used in this example. The USER_IP_ACTIVITY and ERRORS_PER_MIN_ALERT topics that fed it remain in the pipeline; you view their output after activation.

    Stream Designer view of the Clickstream recipe in Confluent Cloud Console
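
Under the hood, each Datagen connector is a piece of declarative configuration, and the API key you generated fills in its Kafka credentials. The following hedged sketch shows roughly what the DATAGEN_CLICKSTREAM provisioning amounts to in ksqlDB terms; the topic name and quickstart value are assumptions, and the placeholders stand for the key and secret in your downloaded file.

    -- Sketch of the provisioned source connector; values are illustrative.
    CREATE SOURCE CONNECTOR "DATAGEN_CLICKSTREAM" WITH (
      'connector.class'    = 'DatagenSource',
      'kafka.api.key'      = '<API_KEY>',      -- from the downloaded text file
      'kafka.api.secret'   = '<API_SECRET>',
      'kafka.topic'        = 'clickstream',    -- assumed topic name
      'output.data.format' = 'JSON',
      'quickstart'         = 'CLICKSTREAM',
      'tasks.max'          = '1'
    );

In the same model, deleting the unused sink connector corresponds to a DROP CONNECTOR statement.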

Step 4: Activate the pipeline

  1. Click Activate to deploy and start the pipeline.

    Pipeline activation begins. It may take a few minutes for all components to activate.

  2. When all components show the Activated state, click the user_ip_activity topic, and in the details view, click Messages.

    Your output should resemble:

    Stream Designer view of user IP messages recipe in Confluent Cloud Console
  3. Click the other topics in the pipeline to see the output from the different queries. If you prefer SQL to the Messages tab, see the sketch after these steps.
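
The following push query, run in the ksqlDB editor, shows the same output as the Messages tab. It assumes the recipe registered user_ip_activity as a queryable table, as sketched earlier.

    -- Stream a few rows from the aggregation as they update.
    SELECT * FROM user_ip_activity EMIT CHANGES LIMIT 5;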

Step 5: Deactivate the pipeline

To avoid incurring costs, click Deactivate pipeline to delete all resources created by the pipeline.
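
Conceptually, deactivation tears down the pipeline's runtime pieces: it stops the persistent queries, drops the connectors, and removes other resources the pipeline created, roughly as if you had run statements like the following in ksqlDB. The query ID shown is an illustrative assumption; real IDs are generated at activation time.

    -- Rough sketch of what deactivation does under the hood; IDs are illustrative.
    TERMINATE CTAS_USER_IP_ACTIVITY_0;
    DROP TABLE user_ip_activity DELETE TOPIC;
    DROP CONNECTOR "DATAGEN_CLICKSTREAM";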

Step 6: Delete the pipeline

When all components have completed deactivation, you can delete the pipeline safely.

  1. Click the settings icon.

    The Pipeline Settings dialog opens.

    Stream Designer pipeline settings dialog in Confluent Cloud Console
  2. Click Delete pipeline. In the Delete pipeline dialog, enter “confirm” and click Confirm.

    The pipeline and all associated resources are deleted, and you return to the Pipelines list.