Import a Recipe Into a Pipeline for Stream Designer on Confluent Cloud¶
Stream Designer enables you to create a pipeline by importing a ksqlDB SQL file that describes it.
You can find example pipelines at Confluent Developer. The following list shows just a few of the recipes that you can import.
- Detect unusual credit card activity
- Understand user behavior with clickstream data
- Monitor security threats by analyzing and filtering audit logs
- Analyze political campaign fundraising performance
- Enrich orders with change data capture (CDC)
- Identify offline devices via IoT data
- Notify passengers of flight updates
- Integrate legacy messaging systems with Kafka
Note
The ksqlDB recipes provide templates to get you started on production pipelines. Production data sources require additional configuration for your specific use cases.
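An importable pipeline definition is ordinary ksqlDB SQL: each stream, table, persistent query, or connector statement becomes a component on the canvas. The following minimal sketch shows the general shape of such a file; the stream, topic, and column names are illustrative and aren't taken from any particular recipe.

```sql
-- Minimal sketch of an importable pipeline definition (illustrative names).
-- Registering a stream on a Kafka topic produces topic and stream components;
-- the persistent query below produces a filter component and an output topic.
CREATE STREAM orders (order_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON', PARTITIONS=1);

CREATE STREAM large_orders
  WITH (KAFKA_TOPIC='large_orders', VALUE_FORMAT='JSON') AS
  SELECT order_id, amount
  FROM orders
  WHERE amount > 100
  EMIT CHANGES;
```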
Step 1: Create a pipeline project¶
A Stream Designer pipeline project defines all the components that are deployed for an application. In this step, you create a pipeline project and a canvas for designing the component graph.
Log in to the Confluent Cloud Console and open the Cluster Overview page for the cluster you want to use for creating pipelines.
In the navigation menu, click Stream Designer.
Click Create pipeline.
The Create a new pipeline page opens.
Step 2: Import a recipe¶
Click Use ksqlDB recipe and click Start building.
The Browse ksqlDB recipes page opens.
Click the Understand user behavior with clickstream data tile and then click Start with recipe.
The code editor opens and shows the SQL for the recipe.
Click Apply changes to import the recipe.
The Stream Designer canvas opens and is populated with components.
Click X to dismiss the code editor.
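The imported SQL is the source of truth for the pipeline. The following simplified excerpt is only in the spirit of the clickstream recipe; the columns and window size are illustrative, and the actual statements appear in the code editor.

```sql
-- Simplified, illustrative excerpt; the imported recipe SQL may differ.
CREATE STREAM clickstream (ip VARCHAR, userid INT, request VARCHAR, status INT)
  WITH (KAFKA_TOPIC='clickstream', VALUE_FORMAT='JSON');

-- Aggregate clicks per IP address into the USER_IP_ACTIVITY topic.
CREATE TABLE user_ip_activity
  WITH (KAFKA_TOPIC='USER_IP_ACTIVITY', VALUE_FORMAT='JSON') AS
  SELECT ip, COUNT(*) AS views
  FROM clickstream
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY ip
  EMIT CHANGES;
```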
Step 3: Provision the Datagen source connectors¶
The pipeline starts with two Datagen source connectors that produce mock user and clickstream data. Before you activate the pipeline, the connectors require API keys to complete their provisioning.
Click the DATAGEN_CLICKSTREAM connector to open the configuration dialog.
In the Kafka credentials section, click the edit icon.
Click Global access and then click Generate API key & download to create the API key for the Datagen connector.
A text file containing the newly generated API key and secret is downloaded to your local machine.
Click Save changes and then click Save.
Repeat the previous steps for the DATAGEN_CLICKSTREAM_USERS connector.
Delete the sink connector, named RECIPE_ELASTICSEARCH_ANALYZED_CLICKSTREAM, which isn’t used in this example. Also, delete the corresponding USER_IP_ACTIVITY and ERRORS_PER_MIN_ALERT topics.
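For reference, the same kind of connector can be declared in ksqlDB SQL. The sketch below shows roughly what the DATAGEN_CLICKSTREAM configuration corresponds to; the property values are illustrative, and in Stream Designer you supply the API key and secret through the configuration dialog instead.

```sql
-- Rough, illustrative equivalent of the Datagen connector configuration.
-- In this tutorial the API key and secret come from the configuration dialog.
CREATE SOURCE CONNECTOR "DATAGEN_CLICKSTREAM" WITH (
  'connector.class'    = 'DatagenSource',
  'kafka.api.key'      = '<api-key>',
  'kafka.api.secret'   = '<api-secret>',
  'kafka.topic'        = 'clickstream',
  'output.data.format' = 'JSON',
  'quickstart'         = 'CLICKSTREAM',
  'tasks.max'          = '1'
);
```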
Step 4: Activate the pipeline¶
In this step, you enable security for the pipeline and activate it.
Click Activate to deploy the pipeline components.
The Pipeline activation dialog opens.
In the ksqlDB Cluster dropdown, select the ksqlDB cluster to use for your pipeline logic.
Note
If you don’t have a ksqlDB cluster yet, click Create new ksqlDB cluster to open the ksqlDB Cluster page and then click Add cluster. When you’ve finished creating the ksqlDB cluster, return to the Pipeline activation dialog and click the refresh button to see your new ksqlDB cluster in the dropdown.
In the Activation privileges section, click Grant privileges.
Click Confirm to activate your pipeline.
After a few seconds, the state of each component goes from Activating to Activated.
Note
If the filter component reports an error like "Did not find any value schema for the topic", wait for the Datagen source connector to finish provisioning and activate the pipeline again.
When all components show the Activated state, click the user_ip_activity topic, and in the details view, click Messages.
Records produced by the pipeline stream into the Messages view.
Click the other topics in the pipeline to see the output from different queries.
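If you prefer SQL, a push query in the ksqlDB editor shows the same live output. This is a sketch that assumes the recipe's aggregate table is named user_ip_activity:

```sql
-- Push query (sketch): continuously emit new rows from the aggregate table.
SELECT * FROM user_ip_activity EMIT CHANGES;
```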
Step 5: Deactivate the pipeline¶
To avoid incurring costs, deactivate the pipeline, which deletes all of the resources it created.
When you deactivate a pipeline, you have the option of retaining or deleting topics in the pipeline.
Click the settings icon.
The Pipeline Settings dialog opens.
Click Deactivate pipeline to delete all resources created by the pipeline.
The Revert pipeline to draft? dialog appears. Click the dropdowns to delete or retain the listed topics. For this example, keep the Delete settings.
Click Confirm and revert to draft to deactivate the pipeline and delete topics.
Step 6: Delete the pipeline¶
When all components have completed deactivation, you can delete the pipeline safely.
Click the settings icon.
The Pipeline Settings dialog opens.
Click Delete pipeline. In the Delete pipeline dialog, enter “confirm” and click Confirm.
The pipeline and associated resources are deleted. You are returned to the Pipelines list.