Scripted |cp| Demo
==================

The scripted |cp| demo (``cp-demo``) example builds a full |cp| deployment with an |ak-tm| event streaming application that uses `ksqlDB `__ and :ref:`Kafka Streams ` for stream processing, with security enabled end-to-end across all components. The tutorial includes a module that extends the deployment into a hybrid one, running Cluster Linking and Schema Linking to copy data and schemas from a local on-premises |ak| cluster to |ccloud|, a fully managed service for |ak-tm|.

Follow the accompanying guided tutorial, broken down step by step, to learn how |ak| and |ccloud| work with |kconnect|, |sr-long|, |c3|, Cluster Linking, and security enabled end-to-end.

.. include:: ../../../includes/cp-cta.rst

========
Overview
========

Use Case
--------

The use case is an |ak-tm| event streaming application that processes real-time edits to real Wikipedia pages.

.. figure:: images/cp-demo-overview-with-ccloud.svg
   :alt: image

The full event streaming platform based on |cp| works as follows. Wikimedia's `EventStreams `__ publishes a continuous stream of real-time edits happening to real wiki pages. A Kafka source connector `kafka-connect-sse `__ streams the server-sent events (SSE) from https://stream.wikimedia.org/v2/stream/recentchange, and a custom |kconnect| transform `kafka-connect-json-schema `__ extracts the JSON from these messages, which is then written to a |ak| cluster. This example uses `ksqlDB `__ and a :ref:`Kafka Streams ` application for data processing. A Kafka sink connector `kafka-connect-elasticsearch `__ then streams the data out of Kafka and materializes it into `Elasticsearch `__ for analysis by `Kibana `__. All data is written with |sr-long| and Avro, and `Confluent Control Center `__ manages and monitors the deployment.
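To make the source side of this pipeline concrete, the following is a minimal, illustrative Python sketch of what the SSE source connector and JSON-extraction transform do conceptually: take one raw server-sent event and pull out its JSON payload. The ``parse_sse_event`` helper and the sample payload are hypothetical, not code from the connectors themselves.

```python
import json

def parse_sse_event(raw_event: str):
    """Extract the JSON payload from one server-sent event (SSE) block.

    Illustrative only: sketches, in spirit, what kafka-connect-sse plus the
    kafka-connect-json-schema transform do before records land in the
    ``wikipedia.parsed`` topic.
    """
    data_lines = []
    for line in raw_event.splitlines():
        if line.startswith("data:"):
            # SSE allows a payload to span multiple "data:" lines.
            data_lines.append(line[len("data:"):].strip())
    if not data_lines:
        return None  # comment-only or keep-alive event
    return json.loads("\n".join(data_lines))

# A trimmed, hypothetical recentchange event (small subset of real fields).
raw = (
    "event: message\n"
    'data: {"title": "Some page", "type": "edit",'
    ' "server_name": "en.wikipedia.org", "bot": false}\n'
)

edit = parse_sse_event(raw)
print(edit["server_name"])  # en.wikipedia.org
```

In the actual demo this parsing happens inside |kconnect|, so no application code is required; the sketch only shows the shape of the data entering ``wikipedia.parsed``.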
Data Pattern
------------

The data pattern is as follows:

+-------------------------------------+--------------------------------+---------------------------------------+
| Components                          | Consumes From                  | Produces To                           |
+=====================================+================================+=======================================+
| SSE source connector                | Wikipedia                      | ``wikipedia.parsed``                  |
+-------------------------------------+--------------------------------+---------------------------------------+
| ksqlDB                              | ``wikipedia.parsed``           | ksqlDB streams and tables             |
+-------------------------------------+--------------------------------+---------------------------------------+
| Kafka Streams application           | ``wikipedia.parsed``           | ``wikipedia.parsed.count-by-domain``  |
+-------------------------------------+--------------------------------+---------------------------------------+
| Elasticsearch sink connector        | ``WIKIPEDIABOT`` (from ksqlDB) | Elasticsearch/Kibana                  |
+-------------------------------------+--------------------------------+---------------------------------------+

How to use this tutorial
------------------------

We suggest following the ``cp-demo`` tutorial in order:

#. :ref:`cp-demo-on-prem-tutorial`: bring up the on-premises |ak| cluster and explore the different technical areas of |cp|
#. :ref:`cp-demo-hybrid`: create a cluster link to copy data from a local on-premises |ak| cluster to |ccloud|, and use the Metrics API to monitor both
#. :ref:`cp-demo-teardown`: clean up your on-premises and |ccloud| environment
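The Kafka Streams row in the data pattern table consumes ``wikipedia.parsed`` and produces running counts to ``wikipedia.parsed.count-by-domain``. The demo's actual application is a Java Kafka Streams topology; the following Python sketch only illustrates the aggregation it performs, under the assumption that each event carries a ``server_name`` field as in the Wikimedia recentchange schema.

```python
from collections import Counter

def count_by_domain(edit_events):
    """Running count of edits per wiki domain.

    Illustrative sketch of the aggregation between ``wikipedia.parsed``
    and ``wikipedia.parsed.count-by-domain``; not the demo's Kafka
    Streams code, which emits each updated (domain, count) pair
    downstream as it arrives.
    """
    counts = Counter()
    for event in edit_events:
        counts[event["server_name"]] += 1
    return dict(counts)

# Hypothetical sample events in the shape of parsed recentchange records.
events = [
    {"server_name": "en.wikipedia.org", "type": "edit"},
    {"server_name": "de.wikipedia.org", "type": "edit"},
    {"server_name": "en.wikipedia.org", "type": "edit"},
]
print(count_by_domain(events))  # {'en.wikipedia.org': 2, 'de.wikipedia.org': 1}
```

A stateful groupBy-and-count like this is the canonical "hello world" of stream processing, which is why the demo implements it twice: once in ksqlDB and once in Kafka Streams.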