Videos, Demos, and Reading Material¶
If you’re new to stream processing, Apache Kafka®, or Confluent Platform, here is a curated list of resources to get you started.
- Confluent YouTube channel
- The Confluent YouTube channel is a great resource for tutorials, how-tos, customer use cases, and general Confluent education. The content is updated frequently, so subscribing is recommended.
Event Streaming Platform¶
- The Event Streaming Platform Explained
- Jay Kreps discusses key trends and concepts that are at the heart of a revolution happening in data infrastructure and application architectures.
- Why Kafka?
- Learn why Kafka has become the foundation of event-driven, real-time architectures in the enterprise.
- Neha Narkhede | Kafka Summit 2018 Keynote (The Present and Future of the Streaming Platform)
- Neha Narkhede is co-founder and CTO at Confluent. Prior to founding Confluent, Neha led streams infrastructure at LinkedIn, where she was responsible for LinkedIn’s streaming infrastructure built on top of Kafka and Apache Samza.
- Confluent Essentials for Apache Kafka
- In this free one-hour video, learn about the basic concepts behind the use cases and functionality of Apache Kafka®. (In the link above, scroll to the target video.)
- Microservices Explained by Confluent
- Microservices architectures enable organizations to evolve their systems away from the slow and unresponsive shared-state architectures of the past. Confluent provides a streaming platform for incorporating data in flight into a lightweight, efficient, and responsive microservices architecture.
- Intro to Streams | Kafka Streams API
- The Streams API of Apache Kafka is the easiest way to write mission-critical real-time applications and microservices with all the benefits of Kafka’s server-side cluster technology. It allows you to build standard Java or Scala applications.
- Intro to KSQL | Streaming SQL for Apache Kafka
- Confluent KSQL is the streaming SQL engine that implements continuous, interactive queries against Kafka. KSQL makes it easy to read, write, and process streaming data in real time, at scale, using SQL-like semantics.
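For a sense of those SQL-like semantics, here is a minimal KSQL sketch; the topic, stream, and column names are illustrative, not taken from the video:

```sql
-- Register an existing Kafka topic as a KSQL stream:
CREATE STREAM pageviews (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
  WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');

-- A continuous query: unlike a traditional database query, this keeps
-- running and emits matching rows as new events arrive on the topic.
SELECT userid, pageid FROM pageviews WHERE pageid LIKE 'Page_1%';
```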
GitHub Repos for Examples¶
- Main Repo with 20+ Demos
- End-to-end demos and examples showcasing stream processing on Confluent Platform.
- Kafka Streams Examples
- Demo applications and code examples for Kafka’s Streams API.
- Kafka Client Application Examples
- Client examples for Java producers, Java consumers, and other clients connecting to on-prem or Confluent Cloud.
The best demo to start with is Kafka Event Streaming Application, also known as cp-demo. This demo spins up a Kafka event streaming application that uses KSQL for stream processing and has many security features enabled. It forms an end-to-end streaming ETL pipeline, with a source connector pulling from live IRC channels and a sink connector writing to Elasticsearch and Kibana for visualization. cp-demo also comes with a playbook and is a great configuration reference for Confluent Platform.
- Kafka Event Streaming Application
- This demo shows you how to deploy an Apache Kafka® streaming ETL pipeline that uses KSQL for stream processing and Control Center for monitoring, with security enabled end-to-end. You can follow along with the playbook and watch the video tutorials.
- Stream Processing to Confluent Cloud
- This Confluent Cloud demo is the automated version of the KSQL Tutorial, but instead of running KSQL stream processing on your local installation, it runs against your Confluent Cloud cluster. You can follow along with the playbook.
- Security Tutorial
- This tutorial is a step-by-step guide to configuring Confluent Platform with SSL encryption, SASL authentication, and authorization.
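For orientation before starting the tutorial, a broker secured in this way combines settings along the following lines; the hostnames, file paths, and passwords below are placeholders, not values from the tutorial itself:

```properties
# Listener that requires TLS (SSL) encryption plus SASL authentication
listeners=SASL_SSL://kafka1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN

# TLS keystore and truststore (placeholder paths and passwords)
ssl.keystore.location=/etc/kafka/secrets/kafka.broker.keystore.jks
ssl.keystore.password=changeme
ssl.truststore.location=/etc/kafka/secrets/kafka.broker.truststore.jks
ssl.truststore.password=changeme

# Authorization via Kafka ACLs
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
super.users=User:admin
```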
- Designing Event Driven Systems (Microservices) Demo
- This project goes hand in hand with the book ‘Designing Event-Driven Systems’, demonstrating how to build a small microservices application with Kafka and Kafka Streams.
- Real-Time Streaming ETL from Oracle Transactional Data
- Replace batch extracts with event streams, and batch transformations with in-flight transformation of those streams. Take a stream of data from a transactional system built on Oracle, transform it, and stream it into Elasticsearch. Use KSQL to filter streams of events from the database in real time, join events from two database tables, and create rolling aggregates on the data.
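The join-and-aggregate pattern this demo describes can be sketched in KSQL as one table's change stream joined to another table materialized as a KSQL table, followed by a windowed aggregate; all stream, table, and column names here are illustrative:

```sql
-- Enrich the stream of order change events by joining it to a
-- customers table materialized in KSQL:
CREATE STREAM orders_enriched AS
  SELECT o.order_id, o.order_total, c.region
  FROM orders o
  LEFT JOIN customers c ON o.customer_id = c.customer_id;

-- Rolling aggregate: a per-region order count over one-minute windows,
-- updated continuously as new events arrive.
CREATE TABLE orders_per_region AS
  SELECT region, COUNT(*) AS order_count
  FROM orders_enriched
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY region;
```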
- Write a User Defined Function (UDF) for KSQL
- Build, deploy, and test a user-defined function (UDF) to extend the set of available functions in your KSQL code. Write Java code within the UDF to convert a timestamp from String to BigInt.
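Once built and deployed, a UDF is invoked like any built-in function. A sketch of such a call, assuming a hypothetical `TIMESTAMP_TO_BIGINT` function name and an illustrative `events` stream:

```sql
-- Call the custom UDF (hypothetical name) to convert a String
-- timestamp column into a BigInt epoch value in a derived stream:
CREATE STREAM events_epoch AS
  SELECT userid, TIMESTAMP_TO_BIGINT(event_time) AS event_epoch
  FROM events;
```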
- Monitoring Kafka in Confluent Control Center
- Use the Confluent KSQL CLI and Confluent Control Center to view streams and throughput of incoming records for persistent KSQL queries.