You are viewing documentation for an older version of Confluent Platform.
Kafka Streams is a client library for building applications and microservices, where the input and output data are stored in an Apache Kafka® cluster. It combines the simplicity of writing and deploying standard Java and Scala applications on the client side with the benefits of Kafka’s server-side cluster technology.
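To illustrate the "client library" model described above, here is a minimal sketch of a Kafka Streams application that reads records from one topic, upper-cases each value, and writes the results to another topic. The topic names, application id, and bootstrap server address are placeholders for this example, not values taken from this documentation:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;

public class UppercaseApp {

    // Build the processing topology: input-topic -> uppercase -> output-topic.
    // (Topic names here are placeholders.)
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("input-topic")
               .mapValues(value -> value.toUpperCase())
               .to("output-topic");
        return builder.build();
    }

    public static void main(String[] args) {
        // Placeholder configuration: adjust application.id and
        // bootstrap.servers for your own cluster.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG,
                  Serdes.String().getClass());

        // The application runs as an ordinary Java process; partitioning,
        // scaling, and fault tolerance are coordinated through the Kafka
        // cluster itself.
        KafkaStreams streams = new KafkaStreams(buildTopology(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because the topology is a plain Java object, it can be inspected (via `Topology#describe()`) or unit-tested without a running broker; only `streams.start()` requires a reachable cluster.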
- Quick Start Guide
  - The Tutorial: Creating a Streaming Data Pipeline demonstrates how to run your first Java application that uses the Kafka Streams library by showcasing a simple end-to-end data pipeline powered by Kafka.
- Streams API Screencasts
  - Watch the Intro to Streams API on YouTube.
- Tutorial: Creating a Streaming Data Pipeline
- Designing Event Driven Systems
- Kafka Streams Demo Application
- Tutorial: Introduction to Streaming Application Development
- Additional Resources
- Connecting Kafka Streams to Confluent Cloud
- Streams Concepts
- Streams Architecture
- Streams Code Examples
- Streams Developer Guide
- Writing a Streams Application
- Testing Streams Code
- Configuring a Streams Application
- Streams DSL
- Optimizing Kafka Streams Topologies
- Processor API
- Data Types and Serialization
- Interactive Queries
- Memory Management
- Running Streams Applications
- Managing Streams Application Topics
- Streams Security
- Application Reset Tool
- Streams Operations
- Streams Upgrade Guide
- Upgrading from Confluent Platform 5.0.x (Kafka 2.0.x-cp1) to Confluent Platform 5.1.0 (Kafka 2.1.0-cp1)
- Upgrading older Kafka Streams applications to Confluent Platform 5.1.0
- API changes (from Confluent Platform 4.1 to Confluent Platform 5.0)
- API changes (from Confluent Platform 4.0 to Confluent Platform 4.1)
- API changes (from Confluent Platform 3.3 to Confluent Platform 4.0)
- API changes (from Confluent Platform 3.2 to Confluent Platform 3.3)
- API changes (from Confluent Platform 3.1 to Confluent Platform 3.2)
- API changes (from Confluent Platform 3.0 to Confluent Platform 3.1)
- Streams FAQ
- Is Kafka Streams a project separate from Kafka?
- Is Kafka Streams a proprietary library of Confluent?
- Do Kafka Streams applications run inside the Kafka brokers?
- Why Does My Kafka Streams Application Use So Much Memory?
- What are the system dependencies of Kafka Streams?
- How do I migrate my older Kafka Streams applications to the latest Confluent Platform version?
- Which versions of Kafka clusters are supported by Kafka Streams?
- What programming languages are supported?
- Why is my application re-processing data from the beginning?
- Failure and exception handling
- Interactive Queries
- Troubleshooting and debugging
- Easier to interpret Java stacktraces?
- Visualizing topologies?
- Inspecting streams and tables?
- Invalid Timestamp Exception
- Why do I get an IllegalStateException when accessing record metadata?
- Why is punctuate() not called?
- Scala: compile error “no type parameter”, “Java-defined trait is invariant in type T”
- How can I convert a KStream to a KTable without an aggregation step?
- RocksDB behavior in 1-core environments
- Streams Javadocs