Kafka Streams
Kafka Streams is a client library for building applications and microservices, where the input and output data are stored in a Kafka cluster. It combines the simplicity of writing and deploying standard Java and Scala applications on the client side with the benefits of Kafka’s server-side cluster technology.
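To make the "standard Java application" point concrete, the following is a minimal sketch of a Streams application. It is illustrative only: the class name, topic names (`plaintext-input`, `uppercase-output`), and the `localhost:9092` broker address are assumptions, not part of this documentation. The app builds a topology that upper-cases each record value and writes it to an output topic.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseExample {
    public static void main(String[] args) {
        // Basic configuration; the application id and broker address are
        // placeholder values for this sketch.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-example");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Define the processing topology: read, transform, write.
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("plaintext-input");
        input.mapValues(value -> value.toUpperCase())
             .to("uppercase-output");

        Topology topology = builder.build();
        System.out.println(topology.describe());

        // Start the application; it runs as an ordinary JVM process,
        // with all cluster coordination handled by Kafka itself.
        KafkaStreams streams = new KafkaStreams(topology, props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Because Kafka Streams is just a library, this program is packaged and launched like any other Java application; no separate processing cluster is required.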
- Quick Start Guide: The Tutorial: Creating a Streaming Data Pipeline demonstrates how to run your first Java application that uses the Kafka Streams library by showcasing a simple end-to-end data pipeline powered by Kafka.
- Streams API Screencasts: Watch the Intro to Streams API on YouTube.
Contents
- Introduction
- Tutorial: Creating a Streaming Data Pipeline
- Kafka Streams Demo Application
- Connecting Kafka Streams to Confluent Cloud
- Streams Concepts
- Streams Architecture
- Streams Code Examples
- Streams Developer Guide
- Streams Operations
- Streams Upgrade Guide
- Upgrading from Confluent Platform 4.1.x (Kafka 1.1.x-cp1) to Confluent Platform 5.0.0 (Kafka 2.0.0-cp1)
- Upgrading older Kafka Streams applications to Confluent Platform 5.0.0
- API changes (from Confluent Platform 4.0 to Confluent Platform 4.1)
- API changes (from Confluent Platform 3.3 to Confluent Platform 4.0)
- API changes (from Confluent Platform 3.2 to Confluent Platform 3.3)
- API changes (from Confluent Platform 3.1 to Confluent Platform 3.2)
- API changes (from Confluent Platform 3.0 to Confluent Platform 3.1)
- Streams FAQ
- General
- Is Kafka Streams a project separate from Apache Kafka?
- Is Kafka Streams a proprietary library of Confluent?
- Do Kafka Streams applications run inside the Kafka brokers?
- Why does my Kafka Streams application use so much memory?
- What are the system dependencies of Kafka Streams?
- How do I migrate my older Kafka Streams applications to the latest Confluent Platform version?
- Which versions of Kafka clusters are supported by Kafka Streams?
- What programming languages are supported?
- Why is my application re-processing data from the beginning?
- Scalability
- Processing
- Failure and exception handling
- Interactive Queries
- Security
- Troubleshooting and debugging
- Easier to interpret Java stacktraces?
- Visualizing topologies?
- Inspecting streams and tables?
- Invalid Timestamp Exception
- Why do I get an IllegalStateException when accessing record metadata?
- Why is punctuate() not called?
- Scala: compile error “no type parameter”, “Java-defined trait is invariant in type T”
- How can I convert a KStream to a KTable without an aggregation step?
- RocksDB behavior in 1-core environments
- Streams Javadocs