Build Applications for Kafka
Use these resources to develop, test, and deploy your streaming applications on Confluent Platform.
- Schema Management
- Command Line Tools
- Kafka Clients
- API and Javadocs
- Application Development
- Docker Developer Guide
- Streams Developer Guide
- Writing a Streams Application
- Testing Streams Code
- Configuring a Streams Application
- Streams DSL
- Naming Kafka Streams DSL Topologies
- Optimizing Kafka Streams Topologies
- Processor API
- Data Types and Serialization
- Interactive Queries
- Memory Management
- Running Streams Applications
- Managing Streams Application Topics
- Streams Security
- Application Reset Tool
- Connector Developer Guide
- Confluent Hub
- MQTT Proxy
- ksqlDB Developer Guide
- Pipelining with Kafka Connect and Kafka Streams
- Overview
- Description of Data
- Demo Prerequisites
- Run the Demo
- Example 1: Kafka console producer -> Key:String and Value:String
- Example 2: JDBC source connector with Single Message Transformations -> Key:Long and Value:JSON
- Example 3: JDBC source connector with SpecificAvro -> Key:String(null) and Value:SpecificAvro
- Example 4: JDBC source connector with GenericAvro -> Key:String(null) and Value:GenericAvro
- Example 5: Java client producer with SpecificAvro -> Key:Long and Value:SpecificAvro
- Example 6: JDBC source connector with Avro to ksqlDB -> Key:Long and Value:Avro
- Technical Notes