Connect self-managed Kafka clients to Confluent Cloud

You can connect your existing self-managed Kafka clients and applications to Confluent Cloud over secure, authenticated connections. Confluent provides client libraries for seamless integration in various languages. The sections below describe how to connect these clients to Confluent Cloud.

For additional examples and demos using these and other clients, see More examples and demos.

Prerequisites and security requirements

For comprehensive information about TLS encryption, certificate management, prerequisites, and security requirements for connecting clients to Confluent Cloud, see Client Configuration Properties in the Build Streaming Applications section. This includes:

  • TLS encryption and SNI extension requirements
  • Certificate management and pinning guidelines
  • librdkafka version-specific configuration guidance
  • Java version requirements
  • Prerequisites for different client types

The information below provides specific configuration examples for Java, Python, Go, JavaScript, .NET, and C/C++ self-managed clients connecting to Confluent Cloud.

Connect a Java application to Confluent Cloud

To configure Java clients for Kafka to connect to a Kafka cluster in Confluent Cloud:

  1. Add the Kafka client dependency to your project. For Maven:

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>${kafka.clients.version}</version>
    </dependency>
    

    For Gradle:

    implementation "org.apache.kafka:kafka-clients:${kafkaClientsVersion}"
    

    Use a current, supported version per your build’s BOM or dependency management policy.

  2. Configure your Java application with the connection properties. You can obtain these from the Confluent Cloud Console by selecting your cluster and clicking Clients.

  3. Use the configuration in your producer or consumer code:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.common.serialization.StringSerializer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "your-bootstrap-servers");
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username='your-api-key' password='your-api-secret';");

    // Serializers are required to instantiate a producer
    props.put("key.serializer", StringSerializer.class.getName());
    props.put("value.serializer", StringSerializer.class.getName());

    // Create producer or consumer
    KafkaProducer<String, String> producer = new KafkaProducer<>(props);
    
  4. See the Java client examples for complete working examples. A minimal send-and-close sketch also follows this procedure.

  5. Integrate with your environment.
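
With the producer configured as shown above, a minimal send-and-close sketch looks like the following. The topic name my-topic is a placeholder used for illustration, and ProducerRecord comes from the org.apache.kafka.clients.producer package.

    import org.apache.kafka.clients.producer.ProducerRecord;

    // "my-topic" is a placeholder; the topic must already exist in your cluster
    producer.send(new ProducerRecord<>("my-topic", "key", "value"));

    // Flush buffered records and release client resources before exiting
    producer.flush();
    producer.close();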

Connect a Python application to Confluent Cloud

To configure the Confluent Python Client for Kafka to connect to a Kafka cluster in Confluent Cloud:

  1. Install the Confluent Python Client for Apache Kafka.
  2. Customize the Python Confluent Cloud example for your Confluent Cloud cluster, specifically bootstrap.servers, sasl.username, and sasl.password. A minimal configuration sketch follows this list.
  3. Integrate with your environment.
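
For orientation, here is a minimal sketch of a Confluent Cloud producer configuration using the confluent-kafka Python package. The placeholder values and the topic name my-topic are illustrative and are not taken from the official example.

    from confluent_kafka import Producer

    # Placeholder values; copy the real ones from the Confluent Cloud Console "Clients" page
    config = {
        "bootstrap.servers": "<BOOTSTRAP_SERVERS>",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "<API_KEY>",
        "sasl.password": "<API_SECRET>",
    }

    producer = Producer(config)

    # "my-topic" is a placeholder; the topic must already exist in your cluster
    producer.produce("my-topic", key="key", value="hello")
    producer.flush()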

Connect a Go application to Confluent Cloud

To configure the Confluent Golang Client for Kafka to connect to a Kafka cluster in Confluent Cloud:

  1. Install the Confluent Golang Client for Apache Kafka.

    The Confluent Golang Client for Kafka depends on librdkafka, which must be installed separately.

  2. Customize the Golang Confluent Cloud example for your Confluent Cloud cluster, specifically bootstrap.servers, sasl.username, and sasl.password. A minimal configuration sketch follows this list.

  3. Integrate with your environment.
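
For orientation, here is a minimal sketch of a Confluent Cloud producer in Go, assuming the v2 module path of the Confluent Golang Client (github.com/confluentinc/confluent-kafka-go/v2/kafka). The placeholder credentials and the topic name my-topic are illustrative.

    package main

    import (
        "fmt"

        "github.com/confluentinc/confluent-kafka-go/v2/kafka"
    )

    func main() {
        // Placeholder values; copy the real ones from the Confluent Cloud Console "Clients" page
        p, err := kafka.NewProducer(&kafka.ConfigMap{
            "bootstrap.servers": "<BOOTSTRAP_SERVERS>",
            "security.protocol": "SASL_SSL",
            "sasl.mechanisms":   "PLAIN",
            "sasl.username":     "<API_KEY>",
            "sasl.password":     "<API_SECRET>",
        })
        if err != nil {
            fmt.Printf("Failed to create producer: %v\n", err)
            return
        }
        defer p.Close()

        // "my-topic" is a placeholder; the topic must already exist in your cluster
        topic := "my-topic"
        _ = p.Produce(&kafka.Message{
            TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
            Value:          []byte("hello"),
        }, nil)

        // Wait up to 15 seconds for outstanding deliveries before exiting
        p.Flush(15 * 1000)
    }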

Connect a JavaScript application to Confluent Cloud

To configure JavaScript clients for Kafka to connect to a Kafka cluster in Confluent Cloud:

  1. Install the Confluent JavaScript client for Kafka:

    npm install @confluentinc/kafka-javascript
    
  2. Configure your JavaScript application with the connection properties. You can obtain these from the Confluent Cloud Console by selecting your cluster and clicking Clients.

  3. Use the configuration in your producer or consumer code:

    const { Kafka } = require('@confluentinc/kafka-javascript');
    
    const kafka = new Kafka({
      kafkaJS: {
        brokers: ['your-bootstrap-servers'],
        ssl: true,
        sasl: {
          mechanism: 'plain',
          username: 'your-api-key',
          password: 'your-api-secret'
        }
      }
    });
    
    // Create producer or consumer
    const producer = kafka.producer();
    
  4. See the JavaScript client examples for complete working examples. A minimal connect-and-send sketch also follows this procedure.

  5. Integrate with your environment.
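
As a rough sketch, the snippet below continues from the producer created above: inside an async function, it connects, sends one record, and disconnects. The topic name my-topic is a placeholder, and the KafkaJS-style connect/send/disconnect calls are assumed from the client's promisified API.

    // Continuing from the producer created above (run inside an async function)
    await producer.connect();
    await producer.send({
      topic: 'my-topic',                          // placeholder topic name
      messages: [{ key: 'key', value: 'hello' }],
    });
    await producer.disconnect();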

Connect a .NET application to Confluent Cloud

To configure the Confluent .NET Client for Kafka to connect to a Kafka cluster in Confluent Cloud:

  1. Install the Confluent .NET Client for Apache Kafka.
  2. Customize the .NET Confluent Cloud example for your Confluent Cloud cluster, specifically bootstrap.servers, sasl.username, and sasl.password. A minimal configuration sketch follows this list.
  3. Integrate with your environment.
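
For orientation, here is a minimal sketch of a Confluent Cloud producer using the Confluent.Kafka package, assuming .NET 6 or later with top-level statements. The placeholder credentials and the topic name my-topic are illustrative.

    using Confluent.Kafka;

    // Placeholder values; copy the real ones from the Confluent Cloud Console "Clients" page
    var config = new ProducerConfig
    {
        BootstrapServers = "<BOOTSTRAP_SERVERS>",
        SecurityProtocol = SecurityProtocol.SaslSsl,
        SaslMechanism = SaslMechanism.Plain,
        SaslUsername = "<API_KEY>",
        SaslPassword = "<API_SECRET>"
    };

    using var producer = new ProducerBuilder<string, string>(config).Build();

    // "my-topic" is a placeholder; the topic must already exist in your cluster
    await producer.ProduceAsync("my-topic",
        new Message<string, string> { Key = "key", Value = "hello" });

    producer.Flush(TimeSpan.FromSeconds(10));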

Connect a C/C++ application to Confluent Cloud

To configure a C/C++ application using the librdkafka client to connect to a Kafka cluster in Confluent Cloud:

  1. Prerequisite: Ensure you have installed the librdkafka library on your system or included it in your project’s build process.

  2. In your application code, create a configuration object and set the properties for connecting to Confluent Cloud.

    #include <librdkafka/rdkafka.h>
    // ...
    
    rd_kafka_conf_t *conf;
    char errstr[512];
    
    conf = rd_kafka_conf_new();
    
    // Confluent Cloud bootstrap servers
    if (rd_kafka_conf_set(conf, "bootstrap.servers", "<BOOTSTRAP_SERVERS>", errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
        fprintf(stderr, "%s\n", errstr);
        // Handle error
    }
    
    // Security configuration
    rd_kafka_conf_set(conf, "security.protocol", "SASL_SSL", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.mechanisms", "PLAIN", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.username", "<API_KEY>", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.password", "<API_SECRET>", errstr, sizeof(errstr));
    
    // See the Client Prerequisites section for details on ssl.ca.location.
    // For librdkafka v2.11 or later, it is typically not required.
    
    // ... create producer or consumer instance with this conf ...
    
  3. For complete, working projects, refer to the official librdkafka examples directory. A minimal produce sketch that continues from the configuration above follows this list.
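
Continuing from the configuration object above, a minimal produce-and-shutdown sketch might look like the following; the topic name my-topic is a placeholder and error handling is abbreviated.

    // rd_kafka_new() takes ownership of conf on success
    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    if (!rk) {
        fprintf(stderr, "Failed to create producer: %s\n", errstr);
        // Handle error
    }

    // "my-topic" is a placeholder; the topic must already exist in your cluster
    rd_kafka_producev(rk,
                      RD_KAFKA_V_TOPIC("my-topic"),
                      RD_KAFKA_V_VALUE("hello", 5),
                      RD_KAFKA_V_END);

    // Wait up to 10 seconds for outstanding deliveries, then release resources
    rd_kafka_flush(rk, 10 * 1000);
    rd_kafka_destroy(rk);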

More examples and demos

  • To view a working example of hybrid Kafka clusters from self-hosted to Confluent Cloud, see cp-demo.
  • For example configurations for all Confluent Platform components and clients connecting to Confluent Cloud, see template examples for components.
  • See the collection of client examples for supported programming languages in Code Examples for Apache Kafka.
    • The “Hello, World!” examples produce to and consume from any Kafka cluster, including Confluent Cloud, using client libraries for the following programming languages: C (librdkafka), Clojure, C#/.NET, Go, Groovy, Java, Java Spring Boot, JavaScript/Node.js, Kotlin, Python, Ruby, Rust, and Scala.
    • For the subset of language client libraries that support it, the examples demonstrate how to use Confluent Cloud Schema Registry and Avro.