.. _connect-rbac-connectors:

|rbac| for self-managed connectors
----------------------------------

.. note:: For connectors running in |ccloud|, see `RBAC for managed connectors `__.

In an |rbac|-enabled |cp| environment, individual connectors can override the |kconnect| worker principal configuration. This allows each connector to use a separate principal with access privileges scoped to specific topics, increasing security for your |ak| environment. This is recommended for |ak| production environments using RBAC.

.. note:: Note the following considerations:

   * The configuration steps in the following sections assume you have included :ref:`worker-wide default properties`.
   * See :ref:`Secret Registry` if you are using a Secret Registry for connector credentials.
   * Before configuring RBAC for |kconnect-long|, read the white paper `Role-Based Access Control (RBAC) for Kafka Connect `__. This white paper covers basic RBAC concepts and provides a deep dive into using RBAC with |kconnect-long| and connectors. It also contains a link to a GitHub demo so you can see how it all works on a local |cp| installation.

-----------------
Source connectors
-----------------

Add the following lines, with a valid service principal, to every source connector created in the |kconnect| cluster:

.. sourcecode:: json

   "producer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
   username=\"$USERNAME\" \
   password=\"$PASSWORD\" \
   metadataServerUrls=\"http://localhost:8090\";",

---------------
Sink connectors
---------------

Add the following lines, with a valid service principal, to every sink connector created in the |kconnect| cluster:

.. sourcecode:: json

   "consumer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
   username=\"$USERNAME\" \
   password=\"$PASSWORD\" \
   metadataServerUrls=\"http://localhost:8090\";",

-----------------
Dead letter queue
-----------------

If a sink connector uses the dead letter queue feature, you must add configuration blocks for both a producer and an admin client in the connector. Invalid (dropped) sink messages are passed to a producer constructed to send records to the dead letter queue, and an admin client then creates the dead letter queue topic. Both need service principals to function. To use the dead letter queue feature, add the two additional configuration sections shown below.

.. sourcecode:: json

   "producer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
   username=\"$USERNAME\" \
   password=\"$PASSWORD\" \
   metadataServerUrls=\"http://localhost:8090\";",

.. sourcecode:: json

   "admin.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
   username=\"$USERNAME\" \
   password=\"$PASSWORD\" \
   metadataServerUrls=\"http://localhost:8090\";",
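With |rbac| and the dead letter queue enabled together, a sink connector configuration carries all three overrides plus the dead letter queue properties. The following is a minimal sketch of such a configuration; the connector class, topic names, and replication factor are illustrative placeholders, and ``$USERNAME``/``$PASSWORD`` stand in for the connector's service principal credentials:

.. sourcecode:: json

   {
     "name": "example-sink",
     "config": {
       "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
       "tasks.max": "1",
       "topics": "orders",
       "errors.tolerance": "all",
       "errors.deadletterqueue.topic.name": "dlq-orders",
       "errors.deadletterqueue.topic.replication.factor": "3",
       "consumer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required username=\"$USERNAME\" password=\"$PASSWORD\" metadataServerUrls=\"http://localhost:8090\";",
       "producer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required username=\"$USERNAME\" password=\"$PASSWORD\" metadataServerUrls=\"http://localhost:8090\";",
       "admin.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required username=\"$USERNAME\" password=\"$PASSWORD\" metadataServerUrls=\"http://localhost:8090\";"
     }
   }

Note that ``errors.tolerance`` must be set to ``all`` for failed records to be routed to the dead letter queue rather than failing the task.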
.. _connect-rbac-key-value-converters:

---------------------
|sr|-based converters
---------------------

The following |sr|-based converters are available for |kconnect|:

* Avro converter: ``io.confluent.connect.avro.AvroConverter``
* Protobuf converter: ``io.confluent.connect.protobuf.ProtobufConverter``
* JSON Schema converter: ``io.confluent.connect.json.JsonSchemaConverter``

To use an |sr|-based converter with an RBAC-enabled Schema Registry, first add the ``key.converter`` or ``value.converter`` property to the connector configuration. The following example shows the ``value.converter`` property for Avro, Protobuf, and JSON Schema:

.. sourcecode:: properties

   "value.converter": "io.confluent.connect.avro.AvroConverter"

.. sourcecode:: properties

   "value.converter": "io.confluent.connect.protobuf.ProtobufConverter"

.. sourcecode:: properties

   "value.converter": "io.confluent.connect.json.JsonSchemaConverter"

Then, to properly authenticate with |sr|, add the following properties. The ``<username>`` and ``<password>`` entered are the connector's |rbac-sa| username and password. These properties are the same for all three converters:

.. sourcecode:: properties

   "value.converter.schema.registry.url": "<schema-registry-url>",
   "value.converter.basic.auth.credentials.source": "USER_INFO",
   "value.converter.basic.auth.user.info": "<username>:<password>"

Using Avro as an example, a complete converter configuration snippet is shown below:

.. sourcecode:: properties

   "value.converter": "io.confluent.connect.avro.AvroConverter",
   "value.converter.schema.registry.url": "<schema-registry-url>",
   "value.converter.basic.auth.credentials.source": "USER_INFO",
   "value.converter.basic.auth.user.info": "<username>:<password>"

For additional information about |kconnect| converters, see :connect-common:`Configuring Key and Value Converters|userguide.html#configuring-key-and-value-converters`.
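To tie the section together, the following is a minimal sketch of a complete source connector configuration combining the producer principal override with the Avro converter and |sr| credentials described above. The Datagen connector class and topic are hypothetical examples used only for illustration; replace the placeholders with values for your environment:

.. sourcecode:: json

   {
     "name": "example-source",
     "config": {
       "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
       "tasks.max": "1",
       "kafka.topic": "pageviews",
       "quickstart": "pageviews",
       "producer.override.sasl.jaas.config": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required username=\"$USERNAME\" password=\"$PASSWORD\" metadataServerUrls=\"http://localhost:8090\";",
       "value.converter": "io.confluent.connect.avro.AvroConverter",
       "value.converter.schema.registry.url": "<schema-registry-url>",
       "value.converter.basic.auth.credentials.source": "USER_INFO",
       "value.converter.basic.auth.user.info": "<username>:<password>"
     }
   }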