Custom Connectors Limitations and Support

Make sure to review all the information here before starting.


The following are limitations for custom connectors:

  • Custom connector clusters must be in a supported region.
  • The cluster must use public internet endpoints.
  • You can only run Java-based Apache Kafka® custom connectors in Confluent Cloud.
  • Organizations are limited to 30 custom connectors and 100 plugins.
  • Each custom connector supports up to 5 tasks. Note that a custom connector is allocated 2 GB of memory shared across all of its tasks.
  • Custom connectors cannot write data to a local file system in Confluent Cloud. If you configure your custom connector to write to the local file system, it will fail.
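
The task limit above corresponds to the standard Kafka Connect tasks.max property. The following is a minimal sketch of a custom connector configuration; the connector name, class, and topic are illustrative placeholders, not values from this document:

  {
    "name": "my-custom-source",
    "connector.class": "com.example.MyCustomSourceConnector",
    "tasks.max": "5",
    "kafka.topic": "my-topic"
  }

A tasks.max value greater than 5 is not supported, and all tasks share the connector's 2 GB memory allocation.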

Shared responsibility

Confluent supports only the Kafka Connect infrastructure in Confluent Cloud. It is your responsibility to troubleshoot issues with custom connectors you build or that are provided to you by others. The following provides additional details about shared support responsibilities.

Shared responsibilities matrix

  • Customer Managed: The customer is responsible for self-managing these services. Confluent does not provide any support for customer-managed services and features within custom connector clusters.
  • Confluent Managed: Confluent is responsible for managing these services and providing support.

Confluent and Partner support

You are responsible for the connectors you upload through the Custom Connector feature.

  • Customers that upload connectors to Confluent Cloud through the Custom Connector feature are responsible for management and support of the connector. Confluent does not provide support for custom connectors.
  • Partner-built connectors that are published on Confluent Hub are not tested or certified for Custom Connector functionality on Confluent Cloud. Confluent does not provide Enterprise support for customers who choose to provision a custom connector using a Partner-built connector.
  • Confluent-built connectors are not tested or certified for Custom Connector functionality in Confluent Cloud. Confluent does not provide Enterprise support for customers who choose to provision a custom connector using a Confluent-built plugin.
Confluent regularly scans connectors uploaded through the Custom Connector feature to detect whether they interact with malicious endpoints or otherwise present a security risk to Confluent. If malicious activity is detected, Confluent may immediately delete the connector.

Supported AWS Regions

Custom connectors are currently available in the following AWS regions:

  AWS region     Region name
  ------------   ---------------------
  us-east-1      US East (N. Virginia)
  us-east-2      US East (Ohio)
  us-west-2      US West (Oregon)
  eu-west-1      Europe (Ireland)
  eu-central-1   Europe (Frankfurt)

Schema Registry integration

Schema Registry must be enabled for the cluster for a custom connector to use a Schema Registry-based format (for example, Avro, JSON Schema (JSON_SR), or Protobuf). Schema Registry can be enabled as a Confluent Cloud managed service or run as a self-managed service.

If you plan to use a self-managed Schema Registry configuration, the self-managed Schema Registry instance must be accessible over the public internet. You also must add all Schema Registry configuration entries, including credentials, in the connector configuration. For example:

  "key.converter": "io.confluent.connect.protobuf.ProtobufConverter",
  "key.converter.basic.auth.credentials.source": "USER_INFO",
  "": "abc:xyz",
  "key.converter.schema.registry.url": "",
  "value.converter": "io.confluent.connect.protobuf.ProtobufConverter",
  "value.converter.basic.auth.credentials.source": "USER_INFO",
  "": "abc:xyz",
  "value.converter.schema.registry.url": ""

Note that the additional configuration entries are not needed for clusters that have Schema Registry enabled in Confluent Cloud as a managed service.

App log topic

Note the following usage details for the app log topic:

  • Customers are responsible for all charges related to using the app log topic with a custom connector. For all other billing details, see Manage Billing in Confluent Cloud.
  • Kafka topics have a default retention period of seven days. You must consume the logs from the app log topic within seven days. If you need logs for a longer period, change the log retention period configuration property. In the UI, open the appropriate log topic and change the Retention time.
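
The retention period can also be changed with the Confluent CLI. The following is a sketch, assuming a hypothetical app log topic name; retention.ms is specified in milliseconds (for example, 14 days is 14 × 24 × 60 × 60 × 1000 = 1209600000 ms):

  # Raise retention on the app log topic to 14 days (topic name is illustrative)
  confluent kafka topic update my-connector-app-logs --config "retention.ms=1209600000"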