Azure Service Bus Source Connector for Confluent Cloud

Note

If you are installing the connector locally for Confluent Platform, see Azure Service Bus Source Connector for Confluent Platform.

Azure Service Bus is a multitenant cloud messaging service that you can use to send information between applications and services. The Kafka Connect Azure Service Bus Source connector for Confluent Cloud reads data from an Azure Service Bus queue or topic and persists it in a Kafka topic.

Important

If you are still on Confluent Cloud Enterprise, please contact your Confluent Account Executive for more information about using this connector.

Features

The Azure Service Bus Source connector supports the following features:

  • At least once delivery: The connector guarantees that records are delivered at least once to the Kafka topic.
  • Supports multiple tasks: The connector supports running one or more tasks.
  • Supported data formats: The connector supports Avro, JSON Schema (JSON-SR), Protobuf, and JSON (schemaless) output formats. Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON Schema, or Protobuf).

Note

See Configuration Properties for configuration property descriptions.

Refer to Cloud connector limitations for additional information.

Quick Start

Use this quick start to get up and running with the Confluent Cloud Azure Service Bus Source connector. The quick start provides the basics of selecting the connector and configuring it to stream events.

Prerequisites
  • Authorized access to a Confluent Cloud cluster on Amazon Web Services (AWS), Microsoft Azure (Azure), or Google Cloud Platform (GCP).
  • The Confluent Cloud CLI installed and configured for the cluster. See Install and Configure the Confluent Cloud CLI.
  • Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  • You must have the Azure Service Bus connection details. For additional information, see the Azure Service Bus docs.
  • At least one Kafka topic must exist in your Confluent Cloud cluster before creating the source connector. (Example commands for these prerequisites follow this list.)
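
If Schema Registry is not yet enabled, or the Kafka topic does not yet exist, both can be set up with the Confluent Cloud CLI. The commands below are a minimal sketch; the cloud provider, geo, and topic name are placeholders you should replace with your own values.

ccloud schema-registry cluster enable --cloud azure --geo us
ccloud kafka topic create <topic-name>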

Using the Confluent Cloud Console

Step 1: Launch your Confluent Cloud cluster.

See the Quick Start for Apache Kafka using Confluent Cloud for installation instructions.

Step 2: Add a connector.

Click Connectors. If you already have connectors in your cluster, click Add connector.

Step 3: Select your connector.

Click the Azure Service Bus Source connector icon.

Azure Service Bus Source Connector Icon

Step 4: Set up the connection.

Note

  • Make sure you have all your prerequisites completed.
  • An asterisk ( * ) designates a required entry.
  1. Enter a connector Name.
  2. Enter your Kafka Cluster credentials. The credentials are either the cluster API key and secret or the service account API key and secret.
  3. Select a topic where you want to send data.
  4. Enter the required Azure Service Bus connection details. For details about shared access authorization, see Shared Access Authorization Policies. (An example showing one way to retrieve these values follows this list.)
  5. Select an Output message format (data going into the Kafka topic): AVRO, JSON_SR (JSON Schema), PROTOBUF, or JSON (schemaless). Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  6. Enter the number of tasks to use with the connector. More tasks may improve performance.
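
If you need to look up the Service Bus connection values before filling in this form, one option is the Azure CLI. The commands below are a sketch; the resource group, namespace, and shared access policy names are placeholders. The first command lists the queues in a namespace, and the second prints the SAS keys for a shared access policy.

az servicebus queue list --resource-group <resource-group> --namespace-name <namespace>
az servicebus namespace authorization-rule keys list --resource-group <resource-group> --namespace-name <namespace> --name <sas-policy-name>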

Note

See Configuration Properties for configuration property descriptions.

Step 5: Launch the connector.

Verify the connection details by previewing the running configuration. Once you’ve validated that the properties are configured to your satisfaction, click Launch.

Tip

For information about previewing your connector output, see Preview Connector Output.

Step 6: Check the connector status.

The status for the connector should go from Provisioning to Running.

Step 7: Check for records.

Verify that records are being produced to the Kafka topic.

For more information and examples to use with the Confluent Cloud API for Connect, see the Confluent Cloud API for Connect section.

Using the Confluent Cloud CLI

Complete the following steps to set up and run the connector using the Confluent Cloud CLI.

Note

Make sure you have all your prerequisites completed.

Step 1: List the available connectors.

Enter the following command to list available connectors:

ccloud connector-catalog list
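
Because the catalog can be long, you can narrow the output to this connector with standard shell filtering (assuming a Unix-like shell):

ccloud connector-catalog list | grep -i servicebus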

Step 2: Show the required connector configuration properties.

Enter the following command to show the required connector properties:

ccloud connector-catalog describe <connector-catalog-name>

For example:

ccloud connector-catalog describe AzureServiceBusSource

Example output:

Following are the required configs:
connector.class: AzureServiceBusSource
name
kafka.api.key
kafka.api.secret
kafka.topic
azure.servicebus.namespace
azure.servicebus.sas.keyname
azure.servicebus.sas.key
azure.servicebus.entity.name
output.data.format
tasks.max

Step 3: Create the connector configuration file.

Create a JSON file that contains the connector configuration properties. The following example shows the required connector properties.

{
  "connector.class": "AzureServiceBusSource",
  "name": "AzureServiceBusSource_0",
  "kafka.api.key": "****************",
  "kafka.api.secret": "************************************************",
  "kafka.topic": "<topic-name>",
  "azure.servicebus.namespace": "<namespace>",
  "azure.servicebus.sas.keyname": "<keyname>",
  "azure.servicebus.sas.key": "****************************************",
  "azure.servicebus.entity.name": "<entity>",
  "output.data.format": "AVRO",
  "tasks.max": "1",
}

Note the following property definitions:

  • "connector.class": Identifies the connector plugin name.
  • "name": Sets a name for your new connector.
  • "kafka.api.key" and ""kafka.api.secret": These credentials are either the cluster API key and secret or the service account API key and secret.
  • "kafka.topic": Enter the topic name where data is sent.
  • "azure.servicebus.<>": Enter the Azure Service Bus details. For additional information, see the Azure Service Bus docs. For details about shared access signature (SAS) authorization, see Shared Access Authorization Policies.
  • output.data.format": Enter an output message format (data going into the Kafka topic): AVRO, JSON_SR (JSON Schema), PROTOBUF, or JSON (schemaless). Schema Registry must be enabled to use a Schema Registry-based format (for example, Avro, JSON_SR (JSON Schema), or Protobuf).
  • "tasks.max": Enter the maximum number of tasks for the connector to use.

Note

See Configuration Properties for configuration property descriptions.

Step 4: Load the properties file and create the connector.

Enter the following command to load the configuration and start the connector:

ccloud connector create --config <file-name>.json

For example:

ccloud connector create --config azure-service-bus-source-config.json

Example output:

Created connector AzureServiceBusSource_0 lcc-do6vzd

Step 5: Check the connector status.

Enter the following command to check the connector status:

ccloud connector list

Example output:

     ID      |             Name              | Status  |  Type  | Trace
+------------+-------------------------------+---------+--------+-------+
  lcc-do6vzd | AzureServiceBusSource_0       | RUNNING | source |       |
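
To inspect a single connector in more detail, you can also describe it by ID, using the ID shown in the list output. For example:

ccloud connector describe lcc-do6vzd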

Step 6: Check for records.

Verify that records are being produced to the Kafka topic.
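
One way to spot-check the topic from the CLI is to consume a few records. The command below is a sketch that assumes the topic name from your configuration file; the -b flag consumes from the beginning of the topic. For a Schema Registry-based format such as Avro, you can add --value-format avro, and the CLI may prompt for Schema Registry credentials.

ccloud kafka topic consume -b <topic-name>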

For more information and examples to use with the Confluent Cloud API for Connect, see the Confluent Cloud API for Connect section.

Configuration Properties

The following connector configuration properties are used with the Azure Service Bus Source connector for Confluent Cloud.

kafka.topic

The name of the Kafka topic to publish data to. You can specify only one topic.

  • Type: string
  • Importance: high
azure.servicebus.namespace

The Azure Service Bus namespace that the messaging entity belongs to.

  • Type: string
  • Default: null
  • Importance: high
azure.servicebus.sas.keyname

Azure Service Bus Shared Access Signature (SAS) key name for the Service Bus queue/topic.

  • Type: string
  • Default: null
  • Importance: high
azure.servicebus.sas.key

Azure Service Bus SAS access key for the Service Bus queue/topic.

  • Type: password
  • Default: null
  • Importance: high
azure.servicebus.entity.name

The Azure Service Bus messaging entity name. This is the name of the queue or topic from which messages are polled.

  • Type: string
  • Default: null
  • Importance: high
azure.servicebus.subscription

The Azure Service Bus subscription name for the Service Bus topic. Configure this only when the messaging entity is a topic. (An example fragment follows below.)

  • Type: string
  • Default: ""
  • Importance: high
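
For example, when the messaging entity is a Service Bus topic rather than a queue, the entity name and subscription are configured together. The fragment below is illustrative only; both values are placeholders.

"azure.servicebus.entity.name": "<service-bus-topic>",
"azure.servicebus.subscription": "<subscription-name>"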

Note

For details about Azure Service Bus SAS authorization, see Shared Access Authorization Policies.

Next Steps

See also

For an example that shows fully-managed Confluent Cloud connectors in action with Confluent Cloud ksqlDB, see the Cloud ETL Demo. This example also shows how to use Confluent Cloud CLI to manage your resources in Confluent Cloud.
