Configure an Azure Private Link connection to Confluent Cloud (Preview)

Preview Note

  1. Azure Private Link Service is currently in preview. Please contact your Confluent representative to get access to the preview.
  2. Azure Private Link preview is not covered by SLA and is not recommended for production workloads.
  3. Azure Private Link preview is supported only with multi-zone dedicated clusters. Support for single zone dedicated clusters is on the roadmap.

Overview

Prerequisite
A Dedicated Kafka cluster in Azure with Azure Private Link enabled. For more information about how to create a Dedicated cluster, see Create a Cluster in Confluent Cloud.

Follow this procedure to configure Azure Private Link for a Dedicated cluster in Azure.

  1. Register your Azure subscription with Confluent Cloud using the Confluent Cloud UI.
  2. Set up the Private Endpoint(s) to Confluent Cloud Private Link Service Alias(es) in your Azure subscription using the Azure portal.
  3. Set up Availability Zone mapped DNS records to use Azure Private Endpoints using the Azure portal.
  4. Validate connectivity to Confluent Cloud.

Requirements

  1. To use Azure Private Link with Confluent Cloud, your VNET must allow outbound internet connections so that DNS resolution, Schema Registry, and the Confluent Cloud CLI work:
    1. DNS requests must be able to traverse public DNS authorities before resolving against the private DNS zone.
    2. Confluent Cloud Schema Registry is accessible only over the internet.
    3. The Confluent Cloud CLI requires internet access to authenticate with the Confluent Cloud control plane.
  2. Confluent Cloud web UI components such as topic management and ksqlDB use cluster endpoints, so they need additional configuration to function. To use all features of the Confluent Cloud web UI with Azure Private Link, follow this procedure.

Warning

For limitations of the Azure Private Link feature, see Limitations.

Register your Azure subscription with Confluent Cloud

To make an Azure Private Link connection to a cluster in Confluent Cloud you must register the Azure subscription ID you wish to use. This is a security measure that enables Confluent to ensure that only your organization can initiate Azure Private Link connections to the cluster. Azure Private Link connections from a VNET not contained in a registered Azure subscription will not be accepted by Confluent Cloud.

  1. Navigate to the Cluster Settings page, click the Networking tab, and click Add Connection.
  2. Provide the Azure subscription ID for the subscription containing the VNETs you want to make the Private Link connection from, and click Save. You can find the subscription ID on your Azure subscription page in the Azure portal. Your Azure Private Link connection status will transition from “Pending” to “Active” in the Confluent Cloud web UI. You still need to configure the Private Endpoints in your VNET before you can connect to the cluster.
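
If you prefer the command line, you can also look up the subscription ID with the Azure CLI. This is a minimal sketch and not part of the registration procedure itself; it assumes the Azure CLI is installed and you are logged in.

  # Show the ID of the currently selected Azure subscription
  az account show --query id --output tsv

  # Or list the name and ID of every subscription you can access
  az account list --query "[].{name:name, id:id}" --output table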

Set up the Private Endpoint(s) for Azure Private Link in your Azure subscription

After the connection status is “Active” in the Confluent Cloud UI, you must configure Private Endpoint(s) in your VNET from the Azure portal to make the Private Link connection to your Confluent Cloud cluster.

Prerequisite

In the Confluent Cloud UI you will find the following information for your Confluent Cloud cluster under the Cluster Settings section. This information is needed to configure Azure Private Link for a Dedicated cluster in Azure.

  • Kafka Bootstrap (in the General tab)
  • DNS domain for cluster (in the Networking tab)
  • DNS domain per zone (in the Networking tab)
  • Service Alias (in the Networking tab)
  1. Create the Private Endpoint(s)

    In the Azure Private Link Center:

    1. Create a Private Endpoint for Confluent Cloud Availability Zone 1 by clicking Create Private Endpoint.
    2. Fill in the subscription, resource group, name, and region for the Private Endpoint and click Next. The selected subscription must be the same as the one registered with Confluent Cloud.
    3. Select the Connect to an Azure resource by resource ID or alias option, paste in the Confluent Cloud Service Alias for Availability Zone 1, and click Next. You can find the Service Alias in the Networking tab under Cluster settings in the Confluent Cloud UI.
    4. Fill in the virtual network and subnet where the Private Endpoint is to be created.
    5. Click Review + create. Review the details and click Create to create the Private Endpoint.
    6. Wait for the Azure deployment to complete, go to the Private Endpoint resource, and verify that the connection status is Approved.
    7. Repeat the above steps to create two more Private Endpoints for the remaining two Confluent Cloud Availability Zones. An equivalent Azure CLI sketch follows these steps.
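
    If you script your infrastructure, an equivalent Private Endpoint can be created with the Azure CLI. The following is a minimal, hedged sketch rather than part of the official procedure: the resource group, VNET, subnet, and endpoint names are placeholders, and it assumes your Azure CLI version accepts a Private Link Service alias in --private-connection-resource-id.

      # Placeholders: my-rg, my-vnet, my-subnet. SERVICE_ALIAS holds the Confluent
      # Cloud Service Alias for Availability Zone 1 from the Networking tab.
      az network private-endpoint create \
        --resource-group my-rg \
        --name confluent-endpoint-az1 \
        --vnet-name my-vnet \
        --subnet my-subnet \
        --connection-name confluent-az1 \
        --private-connection-resource-id "$SERVICE_ALIAS" \
        --manual-request true

      # Repeat with the Service Aliases for Availability Zones 2 and 3. Connection
      # requests from a registered subscription should show as Approved, as in step 6.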

Set up DNS records to use Azure Private Endpoints

DNS changes must be made to ensure connectivity passes through Azure Private Link in the supported pattern. Any DNS provider that can ensure DNS is routed as follows is acceptable. Azure Private DNS Zone (used in this example) is one option.

Update DNS using Azure Private DNS Zone in the Azure console:

  1. Create the Private DNS Zone.

    1. Search for the Private DNS Zone resource in Azure portal.

    2. Click Add.

    3. Copy the DNS Domain name from the Networking tab under Cluster Settings in the Confluent Cloud UI and use it as the name for the Private DNS Zone.

      For example:

      4kgzg.centralus.azure.confluent.cloud
      

      Note

      Note that there is no glb in the DNS domain name, unlike the bootstrap server hostname, which does include glb.

    4. Fill in the subscription, resource group, and name, and click Review + create.

    5. Wait for the Azure deployment to complete.

  2. Create DNS records.

    1. Go to the Private DNS Zone resource as created above.
    2. Click + Record Set.
    3. Create the following record sets. The IP address of each Private Endpoint can be found under its associated network interface. (An Azure CLI sketch for these DNS changes follows this list.)
      1. Select name as “*”, type as “A”, TTL as “1 Minute”, and add the IP addresses of all three Private Endpoints created above.
      2. Select name as “*.az1”, type as “A”, TTL as “1 Minute”, and add the IP address of the az1 Private Endpoint created above.
      3. Select name as “*.az2”, type as “A”, TTL as “1 Minute”, and add the IP address of the az2 Private Endpoint created above.
      4. Select name as “*.az3”, type as “A”, TTL as “1 Minute”, and add the IP address of the az3 Private Endpoint created above.
  3. Attach the Private DNS Zone to the VNET(s) where clients/applications are present.

    1. Go to the Private DNS Zone resource and click Virtual network links under settings.
    2. Click Add.
    3. Fill in link name, subscription and virtual network.
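
If you manage DNS with scripts instead of the portal, the same Private DNS Zone, record sets, and virtual network link can be created with the Azure CLI. This is a minimal sketch under the same Azure Private DNS Zone assumption: the resource group, VNET name, and IP addresses below are placeholders, and the zone name reuses the example DNS domain shown above.

  # Placeholders: my-rg, my-vnet, and the 10.0.1.x addresses of your three
  # Private Endpoints. The zone name is your cluster's DNS domain (no glb).
  RG=my-rg
  ZONE=4kgzg.centralus.azure.confluent.cloud

  az network private-dns zone create --resource-group $RG --name $ZONE

  # Wildcard record pointing at all three Private Endpoint IPs, TTL 60 seconds
  az network private-dns record-set a create -g $RG -z $ZONE -n '*' --ttl 60
  for IP in 10.0.1.4 10.0.1.5 10.0.1.6; do
    az network private-dns record-set a add-record -g $RG -z $ZONE -n '*' -a $IP
  done

  # Per-zone wildcard records, each pointing at the matching Private Endpoint
  az network private-dns record-set a create -g $RG -z $ZONE -n '*.az1' --ttl 60
  az network private-dns record-set a add-record -g $RG -z $ZONE -n '*.az1' -a 10.0.1.4
  az network private-dns record-set a create -g $RG -z $ZONE -n '*.az2' --ttl 60
  az network private-dns record-set a add-record -g $RG -z $ZONE -n '*.az2' -a 10.0.1.5
  az network private-dns record-set a create -g $RG -z $ZONE -n '*.az3' --ttl 60
  az network private-dns record-set a add-record -g $RG -z $ZONE -n '*.az3' -a 10.0.1.6

  # Link the zone to the VNET where your client applications run
  az network private-dns link vnet create -g $RG -z $ZONE -n confluent-dns-link \
    --virtual-network my-vnet --registration-enabled false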

Validate Connectivity to Confluent Cloud

  1. From an instance within the VNET (or anywhere the DNS setup from the previous step applies), run the following to validate that Kafka connectivity through Azure Private Link works correctly.

    1. Set a variable with the cluster bootstrap hostname, without the :9092 port (the openssl command in the next step appends it).

      export BOOTSTRAP=$ConfluentCloudBootstrap
      

      For example:

      export BOOTSTRAP=lkc-222v1o-4kgzg.centralus.azure.glb.confluent.cloud
      
    2. Test connectivity to the cluster.

      openssl s_client -connect $BOOTSTRAP:9092 -servername $BOOTSTRAP -verify_hostname $BOOTSTRAP </dev/null 2>/dev/null | grep -E 'Verify return code|BEGIN CERTIFICATE' | xargs
      
    3. If the return output is -----BEGIN CERTIFICATE----- Verify return code: 0 (ok), connectivity to the bootstrap is confirmed.

    Note

    You might need to update your network security tools and firewalls to allow connectivity. If you have issues connecting after following these steps, confirm which network security systems your organization uses and whether their configurations need to be changed.
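
    It can also help to confirm that the bootstrap hostname resolves to the private IP addresses of your Private Endpoints rather than to public addresses. A minimal sketch, assuming the BOOTSTRAP variable set above and that dig is installed:

      # The answers should be the Private Endpoint IPs in your VNET, not public IPs
      dig +short "$BOOTSTRAP"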

  2. Next, verify connectivity with the Confluent Cloud CLI.

    1. Log in to the Confluent Cloud CLI with your Confluent Cloud credentials.

      ccloud login
      
    2. List the clusters in your organization.

      ccloud kafka cluster list
      
    3. Select the cluster with Azure Private Link you wish to test.

      ccloud kafka cluster use ...
      

      For example:

      ccloud kafka cluster use lkc-222v1o
      
    4. Create a cluster API key to authenticate with the cluster.

      ccloud api-key create --resource ... --description ...
      

      For example:

      ccloud api-key create --resource lkc-222v1o --description "connectivity test"
      
    5. Select the API key you just created.

      ccloud api-key use ... --resource ...
      

      For example:

      ccloud api-key use R4XPKKUPLYZSHOAT --resource lkc-222v1o
      
    6. Create a test topic.

      ccloud kafka topic create test
      
    7. Start consuming events from the test topic.

      ccloud kafka topic consume test
      
    8. Open another terminal tab or window.

    9. Start a producer.

      ccloud kafka topic produce test
      
    10. Type anything into the produce tab and hit Enter; press Ctrl+D or Ctrl+C to stop the producer.

    11. The tab running consume will print what was typed in the tab running produce.

  3. You’re done! The cluster is ready for use.
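
Optionally, if you use non-Java clients, kafkacat can serve as one more connectivity check over the Private Link endpoints. A hedged sketch, assuming kafkacat is installed, the BOOTSTRAP variable set above, and the API key and secret created in the CLI steps (shown here as placeholders):

  # Lists brokers and topics over the Private Link path using SASL_SSL
  kafkacat -b "$BOOTSTRAP:9092" \
    -X security.protocol=SASL_SSL \
    -X sasl.mechanisms=PLAIN \
    -X sasl.username="<API_KEY>" \
    -X sasl.password="<API_SECRET>" \
    -L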

Limitations

Warning

  1. For limitations of the Azure Private Link preview, see the preview note above.
  2. Cross-region Azure Private Link connections are not supported.
  3. Azure Private Link is only available for use with Dedicated clusters.
  4. Existing Confluent Cloud clusters cannot be converted to use Azure Private Link.
  5. Fully-managed ksqlDB is not available for use with Azure Private Link clusters.
  6. Each Confluent Cloud cluster using Azure Private Link will be provisioned with Private Link Service in three availability zones. Three private endpoint connections are required for both single zone and multi zone clusters.
  7. Fully-managed Confluent Cloud connectors can connect to sources or sinks using a public IP address. Sources or sinks in the customer network that use private IP addresses are not supported.
  8. Azure Private Link connections cannot be shared across multiple Confluent Cloud clusters. Separate Azure Private Link connections must be made to each Confluent Cloud cluster.
  9. Availability zone selection for placement of Confluent Cloud cluster and Azure Private Link service is not supported.
  10. For requirements of the Azure Private Link feature, see Requirements.
