.. _azure-sql-dw-configuration-options:

Configuration Properties
------------------------

To use this connector, specify the name of the connector class in the
``connector.class`` configuration property.

.. codewithvars:: properties

   connector.class=io.confluent.connect.azuresqldw.AzureSqlDwSinkConnector

Connector-specific configuration properties are described below.

.. _db-connection-security:

Database Connection Security
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The connector configuration has no dedicated security parameters. This is
because SSL is not part of the JDBC standard and depends on the JDBC driver in
use (this connector uses the `Microsoft JDBC driver`_). In general, you
configure SSL through the ``azure.sql.dw.url`` parameter. For example, a
connection URL with SSL encryption enabled might look like this:

.. sourcecode:: properties

   azure.sql.dw.url="jdbc:sqlserver://sample-server.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;"

See the documentation `here`_ to learn about the different connection
properties that can be used to enable SSL encryption.

Azure SQL Data Warehouse
^^^^^^^^^^^^^^^^^^^^^^^^

``azure.sql.dw.url``
  JDBC connection URL for SQL Data Warehouse.

  * Type: string
  * Importance: high

``azure.sql.dw.user``
  SQL Data Warehouse user.

  * Type: string
  * Default: null
  * Importance: high

``azure.sql.dw.password``
  SQL Data Warehouse password.

  * Type: password
  * Default: null
  * Importance: high

``azure.sql.dw.database.name``
  Name of the database to connect to.

  * Type: string
  * Default: null
  * Importance: high

Data Mapping
^^^^^^^^^^^^

``table.name.format``
  A format string for the destination table name, which may contain
  ``${topic}`` as a placeholder for the originating topic name. For example,
  ``kafka_${topic}`` for the topic 'orders' maps to the table name
  'kafka_orders'.
  * Type: string
  * Default: ``${topic}``
  * Importance: medium

``fields.whitelist``
  List of comma-separated record value field names. If empty, all fields from
  the record value are used; otherwise, this list is used to filter to the
  desired fields.

  * Type: list
  * Default: ""
  * Importance: medium

``db.timezone``
  Name of the JDBC timezone that should be used in the connector when
  inserting time-based values. Defaults to UTC.

  * Type: string
  * Default: UTC
  * Importance: medium

Writes
^^^^^^

``batch.size``
  Specifies how many records to attempt to batch together for insertion into
  the destination table, when possible.

  * Type: int
  * Default: 3000
  * Valid Values: [0,...]
  * Importance: medium

SQL/DDL Support
^^^^^^^^^^^^^^^

``auto.create``
  Whether to automatically create the destination table, by issuing
  ``CREATE``, based on the record schema if the table is found to be missing.

  * Type: boolean
  * Default: false
  * Importance: medium

``auto.evolve``
  Whether to automatically add columns to the table schema, by issuing
  ``ALTER``, when they are found to be missing relative to the record schema.

  * Type: boolean
  * Default: false
  * Importance: medium

``quote.sql.identifiers``
  When to quote table names, column names, and other identifiers in SQL
  statements. For backward compatibility, the default is 'always'.

  * Type: string
  * Default: ALWAYS
  * Importance: medium

Retries
^^^^^^^

``max.retries``
  The maximum number of times to retry on errors before failing the task.

  * Type: int
  * Default: 10
  * Valid Values: [0,...]
  * Importance: medium

``retry.backoff.ms``
  The time in milliseconds to wait following an error before a retry attempt
  is made.

  * Type: int
  * Default: 3000
  * Valid Values: [0,...]
  * Importance: medium

.. _az-sql-dw-sink-connector-license-config:

|cp| license
^^^^^^^^^^^^

``confluent.topic.bootstrap.servers``
  A list of host/port pairs to use for establishing the initial connection to
  the Kafka cluster used for licensing. All servers in the cluster will be
  discovered from the initial connection.
  This list should be in the form ``host1:port1,host2:port2,...``. Since
  these servers are just used for the initial connection to discover the full
  cluster membership (which may change dynamically), this list need not
  contain the full set of servers (you may want more than one, though, in
  case a server is down).

  * Type: list
  * Importance: high

``confluent.topic``
  Name of the Kafka topic used for Confluent Platform configuration,
  including licensing information.

  * Type: string
  * Default: ``_confluent-command``
  * Importance: low

``confluent.topic.replication.factor``
  The replication factor for the Kafka topic used for Confluent Platform
  configuration, including licensing information. This is used only if the
  topic does not already exist, and the default of 3 is appropriate for
  production use. If you are using a development environment with fewer than
  3 brokers, you must set this to the number of brokers (often 1).

  * Type: int
  * Default: 3
  * Importance: low

----------------------------
Confluent license properties
----------------------------

.. include:: ../includes/security-info.rst

.. include:: ../includes/platform-license.rst

.. include:: ../includes/security-configs.rst

.. _azure-sql-dw-license-topic-configuration:

.. include:: ../includes/platform-license-detail.rst

.. include:: ../includes/overriding-default-config-properties.rst
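As an illustrative sketch, the properties described above can be combined into
a minimal connector configuration. The server, database, credentials, topic,
and bootstrap servers below are placeholder values, not defaults, and the
``name``, ``tasks.max``, and ``topics`` entries are standard Kafka Connect
sink properties rather than connector-specific ones:

.. sourcecode:: properties

   # Connector class and parallelism
   name=azure-sql-dw-sink
   connector.class=io.confluent.connect.azuresqldw.AzureSqlDwSinkConnector
   tasks.max=1

   # Topic to read from (placeholder)
   topics=orders

   # Connection settings (placeholders; SSL enabled through the JDBC URL)
   azure.sql.dw.url=jdbc:sqlserver://sample-server.database.windows.net:1433;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;
   azure.sql.dw.user=sample-user
   azure.sql.dw.password=sample-password
   azure.sql.dw.database.name=sample-db

   # Data mapping and DDL: 'orders' records land in the table 'kafka_orders'
   table.name.format=kafka_${topic}
   auto.create=true
   auto.evolve=true

   # Licensing (placeholder bootstrap servers; replication factor 1 is only
   # suitable for a single-broker development environment)
   confluent.topic.bootstrap.servers=localhost:9092
   confluent.topic.replication.factor=1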