.. _salesforce-sobject-sink-connector-config:

Configuration Properties
------------------------

The Salesforce SObject Sink Connector can be configured using a variety of configuration properties.

.. _salesforce-sobject-sink-connector-connection-config:

Connection
^^^^^^^^^^

``salesforce.consumer.key``
  The consumer key for the OAuth application.

  * Type: string
  * Importance: high

``salesforce.consumer.secret``
  The consumer secret for the OAuth application.

  * Type: password
  * Importance: high

``salesforce.password``
  The Salesforce password the connector should use.

  * Type: password
  * Importance: high

``salesforce.username``
  The Salesforce username the connector should use.

  * Type: string
  * Importance: high

``salesforce.instance``
  The URL of the Salesforce endpoint to use. If this is left blank, the connector uses the endpoint specified in the authentication response.

  * Type: string
  * Default: https://login.salesforce.com
  * Valid Values: Valid URL with a scheme of ``https`` or ``http``
  * Importance: high

``salesforce.password.token``
  The Salesforce security token associated with the username.

  * Type: password
  * Default: null
  * Importance: high

``http.proxy``
  The HTTP(S) proxy host and port the connector should use when talking to Salesforce. This defaults to a blank string, which corresponds to not using a proxy.

  * Type: string
  * Default: null
  * Valid Values: Of the form ``<host>:<port>``, where ``<host>`` is a valid hostname or IP address and ``<port>`` is a valid port number
  * Importance: medium

``http.proxy.auth.scheme``
  The authentication scheme to use when authenticating the connector with the HTTP(S) proxy. The Basic and NTLM schemes are supported.

  * Type: string
  * Default: NONE
  * Valid Values: One of ``NONE``, ``NTLM``, ``BASIC``
  * Importance: medium

``http.proxy.user``
  The HTTP(S) proxy user name.

  * Type: string
  * Default: NONE
  * Importance: medium

``http.proxy.password``
  The HTTP(S) proxy password.
  * Type: password
  * Default: null
  * Importance: medium

``http.proxy.auth.ntlm.domain``
  The domain to authenticate within when the NTLM scheme is used.

  * Type: string
  * Default: null
  * Importance: medium

``connection.timeout``
  The amount of time to wait while connecting to the Salesforce streaming endpoint.

  * Type: long
  * Default: 30000
  * Valid Values: [5000,...,600000]
  * Importance: low

``curl.logging``
  If enabled, the logs output the equivalent curl commands. This is a security risk because your authorization header is displayed in the log file. **Use at your own risk.**

  * Type: boolean
  * Default: false
  * Importance: low

``request.max.retries.time.ms``
  The maximum time in milliseconds that the connector continues to retry requests to Salesforce that fail because of network issues (after authentication has succeeded). The backoff period for each retry attempt uses a randomization function that grows exponentially. If the total time spent retrying a request exceeds this duration (15 minutes by default), retries stop and the request fails, which will likely result in task failure.

  * Type: long
  * Default: 900000
  * Valid Values: [1,...]
  * Importance: low

``salesforce.version``
  The version of the Salesforce API to use.

  * Type: string
  * Default: latest
  * Valid Values: Matches regex( ^(latest|[\d\.]+)$ )
  * Importance: low

Salesforce SObject Sink
^^^^^^^^^^^^^^^^^^^^^^^

``salesforce.object``
  The Salesforce SObject to perform the sink operation on.

  * Type: string
  * Importance: high

``topics``
  One or more Kafka topics to use as data sources for SObjects or events.

  * Type: string
  * Importance: high

``behavior.on.api.errors``
  Error handling behavior for REST API calls to Salesforce. Must be one of ``fail``, ``ignore``, or ``log``. ``fail`` stops the connector, ``ignore`` continues to the next record, and ``log`` logs the error and continues to the next record.
  * Type: string
  * Default: fail
  * Valid Values: Matches regex( ^(log|ignore|fail)$ )
  * Importance: low

``override.event.type``
  A flag to indicate that the Kafka SObject source record EventType (create, update, delete) is overridden to use the operation specified in the ``salesforce.sink.object.operation`` configuration setting.

  * Type: boolean
  * Default: false
  * Importance: low
  * Dependents: ``salesforce.sink.object.operation``

``salesforce.custom.id.field.name``
  The name of a custom external ID field in the SObject used to structure REST API calls for insert, upsert, delete, and update operations. When ``salesforce.use.custom.id.field=true``, the operations substitute the value of the ``id`` field of source records in |ak| into the value of the specified custom external ID field for sink records. This allows the sink connector to match records for operations without having to specify the ``id`` field in Salesforce, which is auto-generated.

  * Type: string
  * Default: null
  * Importance: low

``salesforce.ignore.fields``
  A comma-separated list of fields from the source Kafka record to ignore when pushing a record into Salesforce.

  * Type: string
  * Default: ""
  * Importance: low

``salesforce.ignore.reference.fields``
  A flag to prevent reference type fields from being updated or inserted in Salesforce SObjects.

  * Type: boolean
  * Default: false
  * Importance: low

``salesforce.sink.object.operation``
  The Salesforce sink operation to perform on the SObject. One of ``insert``, ``update``, ``upsert``, or ``delete``. This setting takes effect only if ``override.event.type`` is true.

  * Type: string
  * Default: insert
  * Valid Values: Matches regex( ^(insert|update|upsert|delete)$ )
  * Importance: low

``salesforce.use.custom.id.field``
  A flag to indicate whether to use the ``salesforce.custom.id.field.name`` for all sink connector operations.
  * Type: boolean
  * Default: false
  * Importance: low
  * Dependents: ``salesforce.custom.id.field.name``

``topics.regex``
  A Java regular expression to use for matching data source topics.

  * Type: string
  * Default: null
  * Importance: low

|cp| license
^^^^^^^^^^^^

``confluent.topic.bootstrap.servers``
  A list of host/port pairs to use for establishing the initial connection to the Kafka cluster used for licensing. All servers in the cluster will be discovered from the initial connection. This list should be in the form ``host1:port1,host2:port2,...``. Because these servers are used only for the initial connection to discover the full cluster membership (which may change dynamically), this list does not need to contain the full set of servers (you may want more than one, though, in case a server is down).

  * Type: list
  * Importance: high

``confluent.topic``
  The name of the Kafka topic used for Confluent Platform configuration, including licensing information.

  * Type: string
  * Default: _confluent-command
  * Importance: low

``confluent.topic.replication.factor``
  The replication factor for the Kafka topic used for Confluent Platform configuration, including licensing information. This is used only if the topic does not already exist. The default of 3 is appropriate for production use. If you are using a development environment with fewer than 3 brokers, you must set this to the number of brokers (often 1).

  * Type: int
  * Default: 3
  * Importance: low

----------------------------
Confluent license properties
----------------------------

.. include:: ../../includes/security-info.rst

.. include:: ../../includes/platform-license.rst

.. include:: ../../includes/security-configs.rst

.. _salesforce-sobject-sink-license-topic-configuration:

.. include:: ../../includes/platform-license-detail.rst

.. include:: ../../includes/overriding-default-config-properties.rst
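To illustrate how the properties above fit together, the following is a minimal sketch of a sink configuration assembled as a Python dictionary (for example, to POST to the Kafka Connect REST API). The connector class name, topic, object, and all credential values are placeholders chosen for illustration, not values from this reference; substitute your own.

.. code-block:: python

    import json
    import re

    # Hypothetical SObject sink configuration built from the properties
    # documented above. Class name, topic, object, and credentials are
    # placeholder values.
    config = {
        "connector.class": "<sobject-sink-connector-class>",
        "salesforce.username": "user@example.com",
        "salesforce.password": "<password>",
        "salesforce.password.token": "<security-token>",
        "salesforce.consumer.key": "<consumer-key>",
        "salesforce.consumer.secret": "<consumer-secret>",
        "salesforce.object": "Lead",
        "topics": "sfdc-leads",
        "behavior.on.api.errors": "log",
        # The sink operation takes effect only when override.event.type
        # is true, per the property descriptions above.
        "override.event.type": "true",
        "salesforce.sink.object.operation": "upsert",
    }

    # Sanity-check two constraints stated in this reference: the
    # error-handling setting must match ^(log|ignore|fail)$ and the sink
    # operation must match ^(insert|update|upsert|delete)$.
    assert re.fullmatch(r"log|ignore|fail", config["behavior.on.api.errors"])
    assert re.fullmatch(r"insert|update|upsert|delete",
                        config["salesforce.sink.object.operation"])
    assert config["override.event.type"] == "true"

    print(json.dumps(config, indent=2))

Because ``topics`` and ``topics.regex`` are alternative ways to select source topics, set only one of them in a given configuration.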