Salesforce Bulk API Source Connector for Confluent Platform


If you are using Confluent Cloud, see Salesforce Bulk API Source Connector for Confluent Cloud for the cloud Quick Start.

The Salesforce Bulk API Source connector integrates Salesforce with Apache Kafka®. The connector pulls records and captures changes from Salesforce using the Salesforce Bulk Query API.

Salesforce objects (SObjects) are the standard entities that Salesforce provides, such as Account and Contact. The SalesforceBulkApiSourceConnector can be used to pull these objects and capture changes to them, writing the results to Kafka topics. You can use this connector with either standalone or distributed Connect workers.


The Salesforce Bulk API Source connector includes the following features:

At least once delivery

This connector guarantees that records are delivered at least once to the Kafka topic. If the connector restarts, there may be some duplicate records in the Kafka topic.

Supports one task

The Salesforce Bulk API Source connector supports running only one task.

Configuration Properties

For a complete list of configuration properties for this connector, see Salesforce Bulk API Source Connector Configuration Properties.


Note the following when using the Salesforce Bulk API source connector.


While the connector operates, it periodically records the time of its last query execution in the Connect offset topic. When the connector is restarted, it fetches Salesforce objects with a LastModifiedDate later than the last recorded query time.
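Conceptually, each polling cycle issues a Bulk Query scoped by that stored timestamp. The query below is only an illustrative sketch (the exact SOQL the connector generates is internal to the connector), assuming the Contact object and an example timestamp:

SELECT Id, FirstName, LastName, LastModifiedDate
FROM Contact
WHERE LastModifiedDate > 2019-08-01T05:08:58Z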

API Limits

The Salesforce Bulk API connector supports only non-compound fields. For example, Bulk Query does not support compound address and geolocation fields, so the connector discards them.
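For instance, the Contact object's compound MailingAddress field cannot be selected in a Bulk Query, but its component fields can. A query along the lines of the following sketch works, whereas selecting MailingAddress directly would be rejected:

SELECT Id, MailingStreet, MailingCity, MailingState, MailingPostalCode, MailingCountry
FROM Contact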

SObjects Limitation

The following Salesforce object (SObject) error message may be displayed when you are using the Salesforce Bulk API Source connector:

Entity 'Order' is not supported to use PKChunking.

Set the configuration property batch.enable=false for these SObjects. This property is available for Salesforce Bulk API Source connector version 1.7.0 (or later).
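For example, if you stream the Order object, the connector configuration would include lines like the following sketch:

salesforce.object=Order
batch.enable=false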

Salesforce Object Support

The following objects from Salesforce are supported in this version of Kafka Connect Salesforce Bulk API Source connector:

  • Account
  • Campaign
  • CampaignMember
  • Case
  • Contact
  • Contract
  • Event
  • Group
  • Lead
  • Opportunity
  • OpportunityContactRole
  • OpportunityLineItem
  • Period
  • PricebookEntry
  • Product2
  • Task
  • TaskFeed
  • TaskRelation
  • User
  • UserRole

The Kafka Connect Salesforce Bulk API Source connector also supports custom objects with non-compound fields.
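Custom API names in Salesforce carry the __c suffix (compare the Level__c and Languages__c custom fields in the sample data below). A hypothetical custom object would be configured like this sketch:

salesforce.object=MyCustomObject__c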

The following objects are not supported by Salesforce Bulk API:

  • Feed (e.g. AccountFeed, AssetFeed, …)
  • Share (e.g. AccountBrandShare, ChannelProgramLevelShare, …)
  • History (e.g. AccountHistory, ActivityHistory, …)
  • EventRelation (e.g. AcceptedEventRelation, DeclinedEventRelation, …)
  • AggregateResult
  • AttachedContentDocument
  • CaseStatus
  • CaseTeamMember
  • CaseTeamRole
  • CaseTeamTemplate
  • CaseTeamTemplateMember
  • CaseTeamTemplateRecord
  • CombinedAttachment
  • ContentFolderItem
  • ContractStatus
  • EventWhoRelation
  • FolderedContentDocument
  • KnowledgeArticleViewStat
  • KnowledgeArticleVoteStat
  • LookedUpFromActivity
  • Name
  • NoteAndAttachment
  • OpenActivity
  • OwnedContentDocument
  • PartnerRole
  • RecentlyViewed
  • ServiceAppointmentStatus
  • SolutionStatus
  • TaskPriority
  • TaskStatus
  • TaskWhoRelation
  • UserRecordAccess
  • WorkOrderLineItemStatus
  • WorkOrderStatus

Quick Start

In this quick start, the Salesforce Bulk API source connector is used to import data from Salesforce to Kafka.

  1. Create a Salesforce developer account if you don’t already have one.
  2. Add records to the objects by clicking App Launcher and selecting the required Salesforce object.

Install the connector through the Confluent Hub Client.

# run from your CP installation directory
confluent-hub install confluentinc/kafka-connect-salesforce-bulk-api:latest


By default, the Confluent Hub Client installs the plugin into the share/confluent-hub-components directory and adds the directory to the plugin path. For the plugin path change to take effect, you must restart the Connect worker.
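If you manage the worker configuration yourself, the corresponding entry looks like the following sketch (the install prefix shown is an assumption; adjust it to your Confluent Platform installation directory):

# Connect worker properties; /opt/confluent is an assumed install prefix
plugin.path=/opt/confluent/share/confluent-hub-components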

Start the services using the Confluent CLI.

confluent local start

Every service starts in order, printing a message with its status.

Starting Zookeeper
Zookeeper is [UP]
Starting Kafka
Kafka is [UP]
Starting Schema Registry
Schema Registry is [UP]
Starting Kafka REST
Kafka REST is [UP]
Starting Connect
Connect is [UP]
Starting KSQL Server
KSQL Server is [UP]
Starting Control Center
Control Center is [UP]


The SalesforceBulkApiSourceConnector supports a single task only.

Property-based example

Create a configuration file named salesforce-bulk-api-source.properties with the following contents. This configuration is typically used with standalone workers.

salesforce.username=< Required Configuration >
salesforce.password=< Required Configuration >
salesforce.password.token=< Required Configuration >
salesforce.object=< Required Configuration >
salesforce.since=< Required Configuration >
kafka.topic=< Required Configuration >
salesforce.instance=< Required Configuration >
confluent.license=Omit to enable trial mode

Before starting the connector, make sure that all required configuration properties are set.

Then start the Salesforce Bulk API source connector by loading its configuration with the following command.


You must include a double dash (--) between the connector name and your flag. For more information, see this post.

confluent local load salesforce-bulk-api-source -- -d salesforce-bulk-api-source.properties

{
   "name" : "SalesforceBulkApiSourceConnector",
   "config" : {
     "connector.class" : "io.confluent.connect.salesforce.SalesforceBulkApiSourceConnector",
     "tasks.max" : "1",
     "key.converter": "",
     "value.converter": "io.confluent.connect.avro.AvroConverter",
     "value.converter.schema.registry.url": "http://localhost:8081",
     "kafka.topic" : "< Required Configuration >",
     "salesforce.password" : "< Required Configuration >",
     "salesforce.password.token" : "< Required Configuration >",
     "salesforce.object" : "< Required Configuration >",
     "salesforce.username" : "< Required Configuration >",
     "salesforce.since" : "< Required Configuration >",
     "confluent.topic.bootstrap.servers": "localhost:9092",
     "confluent.topic.replication.factor": "1",
     "confluent.license": ""
   },
   "tasks": []
}

Check that the connector started successfully. Review the Connect worker’s log by entering the following:

confluent local log connect

Confirm that the connector is in a RUNNING state.

confluent local status SalesforceBulkApiSourceConnector

Confirm that the messages are being sent to Kafka.

kafka-avro-console-consumer \
    --bootstrap-server localhost:9092 \
    --property schema.registry.url=http://localhost:8081 \
    --topic <topic-name> \
    --from-beginning | jq '.'

REST-based example

This configuration is typically used with distributed workers. Write the following JSON to connector.json, configure all of the required values, and use the command below to post the configuration to one of the distributed Connect workers. See the Kafka Connect REST Interface for more information.

Connect Distributed REST example:

  "name" : "SalesforceBulkApiSourceConnector",
  "config" : {
    "connector.class": "io.confluent.connect.salesforce.SalesforceBulkApiSourceConnector",
    "tasks.max" : "1",
    "key.converter": "",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "kafka.topic" : "< Required Configuration >",
    "salesforce.password" : "< Required Configuration >",
    "salesforce.password.token" : "< Required Configuration >",
    "salesforce.object" : "< Required Configuration >",
    "salesforce.username" : "< Required Configuration >",
    "salesforce.since" : "< Required Configuration >",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "confluent.license": " Omit to enable trial mode "


Change the confluent.topic.bootstrap.servers property to include your broker address(es), and change the confluent.topic.replication.factor to 3 for staging or production use.

Use curl to post a configuration to one of the Kafka Connect Workers. Change http://localhost:8083/ to the endpoint of one of your Kafka Connect worker(s).

Create a new connector:

curl -sS -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors

Update an existing connector:

curl -s -X PUT -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors/SalesforceBulkApiSourceConnector/config
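After posting the configuration, you can confirm that the connector is running by querying the status endpoint of the Kafka Connect REST API:

curl -s http://localhost:8083/connectors/SalesforceBulkApiSourceConnector/status | jq '.'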

Sample data formats

The following examples show the JSON document structure of Salesforce Bulk Query results (for the Contact object) as received by the Salesforce connector, converted to a Kafka record, and then stored in a topic.

Raw JSON received from Salesforce Bulk Query:

  "attributes" : {
      "type" : "Contact",
      "url" : "/services/data/v47.0/sobjects/Contact/0032v00002qXTBlAAO"
  "Id" : "0032v00002qXTBlAAO",
  "IsDeleted" : false,
  "MasterRecordId" : null,
  "AccountId" : "0012v00002RkgUVAAZ",
  "LastName" : "Gonzalez",
  "FirstName" : "Rose",
  "Salutation" : "Ms.",
  "Name" : "Rose Gonzalez",
  "OtherStreet" : null,
  "OtherCity" : null,
  "OtherState" : null,
  "OtherPostalCode" : null,
  "OtherCountry" : null,
  "OtherLatitude" : null,
  "OtherLongitude" : null,
  "OtherGeocodeAccuracy" : null,
  "MailingStreet" : "313 Constitution Place\nAustin, TX 78767\nUSA",
  "MailingCity" : null,
  "MailingState" : null,
  "MailingPostalCode" : null,
  "MailingCountry" : null,
  "MailingLatitude" : null,
  "MailingLongitude" : null,
  "MailingGeocodeAccuracy" : null,
  "Phone" : "(512) 757-6000",
  "Fax" : "(512) 757-9000",
  "MobilePhone" : "(512) 757-9340",
  "HomePhone" : null,
  "OtherPhone" : null,
  "AssistantPhone" : null,
  "ReportsToId" : null,
  "Email" : "",
  "Title" : "SVP, Procurement",
  "Department" : "Procurement",
  "AssistantName" : null,
  "LeadSource" : "Trade Show",
  "Birthdate" : "1967-07-14",
  "Description" : null,
  "OwnerId" : "0052v00000ajtG3AAI",
  "CreatedDate" : 1564636138000,
  "CreatedById" : "0052v00000ajtG3AAI",
  "LastModifiedDate" : 1564636138000,
  "LastModifiedById" : "0052v00000ajtG3AAI",
  "SystemModstamp" : 1564636138000,
  "LastActivityDate" : null,
  "LastCURequestDate" : null,
  "LastCUUpdateDate" : null,
  "LastViewedDate" : 1573528066000,
  "LastReferencedDate" : 1573528066000,
  "EmailBouncedReason" : null,
  "EmailBouncedDate" : null,
  "IsEmailBounced" : false,
  "PhotoUrl" : "/services/images/photo/0032v00002qXTBlAAO",
  "Jigsaw" : null,
  "JigsawContactId" : null,
  "CleanStatus" : "Pending",
  "IndividualId" : null,
  "Level__c" : "Primary",
  "Languages__c" : "English"

Kafka record value (abridged): nullable Salesforce fields are represented as Avro unions, so non-null values appear wrapped with their type in the JSON rendering:

    "string":"Rose Gonzalez"
    "string":"313 Constitution Place\nAustin, TX 78767\nUSA"
    "string":"(512) 757-6000"
    "string":"(512) 757-9000"
    "string":"(512) 757-9340"
    "string":"SVP, Procurement"
    "string":"Trade Show"