Zendesk Source Connector for Confluent Platform¶
Zendesk Support is a system for tracking, prioritizing, and solving customer
support tickets. The Kafka Connect Zendesk Source connector copies data into
Apache Kafka® from various Zendesk support tables such as tickets,
ticket_audits, ticket_fields, groups, organizations, satisfaction_ratings, and
others, using the Zendesk Support API. You can find the list of supported
Zendesk tables in the Supported Tables section.
Features¶
The Zendesk Source connector offers the following features:
- At least once delivery
- Supports one task
- Quick turnaround
- Schema detection and evolution
- Real-time and historical lookup
- Automatic retries
- Intelligent backoffs
- Resource balance and throughput
At least once delivery¶
The connector guarantees no loss of messages from Zendesk to Kafka. Messages may be reprocessed because of task failure or API limits, which may cause duplication.
Supports one task¶
The Zendesk Source connector supports running only one task.
Quick turnaround¶
The Zendesk connector ensures that data between your Zendesk tables and the
corresponding Kafka topics is synced quickly, without unnecessary lag. The poll
frequency for each table has been specifically configured based on the size of
the table, so that larger and more dynamic tables, like tickets, are polled
more frequently than static tables, like organizations.
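As a minimal sketch, the poll.interval.ms property (also used in the quick start configuration later in this document) sets the base interval at which the connector polls Zendesk; the per-table pacing described above is handled by the connector itself:
// Base polling interval in milliseconds (illustrative value from the quick start);
// per-table poll frequency is managed by the connector based on table size.
"poll.interval.ms": 1000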
Schema detection and evolution¶
The connector supports automatic schema detection and backward compatible schema evolution for all supported tables.
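To use schema detection and evolution with a Schema Registry-based output format, configure a Schema Registry-aware value converter. A minimal sketch, assuming Schema Registry is reachable at http://localhost:8081 and using the Avro converter as one example of a Schema Registry-based format:
// Register detected schemas with Schema Registry so they can evolve
// in a backward compatible way (Avro shown as one example format).
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081"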
Real-time and historical lookup¶
The connector supports fetching all past historical records for all tables. It
can also be configured to pull in data starting from a specified time in the
past (see the configuration property zendesk.since).
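For example, the following property (value taken from the quick start configuration later in this document) restricts the lookup to records from the given date onward:
// Pull data starting from this point in the past; the date shown is illustrative.
"zendesk.since": "2019-08-01"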
Automatic retries¶
In case of a connection error between the API server and Kafka Connect, the
connector may receive a not-OK response from the API server or no response at
all. In such cases, the connector can be made robust using the automatic retry
mechanism with linear backoff, controlled by the configuration properties
max.retries and retry.backoff.ms.
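A minimal sketch of these retry properties, with illustrative values (not necessarily the connector defaults):
// Retry failed requests up to 10 times, backing off linearly with a
// 3 second base interval (values are illustrative, not defaults).
"max.retries": 10,
"retry.backoff.ms": 3000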
Intelligent backoffs¶
If the connector hits Support API rate limits (too many requests), it intelligently spaces out the HTTP fetch operations to ensure a smooth balance between recency, API limits, and back pressure.
Resource balance and throughput¶
Some resources within Zendesk are created and updated at a much greater rate
than other Zendesk resources. These resources can be balanced among the workers,
with reduced hot-spotting, by keeping the resources in the zendesk.tables
configuration property sorted by their expected cardinality. The tasks.max,
max.in.flight.requests, and max.batch.size configuration properties can also be
used to improve overall throughput.
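A hedged sketch of these properties is shown below; the table ordering and values are illustrative and assume that tickets and ticket_audits change far more often than groups and organizations:
// Tables kept sorted by expected cardinality, as described above
// (the ordering here is an assumption about a typical workload).
"zendesk.tables": "tickets,ticket_audits,organizations,groups",
// Throughput tuning; the values below are illustrative, not defaults.
"tasks.max": 1,
"max.in.flight.requests": 10,
"max.batch.size": 100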
Supported Tables¶
The following tables from Zendesk are supported in this version of the Kafka Connect Zendesk Source connector:
- custom_roles
- groups
- group_memberships
- organizations
- organization_subscriptions
- organization_memberships
- satisfaction_ratings
- tickets
- ticket_audits
- ticket_fields
- ticket_metrics
- users
Limitations¶
For Schema Registry-based output formats, the connector tries to deduce the schema based on the source API response returned. The connector registers a new schema for every NULL and NOT NULL value of an optional field in the API response. For this reason, the connector may register schema versions at a much higher rate than expected.
License¶
You can use this connector for a 30-day trial period without a license key.
After 30 days, you must purchase a connector subscription which includes Confluent enterprise license keys to subscribers, along with enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, you can contact Confluent Support at support@confluent.io for more information.
For license properties, see Confluent Platform license and for information about the license topic, see License topic configuration.
Configuration properties¶
For a complete list of configuration properties for this connector, see Configuration Reference for Zendesk Source Connector for Confluent Platform.
For an example of how to get Kafka Connect connected to Confluent Cloud, see Connect Self-Managed Kafka Connect to Confluent Cloud.
Install the Zendesk Source connector¶
You can install this connector by using the Confluent Hub client installation instructions or by manually downloading the ZIP file.
Prerequisites¶
- You must install the connector on every machine where Connect will run.
- Kafka Broker: Confluent Platform 3.3.0 or later, or Kafka 0.11.0 or later.
- Kafka Connect: Confluent Platform 4.1.0 or later, or Kafka 1.1.0 or later.
- Java 8+. Note that Java 8 is deprecated in versions 7.2 and later of Confluent Platform. For more details, view Java compatibility with Confluent Platform by version.
- Zendesk account type: Certain tables, such as custom_roles, can only be accessed if the Zendesk account is an Enterprise account. For more details, see Custom Agent Roles.
- Zendesk API: Support APIs should be enabled for the Zendesk account. Also, either the oauth2 or password mechanism should be enabled in the Zendesk account (see the authentication example following this list). For more information, see Using the API dashboard: Enabling password or token access.
- Zendesk settings: Some settings may need to be enabled to ensure export is possible. For example, satisfaction_ratings can only be exported if it is enabled. For more details, see Support API: Satisfaction Ratings.
- Because every table requires a different role to be fetched by the connector, Confluent recommends you check the Zendesk API for the role required to use this connector.
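The authentication example referenced above is sketched here using basic (password) authentication with the property values from the quick start; OAuth access instead requires the corresponding properties described in the configuration reference:
// Basic (password) authentication; replace the <> placeholders with
// your Zendesk subdomain and credentials.
"zendesk.auth.type": "basic",
"zendesk.url": "https://<sub-domain>.zendesk.com",
"zendesk.user": "<username>",
"zendesk.password": "<password>"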
Install the connector using the Confluent CLI¶
To install the latest
connector version, navigate to your Confluent Platform installation directory and run the following command:
confluent connect plugin install confluentinc/kafka-connect-zendesk:latest
You can install a specific version by replacing latest
with a version
number as shown in the following example:
confluent connect plugin install confluentinc/kafka-connect-zendesk:1.0.1
Install the connector manually¶
Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.
Quick Start¶
In this quick start guide, the Zendesk connector is used to consume records from
a Zendesk resource called tickets and send the records to a Kafka topic named
ZD_tickets.
To run this quick start, ensure you have a Zendesk Developer Account.
Install the connector through the Confluent Hub Client.
# run from your confluent platform installation directory
confluent connect plugin install confluentinc/kafka-connect-zendesk:latest
Start the Confluent Platform.
confluent local services start
Check the status of all services.
confluent local services status
Configure your connector by first creating a JSON file named zendesk.json with the following properties.
// substitute <> with your config
{
    "name": "ZendeskConnector",
    "config": {
        "connector.class": "io.confluent.connect.zendesk.ZendeskSourceConnector",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
        "confluent.topic.bootstrap.servers": "127.0.0.1:9092",
        "confluent.topic.replication.factor": 1,
        "confluent.license": "<license>", // leave it empty for evaluation license
        "tasks.max": 1,
        "poll.interval.ms": 1000,
        "topic.name.pattern": "ZD_${entityName}",
        "zendesk.auth.type": "basic",
        "zendesk.url": "https://<sub-domain>.zendesk.com",
        "zendesk.user": "<username>",
        "zendesk.password": "<password>",
        "zendesk.tables": "tickets",
        "zendesk.since": "2019-08-01"
    }
}
Start the Zendesk Source connector by loading the connector’s configuration with the following command:
confluent local services connect connector load zendesk --config zendesk.json
Confirm that the connector is in a RUNNING state.
confluent local services connect connector status ZendeskConnector
Create one ticket record using the Zendesk API as follows.
curl https://{subdomain}.zendesk.com/api/v2/tickets.json \
  -d '{"ticket": {"subject": "My printer is on fire!", "comment": { "body": "The smoke is very colorful." }}}' \
  -H "Content-Type: application/json" -v -u {email_address}:{password} -X POST
Confirm the messages were delivered to the ZD_tickets topic in Kafka. Note that it may take a minute before the record populates the topic.
confluent local services kafka consume ZD_tickets --from-beginning