ServiceNow Source Connector for Confluent Platform

The Kafka Connect ServiceNow Source connector moves creations and updates in a ServiceNow table to Apache Kafka® in real time. The connector consumes data from a ServiceNow table and writes a record to a Kafka topic for each created or updated row.

Features

The ServiceNow Source connector includes the following features:

At least once delivery

This connector guarantees that records are delivered to the Kafka topic at least once. If the connector restarts, there may be some duplicate records in the Kafka topic.

Supports one task

The ServiceNow Source connector supports running only one task; each task handles a single ServiceNow table.

Delivery guarantee

Under normal operation, the connector delivers each ServiceNow update to Kafka only once. However, duplicates are still possible in several situations, including temporary network degradation, non-graceful shutdowns, and rescheduling of connectors and tasks.

Automatic retries

The connector may experience network failures when connecting to the ServiceNow endpoint. When this happens, the connector automatically retries polling the endpoint. The retry.max.times property controls how many retries are attempted, and an exponential backoff is added to each retry interval.
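
For example, you can raise the retry limit by adding the property to the connector configuration used later in the quick start. A minimal sketch with an illustrative value (not a documented default):

    "retry.max.times": "5"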

Supports HTTPS proxy

The connector can connect to ServiceNow using an HTTPS proxy server.

Elasticity

The connector allows you to configure two parameters that enforce the throughput limit: batch.max.rows and poll.interval.s. By default, the connector uses a batch size of 10000 records and a 20-second poll interval. If a large number of updates occur within a given interval, the connector paginates the records according to the configured batch size. Note that because ServiceNow provides precision down to one second, one second is the lowest supported value for the poll.interval.s configuration property.
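
For example, both throughput properties can be set in the connector configuration shown in the quick start. A minimal sketch with illustrative values (not the defaults described above):

    "batch.max.rows": "5000",
    "poll.interval.s": "30"

With these values, the connector polls ServiceNow every 30 seconds and splits larger result sets into pages of at most 5000 records.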

Real-time and historical lookup

The connector supports starting from a specific point in the table's history. It can retrieve all historical records in one run and then catch up with real-time records.
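
The starting point is set with the servicenow.since property, as in the quick start configuration later in this document. A minimal sketch:

    "servicenow.since": "2019-01-01"

With this setting, the connector first exports all table records from that date forward and then continues with changes as they arrive.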

Limitations

The ServiceNow Source connector has the following limitations:

  • The connector does not support the following table types:
    • Sys Audit (sys_audit)
    • Audit Relationship Change (sys_audit_relation)
    • History (sys_history_line)
  • The ServiceNow Source connector interprets all table fields as strings. Use the Cast Single Message Transformation (SMT) to work around this limitation, as shown in the sketch after this list.
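
The Cast SMT ships with Apache Kafka®, so it can be added directly to the connector configuration. A minimal sketch for the value side; the field names and target types are hypothetical and depend on your table schema:

    "transforms": "castTypes",
    "transforms.castTypes.type": "org.apache.kafka.connect.transforms.Cast$Value",
    "transforms.castTypes.spec": "sys_mod_count:int32,active:boolean"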

License

You can use this connector for a 30-day trial period without a license key.

After 30 days, you must purchase a connector subscription, which includes Confluent enterprise license keys for subscribers, along with enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, you can contact Confluent Support at support@confluent.io for more information.

For license properties, see Confluent Platform license. For information about the license topic, see License topic configuration.

Configuration Properties

For a complete list of configuration properties for this connector, see ServiceNow Source connector Configuration Properties.

For an example of how to get Kafka Connect connected to Confluent Cloud, see Connect Self-Managed Kafka Connect to Confluent Cloud.

Install the ServiceNow Source Connector

You can install this connector by using the Confluent Hub client installation instructions or by manually downloading the ZIP file.

Prerequisites

Important

You must install the connector on every machine where Connect will run.

  • Kafka Broker: Confluent Platform 3.3.0 or later, or Kafka 0.11.0 or later.

  • Connect: Confluent Platform 4.1.0 or later, or Kafka 1.1.0 or later.

  • ServiceNow API: Paris, Orlando, New York, Madrid, or London.

  • An installation of the Confluent Hub Client. This is installed by default with Confluent Enterprise.

  • An installation of the latest connector version.

    To install the latest connector version, navigate to your Confluent Platform installation directory and run the following command:

    confluent-hub install confluentinc/kafka-connect-servicenow:latest
    

    You can install a specific version by replacing latest with a version number as shown in the following example:

    confluent-hub install confluentinc/kafka-connect-servicenow:2.5.0
    

Install the connector manually

Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.

Quick start

The quick start guide uses the ServiceNow Source connector to consume records from a ServiceNow table and send them to Kafka. This guide assumes a multi-tenant environment. For local testing, refer to Running Connect in standalone mode.

  1. Install the connector through the Confluent Hub Client.

    # run from your confluent platform installation directory
    confluent-hub install confluentinc/kafka-connect-servicenow:latest
    
  2. Start the Confluent Platform.

    Tip

    The command syntax for the Confluent CLI development commands changed in 5.3.0. These commands have been moved to confluent local. For example, the syntax for confluent start is now confluent local services start. For more information, see confluent local.

    confluent local services start
    
  3. Check the status of all services.

    confluent local services status
    
  4. Create a servicenow-source.json file with the following contents. Substitute the <> placeholders with your configuration, and leave confluent.license empty to use the trial (evaluation) license:

    {
        "name": "ServiceNowSourceConnector",
        "config": {
            "connector.class": "io.confluent.connect.servicenow.ServiceNowSourceConnector",
            "kafka.topic": "topic-servicenow",
            "servicenow.url": "https://<endpoint>.service-now.com/",
            "tasks.max": "1",
            "servicenow.table": "<table_name>",
            "servicenow.user": "<username>",
            "servicenow.password": "<password>",
            "servicenow.since": "2019-01-01",
            "key.converter": "org.apache.kafka.connect.json.JsonConverter",
            "value.converter": "org.apache.kafka.connect.json.JsonConverter",
            "confluent.topic.bootstrap.servers": "<server hostname>:9092",
            "confluent.license": "<license>", // leave it empty for evaluation license
            "poll.interval.s": "10",
            "confluent.topic.replication.factor": "1"
        }
    }
    
  5. Load the ServiceNow Source connector by posting the configuration to the Connect REST server.

    Caution

    You must include a double dash (--) between the topic name and your flag. For more information, see this post.

    confluent local services connect connector load ServiceNowSourceConnector --config servicenow-source.json
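
    Alternatively, because servicenow-source.json already contains both the name and config fields, you can post it directly to the Kafka Connect REST API. A sketch, assuming the Connect worker listens on the default REST port 8083:

    # post the connector name and configuration in one request
    curl -X POST -H "Content-Type: application/json" \
        --data @servicenow-source.json \
        http://localhost:8083/connectors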
    
  6. Confirm that the connector is in a RUNNING state.

    confluent local services connect connector status ServiceNowSourceConnector
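
    You can also query the Kafka Connect REST API directly for the same information. A sketch, assuming the default REST port 8083:

    # returns the state of the connector and its task
    curl http://localhost:8083/connectors/ServiceNowSourceConnector/status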
    
  7. Create a record in ServiceNow.

    curl -X POST \
        https://<endpoint>.service-now.com/api/now/table/<table_name> \
        -H 'Accept: application/json' \
        -H 'Authorization: Basic <token>' \
        -H 'Content-Type: application/json' \
        -H 'cache-control: no-cache' \
        -d '{"short_description": "This is test"}'
    
  8. Confirm the messages were delivered to the topic-servicenow topic in Kafka.

    confluent local services kafka consume topic-servicenow --from-beginning
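
    If you prefer the standard console consumer, an equivalent check is sketched below, assuming the local broker listens on localhost:9092:

    # read all records from the beginning of the topic
    kafka-console-consumer --bootstrap-server localhost:9092 \
        --topic topic-servicenow \
        --from-beginning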