ServiceNow Sink Connector for Confluent Platform

The Kafka Connect ServiceNow Sink Connector is used to capture Apache Kafka® records and sink them to a ServiceNow Table in real-time. Data is consumed using the ServiceNow Table API.

Features

The ServiceNow Sink Connector includes the following features:

At least once delivery

This connector guarantees that records are delivered at least once from the Kafka topic.

Dead Letter Queue

This connector supports the Dead Letter Queue (DLQ) functionality. For information about accessing and using the DLQ, see Confluent Platform Dead Letter Queue.

Multiple tasks

The ServiceNow Sink Connector supports running one or more tasks. You can specify the number of tasks in the tasks.max configuration parameter. Running multiple tasks can significantly improve throughput when consuming from topics with multiple partitions.

Multiple HTTP request methods

The connector supports the POST, DELETE, and PUT HTTP request methods. Note that this connector does not support PATCH, because PUT and PATCH behave identically in the ServiceNow Table API. The request method used for each record is chosen dynamically:

  • POST is chosen when the record key is a tombstone value (null), or when the key is a struct that does not contain a sysId field
  • DELETE is chosen when the record value is a tombstone value (null)
  • PUT is chosen when there is both a valid key and value in the record
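The selection rules above can be sketched as follows. This is a minimal illustration of the documented behavior, not the connector's actual code; the dict-based record representation is an assumption:

```python
def choose_request_method(key, value):
    """Pick the HTTP method per the rules described above.

    `key` is the record key (a dict standing in for a Connect Struct, or
    None for a tombstone); `value` is the record value (or None).
    """
    if value is None:
        # Tombstone value: delete the existing ServiceNow record.
        return "DELETE"
    if key is None or "sysId" not in key:
        # Tombstone key, or a struct key without sysId: create a new record.
        return "POST"
    # Valid key (with sysId) and valid value: update the existing record.
    return "PUT"
```

For example, `choose_request_method(None, {"u_name": "tape"})` selects POST, while `choose_request_method({"sysId": "abc"}, None)` selects DELETE.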

Supports HTTPS proxy

The connector can connect to ServiceNow using an HTTPS proxy server.

Result reporting

The connector supports result reporting. For a successful HTTP response, the connector reports a record to the configured success topic: the key is the sysId of the created or updated ServiceNow table record, and the value is a Struct with the fields requestMethod, statusCode, and responseString, where responseString is the response body parsed as a string. For a non-successful HTTP response, the report record's key is the original sysId provided, and the value is a Struct with the same schema. Note that for a failed POST request, there is no sysId to report.
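The shape of the reported value can be sketched as a plain dict. This mirrors only the field names described above; the helper itself is hypothetical, not connector code:

```python
def build_report_value(request_method: str, status_code: int, response_body: bytes) -> dict:
    """Assemble a dict shaped like the reporter value Struct described above."""
    return {
        "requestMethod": request_method,
        "statusCode": status_code,
        # responseString is the HTTP response body parsed as a string.
        "responseString": response_body.decode("utf-8"),
    }
```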

Limitations

The ServiceNow Sink Connector only supports HTTP Basic Authentication. It does not support mutual TLS (mTLS).
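HTTP Basic Authentication simply base64-encodes `user:password` into an Authorization header (RFC 7617). A quick sketch of the header the connector sends on each request; the credential values are placeholders:

```python
import base64


def basic_auth_header(username: str, password: str) -> dict:
    """Build an HTTP Basic Authentication header per RFC 7617."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}
```

Because the credentials are only base64-encoded, not encrypted, they are protected solely by the TLS connection to the ServiceNow instance.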

Install the ServiceNow Sink Connector

You can install this connector by using the Confluent Hub client installation instructions or by manually downloading the ZIP file.

Prerequisites

Important

You must install the connector on every machine where Connect will run.

  • An installation of the Confluent Hub Client.

    Note

    This is installed by default with Confluent Enterprise.

  • An installation of the latest connector version.

    To install the latest connector version, navigate to your Confluent Platform installation directory and run the following command:

    confluent-hub install confluentinc/kafka-connect-servicenow:latest
    

    You can install a specific version by replacing latest with a version number as shown in the following example:

    confluent-hub install confluentinc/kafka-connect-servicenow:1.0.2-preview
    

Install the connector manually

Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.

License

You can use this connector for a 30-day trial period without a license key.

After 30 days, this connector is available under a Confluent enterprise license. Confluent issues Confluent enterprise license keys to subscribers, along with providing enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, please contact Confluent Support at support@confluent.io for more information.

See Confluent Platform license for license properties and License topic configuration for information about the license topic.

Configuration Properties

For a complete list of configuration properties for this connector, see ServiceNow Sink Connector Configuration Properties.

Note

For an example of how to get Kafka Connect connected to Confluent Cloud, see Distributed Cluster.

Prerequisites

The following are required to run the Kafka Connect ServiceNow Sink Connector:

  • Kafka Broker: Confluent Platform 3.3.0 or above, or Kafka 0.11.0 or above
  • Connect: Confluent Platform 4.1.0 or above, or Kafka 1.1.0 or above
  • ServiceNow API: Paris, Orlando, New York, Madrid, or London releases.

Quick Start

This quick start guide uses the ServiceNow Sink Connector to consume records from Kafka and send them to a ServiceNow table. The guide assumes a multi-tenant environment. For local testing, see Running Connect in standalone mode.

  1. Create a table called test_table in ServiceNow.

    ../_images/servicenow_create_table.png
  2. Define three columns in the table.

    ../_images/servicenow_define_columns.png
  3. Install the connector through the Confluent Hub Client.

    # run from your confluent platform installation directory
    confluent-hub install confluentinc/kafka-connect-servicenow:latest
    
  4. Start the Confluent Platform.

    Tip

    The command syntax for the Confluent CLI development commands changed in 5.3.0. These commands have been moved to confluent local. For example, the syntax for confluent start is now confluent local services start. For more information, see confluent local.

    confluent local services start
    
  5. Check the status of all services.

    confluent local services status
    
  6. Create a servicenow-sink.json file with the following contents:

    Note

    All user-defined tables in ServiceNow start with u_.

     // substitute <> placeholders with your configuration values
     {
        "name": "ServiceNowSinkConnector",
        "config": {
            "connector.class": "io.confluent.connect.servicenow.ServiceNowSinkConnector",
            "topics": "test_table",
            "servicenow.url": "https://<endpoint>.service-now.com/",
            "tasks.max": "1",
            "servicenow.table": "u_test_table",
            "servicenow.user": "<username>",
            "servicenow.password": "<password>",
            "key.converter": "io.confluent.connect.avro.AvroConverter",
            "key.converter.schema.registry.url": "http://localhost:8081",
            "value.converter": "io.confluent.connect.avro.AvroConverter",
            "value.converter.schema.registry.url": "http://localhost:8081",
            "confluent.topic.bootstrap.servers": "localhost:9092",
            "confluent.license": "<license>", // leave it empty for evaluation license
            "confluent.topic.replication.factor": "1",
            "reporter.bootstrap.servers": "localhost:9092",
            "reporter.error.topic.name": "test-error",
            "reporter.error.topic.replication.factor": 1,
            "reporter.error.topic.key.format": "string",
            "reporter.error.topic.value.format": "string",
            "reporter.result.topic.name": "test-result",
            "reporter.result.topic.key.format": "string",
            "reporter.result.topic.value.format": "string",
            "reporter.result.topic.replication.factor": 1
        }
    }
    

    Note

    For details about using this connector with Kafka Connect Reporter, see Connect Reporter.

  7. Load the ServiceNow Sink Connector by posting its configuration to the Connect REST server.

    Caution

    You must include a double dash (--) between the topic name and your flag.

    confluent local services connect connector load ServiceNowSinkConnector --config servicenow-sink.json
    
  8. Confirm that the connector is in a RUNNING state.

    confluent local services connect connector status ServiceNowSinkConnector
    
  9. To produce some records into the test_table topic, first start a Kafka producer.

    Note

    All user-defined columns in ServiceNow start with u_.

    kafka-avro-console-producer \
    --broker-list localhost:9092 --topic test_table \
    --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"u_name","type":"string"},
    {"name":"u_price", "type": "float"}, {"name":"u_quantity", "type": "int"}]}'
    
  10. The console producer is now waiting for input, so you can go ahead and insert some records into the topic.

    {"u_name": "scissors", "u_price": 2.75, "u_quantity": 3}
    {"u_name": "tape", "u_price": 0.99, "u_quantity": 10}
    {"u_name": "notebooks", "u_price": 1.99, "u_quantity": 5}
    
  11. Confirm the messages were delivered to the ServiceNow table by using the ServiceNow user interface.

    ../_images/servicenow_result.png

Write JSON message values into ServiceNow

The example settings file is shown below.

  1. Create a servicenow-sink-json.json file with the following contents.

    Note

    All user-defined tables in ServiceNow start with u_.

    // substitute <> placeholders with your configuration values
    {
        "name": "ServiceNowSinkJSONConnector",
        "config": {
            "connector.class": "io.confluent.connect.servicenow.ServiceNowSinkConnector",
            "topics": "test_table_json",
            "servicenow.url": "https://<endpoint>.service-now.com/",
            "tasks.max": "1",
            "servicenow.table": "u_test_table",
            "servicenow.user": "<username>",
            "servicenow.password": "<password>",
            "key.converter":"org.apache.kafka.connect.storage.StringConverter",
            "value.converter":"org.apache.kafka.connect.json.JsonConverter",
            "value.converter.schemas.enable": "true",
            "confluent.topic.bootstrap.servers": "localhost:9092",
            "confluent.license": "<license>", // leave it empty for evaluation license
            "confluent.topic.replication.factor": "1",
            "reporter.bootstrap.servers": "localhost:9092",
            "reporter.error.topic.name": "test-error",
            "reporter.error.topic.replication.factor": 1,
            "reporter.error.topic.key.format": "string",
            "reporter.error.topic.value.format": "string",
            "reporter.result.topic.name": "test-result",
            "reporter.result.topic.key.format": "string",
            "reporter.result.topic.value.format": "string",
            "reporter.result.topic.replication.factor": 1
        }
    }
    

    Note

    For details about using this connector with Kafka Connect Reporter, see Connect Reporter.

  2. Load the ServiceNow Sink Connector by posting its configuration to the Connect REST server.

    Caution

    You must include a double dash (--) between the topic name and your flag.

    confluent local services connect connector load ServiceNowSinkJSONConnector --config servicenow-sink-json.json
    
  3. Confirm that the connector is in a RUNNING state.

    confluent local services connect connector status ServiceNowSinkJSONConnector
    
  4. To produce some records into the test_table_json topic, first start a Kafka producer.

    Note

    All user-defined columns in ServiceNow start with u_.

    kafka-console-producer \
    --broker-list localhost:9092 \
    --topic test_table_json
    
  5. The console producer is now waiting for input, so you can go ahead and insert some records into the topic.

    {"schema": {"type": "struct", "fields": [{"type": "string", "optional": false, "field": "u_name"},{"type": "float", "optional": false, "field": "u_price"}, {"type": "int64","optional":false,"field": "u_quantity"}],"optional": false,"name": "products"}, "payload": {"u_name": "laptop", "u_price": 999.50, "u_quantity": 3}}
    {"schema": {"type": "struct", "fields": [{"type": "string", "optional": false, "field": "u_name"},{"type": "float", "optional": false, "field": "u_price"}, {"type": "int64","optional":false,"field": "u_quantity"}],"optional": false,"name": "products"}, "payload": {"u_name": "pencil", "u_price": 0.99, "u_quantity": 10}}
    {"schema": {"type": "struct", "fields": [{"type": "string", "optional": false, "field": "u_name"},{"type": "float", "optional": false, "field": "u_price"}, {"type": "int64","optional":false,"field": "u_quantity"}],"optional": false,"name": "products"}, "payload": {"u_name": "pen", "u_price": 1.99, "u_quantity": 5}}