Kafka Connect IBM MQ Sink Connector

The IBM MQ Sink Connector is used to move messages from Kafka to an IBM MQ cluster.

Note

If you are required to use JNDI to connect to IBM MQ, there is a general JMS Sink Connector available that uses a JNDI-based mechanism to connect to the JMS broker.

Prerequisites

The following are required to run the Kafka Connect IBM MQ Sink Connector:

  • Kafka Broker: Confluent Platform 3.3.0 or above, or Kafka 0.11.0 or above
  • Kafka Connect: Confluent Platform 4.1.0 or above, or Kafka 1.1.0 or above (requires header support in Connect)
  • IBM MQ 8.0.0 or above, or IBM MQ on Cloud service
  • IBM MQ Client Jar (See Client Jars section)
  • Java 1.8

License

You can use this connector for a 30-day trial period without a license key.

After 30 days, this connector is available under a Confluent enterprise license. Confluent issues enterprise license keys to subscribers, along with providing enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, please contact Confluent Support at support@confluent.io for more information.

See Confluent Platform license for license properties and License topic configuration for information about the license topic.

Install IBM MQ Sink Connector

You can install this connector either by using the Confluent Hub client (recommended) or by manually downloading the ZIP file.

Install the connector using Confluent Hub

Prerequisite
Confluent Hub Client must be installed. This is installed by default with Confluent Enterprise.

Navigate to your Confluent Platform installation directory and run the following command to install the latest connector version. The connector must be installed on every machine where Connect will run.

confluent-hub install confluentinc/kafka-connect-ibmmq-sink:latest

You can install a specific version by replacing latest with a version number. For example:

confluent-hub install confluentinc/kafka-connect-ibmmq-sink:1.0.0-preview

Important

This connector does not include the IBM MQ client JAR. See the IBM MQ Client Library section for more details.

Install Connector Manually

Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.

Important

This connector does not include the IBM MQ client JAR. See the IBM MQ Client Library section for more details.

IBM MQ Client Library

The Kafka Connect IBM MQ connector does not come with the IBM MQ client library.

If you are running a multi-node Connect cluster, the IBM MQ connector and IBM MQ client JAR must be installed on every Connect worker in the cluster. See below for details.

Installing IBM MQ Client Library

This connector relies on the com.ibm.mq.allclient client JAR distributed by IBM. The connector will not run unless this JAR is installed on every Connect worker node.

The installation steps are:

  1. Follow IBM’s guide on Obtaining the IBM MQ classes for JMS separately to download the IBM MQ client JAR.
  2. The installation should have created a wmq/JavaSE directory. From this directory, copy only the com.ibm.mq.allclient.jar file into the share/java/kafka-connect-ibmmq-sink directory of your Confluent Platform installation on each worker node.
  3. Restart all of the Connect worker nodes.
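
For example, on a Linux worker the copy in step 2 might look like the following. The extraction path and Confluent Platform location are illustrative; adjust them to your environment.

# copy the IBM MQ client JAR next to the connector JARs (illustrative paths)
cp /opt/mq-client/wmq/JavaSE/com.ibm.mq.allclient.jar \
  /opt/confluent/share/java/kafka-connect-ibmmq-sink/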

Note

The share/java/kafka-connect-ibmmq-sink directory mentioned above is for Confluent Platform. If you are using a different installation, find the location of the Confluent IBM MQ sink connector JAR files and place the IBM MQ client JAR file into the same directory.

JMS Message Formats

The format of outgoing JMS Message values is configured with the jms.message.format property, using one of the following options:

string (default)

When using the string Message Format, record values are run through Values.convertToString(...) from the Connect Data package and produced as a JMS TextMessage.

Primitive values are converted to their string equivalents, and structured objects are rendered in a string form similar to their JSON representation, except that simple string values (those not inside objects or arrays) are left unquoted.
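
For example (record values are illustrative), a struct renders much like JSON, while a top-level string is written without quotes:

# Kafka record value            ->  JMS TextMessage body
{"id": 42, "name": "alice"}     ->  {"id":42,"name":"alice"}
"hello"                         ->  hello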

Tip

Kafka Connect Transformations can be used with the configured jms.message.format to transform the record value to the desired string representation before the connector processes each record.
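
For instance, the stock ExtractField transformation could reduce a structured value to a single field before the connector converts it to a string. The transform alias and field name below are illustrative:

"transforms": "extractText",
"transforms.extractText.type": "org.apache.kafka.connect.transforms.ExtractField$Value",
"transforms.extractText.field": "text"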

avro

Record values are serialized without the Avro schema information and produced as a JMS BytesMessage. JMS consumers must have the schema to deserialize the data.

Important

The connector attempts to infer the Avro schema for records that have no schema. If the connector cannot infer the schema, the task is killed. If you are processing data without a schema, consider using one of the other jms.message.format configurations.
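
A minimal configuration fragment for this format, assuming Avro data is read from Kafka with the Avro converter (the Schema Registry URL is illustrative):

"jms.message.format": "avro",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "http://localhost:8081"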

json

Record values are converted to a UTF-8 encoded JSON representation and produced as a JMS TextMessage.

bytes

Record values are passed through as raw bytes without any conversion.

Important

Record values must already be in byte form when the connector processes them. Set the value.converter property to org.apache.kafka.connect.converters.ByteArrayConverter to ensure that record values arrive as bytes.
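
A minimal configuration fragment for this format:

"jms.message.format": "bytes",
"value.converter": "org.apache.kafka.connect.converters.ByteArrayConverter"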

Forwarding Kafka Properties to JMS

The connector can be configured to forward various values from the Kafka record to the JMS Message.

  • Enable jms.forward.kafka.key to convert the record’s key to a String and forward it as the JMSCorrelationID.
  • Enable jms.forward.kafka.metadata to forward the record’s topic, partition, and offset as JMS Message properties.
    • Kafka topic is applied to the message as a String property named KAFKA_TOPIC.
    • Partition is applied to the message as an Int property named KAFKA_PARTITION.
    • Offset is applied to the message as a Long property named KAFKA_OFFSET.
  • Enable jms.forward.kafka.headers to add each header from the SinkRecord to the JMS Message as a String property.
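
For example, to forward the key, metadata, and headers together, add the following to the connector configuration:

"jms.forward.kafka.key": "true",
"jms.forward.kafka.metadata": "true",
"jms.forward.kafka.headers": "true"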

Note

The connector converts the record key and headers to a sensible string representation that is similar to the JSON representation, with the exception of simple string values (not in objects or arrays) which are unquoted. No other conversion is done to the key and headers before forwarding them on the JMS Message. If another format is needed, out-of-the-box or custom Kafka Connect Transformations can be used with the connector to transform the record keys and/or headers to the desired string representation before the JMS sink connector processes each record.

Quick Start

This quick start uses the IBM MQ Sink Connector to consume records from Kafka and send them to an IBM MQ broker running in a Docker container.

  1. Start the IBM MQ broker.

    docker run -d \
      -p 1414:1414 -p 9443:9443 \
      -e LICENSE=accept \
      -e MQ_QMGR_NAME=mq \
      ibmcom/mq:9.1.2.0
    
  2. Install the connector through the Confluent Hub Client.

    # run from your Confluent Platform installation directory
    confluent-hub install confluentinc/kafka-connect-ibmmq-sink:latest
    
  3. Start Confluent Platform.

    confluent start
    
  4. Produce test data to the sink-messages topic in Kafka.

    seq 10 | confluent produce sink-messages
    
  5. Create an ibmmq-sink.json file with the following contents:

    {
      "name": "IbmMqSinkConnector",
      "config": {
        "connector.class": "io.confluent.connect.jms.IbmMqSinkConnector",
        "tasks.max": "1",
        "topics": "sink-messages",
        "mq.username": "app",
        "mq.channel": "DEV.APP.SVRCONN",
        "mq.hostname": "localhost",
        "mq.port": "1414",
        "mq.password": "",
        "mq.queue.manager": "mq",
        "mq.transport.type": "client",
        "jms.destination.type": "queue",
        "jms.destination.name": "DEV.QUEUE.1",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "confluent.topic.replication.factor": "1",
        "confluent.topic.bootstrap.servers": "localhost:9091"
      }
    }
    
  6. Load the IBM MQ Sink Connector.

    confluent load mq -d ibmmq-sink.json
    

    Important

    Don’t use the Confluent CLI in production environments.

  7. Confirm that the connector is in a RUNNING state.

    confluent status mq
    
  8. Navigate to the IBM MQ Console to confirm the messages were delivered to the DEV.QUEUE.1 queue.

    Tip

    The default credentials for the IBM MQ Console are admin/passw0rd.
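
    Alternatively, you can check the queue depth from inside the container. This is a sketch: replace <container-id> with the ID of the container started in step 1; the queue manager name mq matches the connector configuration above.

    docker exec <container-id> bash -c \
      "echo 'DISPLAY QLOCAL(DEV.QUEUE.1) CURDEPTH' | runmqsc mq"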