.. _splunk-sink-connector:

Splunk Sink Connector for |cp|
==============================

The |kconnect-long| `Splunk Sink connector `__ is used to move messages from
|ak-tm| to Splunk.

The connector has the following features:

* Data ingestion from Kafka topics into Splunk via the Splunk HTTP Event
  Collector (HEC)

  The `Splunk HTTP Event Collector (HEC) `_ receives data from |ak| topics
  over an HTTP or HTTPS connection, using an Event Collector token configured
  in Splunk.

* In-flight data transformation and enrichment

  This feature is used to enrich raw data with extra metadata fields. The
  configured enrichment metadata is indexed along with the raw event data by
  the Splunk software. See `Indexed Field Extractions `_ for more
  information.

  .. note:: Data enrichment for the ``/event`` HEC endpoint is only available
     in Splunk Enterprise 6.5 and above.

* Acknowledgement mode

  This feature implements guaranteed delivery by polling Splunk for
  acknowledgement before committing the |ak| offset.
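Acknowledgement mode and metadata enrichment are both enabled through
connector configuration properties. The following is a minimal, illustrative
sketch; the property names shown (``splunk.hec.ack.enabled``,
``splunk.hec.ack.poll.interval``, and ``splunk.hec.json.event.enrichment``)
are assumptions based on the connector's configuration reference and should be
verified against :ref:`splunk_sink_connector_config` for the connector version
in use.

.. codewithvars:: properties

   # Acknowledgement mode (assumed property names): poll HEC for event
   # acknowledgement before the Kafka offset is committed.
   splunk.hec.ack.enabled=true
   splunk.hec.ack.poll.interval=10

   # Enrichment for the /event HEC endpoint (assumed property name): index
   # these extra key-value metadata fields along with each raw event.
   splunk.hec.json.event.enrichment=org=fin,bu=us-west

Enabling acknowledgement trades some ingestion throughput for delivery
guarantees, as noted in the prerequisites below.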
.. _splunk_sink_connector_prereqs:

Prerequisites
-------------

The following are required to run the Splunk Sink Connector:

* |ak| Broker: |cp| 3.3.0 or above, or |ak| 0.11.0 or above
* |kconnect|: |cp| 4.0 or above, or |ak| 1.0 or above
* Java 1.8
* Splunk 6.5 or above, configured with valid HTTP Event Collector (HEC)
  tokens
* Splunk Indexers and Heavy Forwarders that send information to this
  connector should have the same HEC token settings as this connector.
* Task configuration parameters vary depending on the acknowledgement
  setting. See the :ref:`Configuration Properties ` for details.

.. note:: HEC acknowledgement prevents potential data loss but may slow down
   event ingestion.

.. _splunk_sink_connector_install:

Install the Splunk Sink Connector
---------------------------------

.. include:: ../../includes/connector-install.rst

.. include:: ../../includes/connector-install-hub.rst

.. codewithvars:: bash

   confluent-hub install splunk/kafka-connect-splunk:latest

.. include:: ../../includes/connector-install-version.rst

.. codewithvars:: bash

   confluent-hub install splunk/kafka-connect-splunk:1.1.1

------------------------------
Install the connector manually
------------------------------

`Download and extract the ZIP file `__ for your connector and then follow the
manual connector installation :ref:`instructions `.

.. _splunk_sink_connector_license:

License
-------

The Splunk Sink connector is an open source connector and does not require a
Confluent Enterprise License.

Configuration Properties
------------------------

For a complete list of configuration properties for this connector, see
:ref:`splunk_sink_connector_config`.

.. include:: ../../includes/connect-to-cloud-note.rst

.. _splunk_connector_quickstart:

Quick Start
-----------

.. important:: The default port used by a Splunk HEC is ``8088``. However,
   the |ksqldb| component of |cp| also uses that port. Because both Splunk
   and |cp| will be running in this quick start, the HEC is configured to use
   port ``8889``. If that port is in use by another process, change ``8889``
   to a different, open port.

#. Start a Splunk Enterprise instance by running the Splunk Docker container.

   .. codewithvars:: bash

      docker run -d -p 8000:8000 -p 8889:8889 -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_PASSWORD=password" --name splunk splunk/splunk:7.3.0

#. Open `http://localhost:8000 `_ to access Splunk Web. Log in with username
   ``admin`` and password ``password``.

#. Configure a Splunk HEC using Splunk Web.

   - Click **Settings** > **Data Inputs**.
   - Click **HTTP Event Collector**.
   - Click **Global Settings**.
   - In the **All Tokens** toggle button, select **Enabled**.
   - Ensure **SSL disabled** is checked.
   - Change the **HTTP Port Number** to **8889**.
   - Click **Save**.
   - Click **New Token**.
   - In the **Name** field, enter a name for the token: ``kafka``.
   - Click **Next**.
   - Click **Review**.
   - Click **Submit**.

   .. important:: Note the token value shown on the "Token has been created
      successfully" page. This token value is needed for the connector
      configuration later.

#. Install the connector through the `Confluent Hub Client `_.

   .. codewithvars:: bash

      # run from your Confluent Platform installation directory
      confluent-hub install splunk/kafka-connect-splunk:latest

#. Start |cp|.

   .. include:: ../../../includes/cli-new.rst

   .. codewithvars:: bash

      |confluent_start|

#. `Produce `_ test data to the ``splunk-qs`` topic in |ak|.

   .. codewithvars:: bash

      echo event 1 | |confluent_produce| splunk-qs
      echo event 2 | |confluent_produce| splunk-qs

#. Create a ``splunk-sink.properties`` file with the properties below.
   Substitute ``<HEC_TOKEN>`` with the Splunk HEC token created earlier.

   .. codewithvars:: properties

      name=SplunkSink
      topics=splunk-qs
      tasks.max=1
      connector.class=com.splunk.kafka.connect.SplunkSinkConnector
      splunk.indexes=main
      splunk.hec.uri=http://localhost:8889
      splunk.hec.token=<HEC_TOKEN>
      splunk.sourcetypes=my_sourcetype
      confluent.topic.bootstrap.servers=localhost:9092
      confluent.topic.replication.factor=1
      value.converter=org.apache.kafka.connect.storage.StringConverter

#. Start the connector. An equivalent way to submit this configuration
   through the |kconnect| REST API is sketched at the end of this page.

   .. include:: ../../../includes/confluent-local-consume-limit.rst

   .. codewithvars:: bash

      |confluent_load| splunk|dash| -d splunk-sink.properties

#. In the Splunk UI, verify that data is flowing into your Splunk platform
   instance by searching with the search parameter ``source="http:kafka"``.

#. Shut down |cp|.

   .. codewithvars:: bash

      |confluent_destroy|

#. Shut down the Docker container.

   .. codewithvars:: bash

      docker stop splunk
      docker rm splunk

Additional Documentation
------------------------

.. toctree::
   :maxdepth: 1

   connector_config
   changelog
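The quick start loads the connector with the Confluent CLI. On a distributed
|kconnect| cluster, the same configuration can instead be submitted to a
running worker through the |kconnect| REST API. The following is a minimal
sketch, assuming the worker's REST interface is listening on its default port
``8083``; as in the quick start, substitute ``<HEC_TOKEN>`` with your Splunk
HEC token.

.. codewithvars:: bash

   # Register the SplunkSink connector with a Connect worker on localhost:8083.
   curl -s -X POST -H "Content-Type: application/json" \
        --data '{
          "name": "SplunkSink",
          "config": {
            "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
            "tasks.max": "1",
            "topics": "splunk-qs",
            "splunk.indexes": "main",
            "splunk.hec.uri": "http://localhost:8889",
            "splunk.hec.token": "<HEC_TOKEN>",
            "splunk.sourcetypes": "my_sourcetype",
            "confluent.topic.bootstrap.servers": "localhost:9092",
            "confluent.topic.replication.factor": "1",
            "value.converter": "org.apache.kafka.connect.storage.StringConverter"
          }
        }' \
        http://localhost:8083/connectors

   # Check the state of the connector and its task.
   curl -s http://localhost:8083/connectors/SplunkSink/status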