.. _splunk_source_connector:

Splunk Source Connector for |cp|
================================

The |kconnect| Splunk Source connector provides a way to integrate Splunk with
|ak-tm|. The connector receives data from applications that would normally send
data to a `Splunk HTTP Event Collector (HEC) `__.

The connector supports the `X-Forwarded-For `__ header, which allows it to be
used behind a load balancer.

.. note:: The connector does not support receiving data from a Splunk Universal
   Forwarder or Splunk Heavy Forwarder.

.. important:: This connector listens on a network port. Running more than one
   connector task, or running in distributed mode, can cause undesirable
   effects if another task already has the port open. It is recommended that
   you run this connector in **Standalone Mode**.

Prerequisites
-------------

The following are required to run the |kconnect-long| Splunk Source connector:

* |ak| Broker: |cp| 3.3.0 or above
* |kconnect|: |cp| 4.1.0 or above
* Java 1.8

.. _splunk_source_connector_install:

Install the Splunk Source Connector
-----------------------------------

.. include:: ../../includes/connector-install.rst

.. include:: ../../includes/connector-install-hub.rst

.. codewithvars:: bash

   confluent-hub install confluentinc/kafka-connect-splunk-source:latest

.. include:: ../../includes/connector-install-version.rst

.. codewithvars:: bash

   confluent-hub install confluentinc/kafka-connect-splunk-source:1.0.0-preview

------------------------------
Install the connector manually
------------------------------

`Download and extract the ZIP file `__ for your connector and then follow the
manual connector installation :ref:`instructions `.

License
-------

.. include:: ../../includes/enterprise-license.rst

See :ref:`splunk_source-connector-license-config` for license properties and
:ref:`splunk_source_license-topic-configuration` for information about the
license topic.
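The quick start below configures the connector's HTTPS listener with the
``splunk.ssl.key.store.path`` and ``splunk.ssl.key.store.password`` properties.
If you do not already have a keystore, one sketch is to generate a self-signed
keystore for local testing with the JDK's ``keytool``; the alias, file path,
distinguished name, and password shown here are placeholder assumptions, not
values required by the connector.

.. code-block:: bash

   # Generate a self-signed keystore for local testing only.
   # Alias, path, CN, and passwords are illustrative placeholders;
   # substitute your own values and a real certificate in production.
   keytool -genkeypair \
     -alias splunk-source \
     -keyalg RSA -keysize 2048 \
     -validity 365 \
     -dname "CN=localhost" \
     -keystore /tmp/splunk-source-keystore.jks \
     -storepass changeit -keypass changeit

Point ``splunk.ssl.key.store.path`` at the generated file and set
``splunk.ssl.key.store.password`` to the store password you chose.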
Configuration Properties
------------------------

For a complete list of configuration properties for this connector, see
:ref:`splunk_source_connector_config`.

.. include:: ../../includes/connect-to-cloud-note.rst

.. _splunk_source_connector_quickstart:

Quick Start
-----------

This quick start uses the Splunk Source connector to receive application data
and ingest it into |ak|.

#. Install the connector using the `Confluent Hub Client `_.

   ::

      # run from your CP installation directory
      confluent-hub install confluentinc/kafka-connect-splunk-source:latest

#. Start |cp|.

   .. include:: ../../../includes/cli-new.rst

   .. codewithvars:: bash

      |confluent_start|

#. Create a ``splunk-source.properties`` file with the following contents:

   ::

      name=splunk-source
      kafka.topic=splunk-source
      tasks.max=1
      connector.class=io.confluent.connect.SplunkHttpSourceConnector
      splunk.collector.index.default=default-index
      splunk.port=8889
      splunk.ssl.key.store.path=/path/to/your/keystore.jks
      splunk.ssl.key.store.password=
      confluent.topic.bootstrap.servers=localhost:9092
      confluent.topic.replication.factor=1

#. Load the Splunk Source connector.

   .. codewithvars:: bash

      |confluent_load| splunk-source|dash| -d splunk-source.properties

   .. important:: Don't use the :ref:`cli` in production environments.

#. Confirm that the connector is in a ``RUNNING`` state.

   .. codewithvars:: bash

      |confluent_status| splunk-source

#. Simulate an application sending data to the connector.

   .. codewithvars:: bash

      curl -k -X POST https://localhost:8889/services/collector/event -d '{"event":"from curl"}'

#. Verify the data was ingested into the |ak| topic.

   ::

      kafka-avro-console-consumer --bootstrap-server localhost:9092 --topic splunk-source --from-beginning

#. Shut down |cp|.

   .. codewithvars:: bash

      |confluent_destroy|

Additional Documentation
------------------------

.. toctree::
   :maxdepth: 1

   connector_config
   changelog
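As noted above, the connector supports the ``X-Forwarded-For`` header for
deployments behind a load balancer. A minimal sketch of exercising this against
the quick start listener follows; the port matches the example configuration,
while the client IP address is a made-up illustration, and the command only
succeeds while the connector from the quick start is running.

.. code-block:: bash

   # Simulate a load balancer forwarding an event on behalf of a client.
   # With X-Forwarded-For set, the connector can attribute the event to
   # 10.0.0.42 (a placeholder address) rather than to the proxy itself.
   curl -k -X POST https://localhost:8889/services/collector/event \
     -H "X-Forwarded-For: 10.0.0.42" \
     -d '{"event":"from behind a proxy"}'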