Splunk S2S Source Connector for Confluent Platform¶
The Splunk S2S Source connector provides a way to integrate Splunk with Apache Kafka®. The connector receives data from a Splunk universal forwarder (UF) or a Splunk heavy forwarder (HF).
Important
The Splunk S2S Source connector listens on a network port. Running more than one connector task, or running the connector in distributed mode, can produce undesirable results if another task already has the port open. Confluent recommends you run the Splunk S2S Source connector in Standalone Mode.
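For example, a minimal sketch of launching a standalone Connect worker with this connector (the worker and connector properties file names here are placeholders for your own files):
connect-standalone worker.properties splunk-s2s-source.properties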
Features¶
The Splunk S2S Source connector includes the following features:
- Supports one task
- Metadata Support
- Data Ingestion
- Multiline Event Parsing
- Compression Support
- SSL Communication Support
Supports one task¶
The Splunk S2S Source connector supports running only one task.
Metadata Support¶
The Splunk S2S Source connector supports parsing of metadata fields (host, source, sourcetype, and index) along with a raw event. Here is an example of a message in a Kafka topic:
{
  "event": "sample log event",
  "time": 1623175216,
  "host": "sample host",
  "source": "/opt/splunkforwarder/splunk-s2s-test.log",
  "index": "default",
  "sourcetype": "splunk-s2s-test-too_small"
}
The connector also supports parsing of custom metadata fields, which can be configured on the forwarder by using the _meta tag, as shown in the following example:
[monitor://$SPLUNK_HOME/splunk-s2s-test.log]
sourcetype = test
disabled = false
_meta = testField::testValue
The following example shows a message with the previous input configuration:
{
  "event": "sample log event",
  "time": 1623175216,
  "host": "sample host",
  "source": "/opt/splunkforwarder/splunk-s2s-test.log",
  "index": "default",
  "sourcetype": "test",
  "testField": "testValue"
}
Data Ingestion¶
The Splunk S2S Source connector supports data ingestion from the Splunk forwarder for multiple input types. For help with configuring these input types on UFs, see Configure Inputs on Splunk Forwarder.
Multiline Event Parsing¶
The Splunk S2S Source connector also supports multiline event parsing by providing the following event break options for each sourcetype:
- EVERY_LINE: Creates a new event on every new line.
- REGEX: Creates events as defined by a regular expression.
See SourceType Parsing Config Example for help with defining event break options for a sourcetype.
Compression Support¶
The Splunk S2S Source connector supports compression for communication between the connector and Splunk forwarders. To enable compression, set the following configuration property:
"splunk.s2s.compression.enable": "true"
Note
- The connector supports only native Splunk compression, that is, the compressed=true setting. It does not support the useClientSSLCompression setting provided by Splunk.
- Be sure to set compressed to true on forwarders before setting "splunk.s2s.compression.enable": "true".
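For reference, here is a minimal sketch of the forwarder-side outputs.conf with compression enabled; the output group name and server address are placeholders for your environment:
[tcpout:connector_group]
server = connector-host:9997
compressed = true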
SSL Communication Support¶
The Splunk S2S Source connector supports SSL communication between the connector and Splunk forwarders. To enable SSL communication, set the following configuration properties:
"splunk.s2s.ssl.enable": "true"
"splunk.s2s.ssl.key.path":"Path to SSL Server Private Key File"
"splunk.s2s.ssl.key.password":"SSL Server Private Key Password"
"splunk.s2s.ssl.cert.chain.path":"Path to SSL Server Certificate Chain"
The Splunk S2S connector supports client authentication in SSL communication between the connector and Splunk forwarders. To enable client authentication, set the following configuration properties in addition to the properties above:
"splunk.s2s.ssl.client.auth.enable": "true"
"splunk.s2s.ssl.root.ca.cert.chain.path":"Path to Root CA Certificate Chain"
Note
- Be sure to set useSSL to true and sslRootCAPath to the location of the certificate authority certificate on forwarders before setting "splunk.s2s.ssl.enable": "true".
- Be sure to set clientCert and sslPassword on forwarders before setting "splunk.s2s.ssl.client.auth.enable": "true".
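For reference, here is a minimal sketch of the corresponding forwarder-side outputs.conf for SSL with client authentication; the output group name, server address, file paths, and password are placeholders:
[tcpout:connector_group]
server = connector-host:9997
useSSL = true
sslRootCAPath = /opt/splunkforwarder/etc/auth/ca.pem
clientCert = /opt/splunkforwarder/etc/auth/client.pem
sslPassword = <client certificate password>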
Limitations¶
The Splunk S2S Source connector does not support the useClientSSLCompression setting that Splunk provides.
License¶
Confluent’s Splunk S2S Source connector is a Confluent Premium connector subject to the Confluent enterprise license and therefore requires an additional subscription.
You can use this connector for a 30-day trial period without a license key. After 30 days, you must purchase a subscription to Confluent's Splunk S2S Source connector, which includes Confluent enterprise license keys along with enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, contact Confluent Support for more information.
For license properties, see Confluent Platform license. For information about the license topic, see License topic configuration.
Configuration Properties¶
For a complete list of configuration properties for this connector, see Splunk S2S Source Connector Configuration Properties.
Note
For an example of how to get Kafka Connect connected to Confluent Cloud, see Distributed Cluster.
Splunk Forwarder Configuration¶
For a complete list of configuration properties for the Splunk forwarder, see Splunk Forwarder Configuration Properties.
Install the Splunk S2S Source Connector¶
You can install this connector by using the Confluent Hub client installation instructions or by manually downloading the ZIP file.
Prerequisites¶
Note
You must install the connector on every machine where Connect will run.
Kafka Broker: Confluent Platform 6.0.0 or later.
Connect: Confluent Platform 6.0.0 or later.
Java 1.8.
Splunk UF version 8.x.
An install of the Confluent Hub Client. This is installed by default with Confluent Enterprise.
An install of the latest (latest) connector version.
To install the latest connector version, navigate to your Confluent Platform installation directory and run the following command:
confluent-hub install confluentinc/kafka-connect-splunk-s2s:latest
You can install a specific version by replacing latest with a version number, as shown in the following example:
confluent-hub install confluentinc/kafka-connect-splunk-s2s:1.3.0
Install the connector manually¶
Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.
SourceType Parsing Config Example¶
The splunk.s2s.sourcetypes configuration contains a list of sourcetypes for which you can define event break options, including a regex, to parse events. The following example shows how to use this configuration:
splunk.s2s.sourcetypes = typeA,typeB
splunk.s2s.sourcetype.typeA.eventbreak = EVERY_LINE
splunk.s2s.sourcetype.typeB.eventbreak = REGEX
splunk.s2s.sourcetype.typeB.regex = ([\r\n]+)(?:\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\.\d{3})
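For illustration, with the typeB configuration above, a log such as the following (sample content, assumed for this example) would be broken into two events, one starting at each timestamp:
2021-06-08 12:00:00.123 first event
  continuation line of first event
2021-06-08 12:00:01.456 second event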
Note
- By default, the event break option for each sourcetype is configured to EVERY_LINE.
- To add custom properties, such as splunk.s2s.sourcetype.typeA.eventbreak (which may not be visible initially in the user interface), click Add a property while defining the configuration.
Quick Start¶
This Quick Start uses the Splunk S2S Source connector to receive data from the Splunk UF and ingest it into Kafka.
Install the connector using the Confluent Hub Client.
# run from your CP installation directory
confluent-hub install confluentinc/kafka-connect-splunk-s2s:latest
Start the Confluent Platform.
Tip
The command syntax for the Confluent CLI development commands changed in 5.3.0. These commands have been moved to confluent local. For example, the syntax for confluent start is now confluent local services start. For more information, see confluent local.
confluent local services start
Create a splunk-s2s-source.properties file with the following contents:
name=splunk-s2s-source
tasks.max=1
connector.class=io.confluent.connect.splunk.s2s.SplunkS2SSourceConnector
splunk.s2s.port=9997
kafka.topic=splunk-s2s-events
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
confluent.topic.bootstrap.servers=localhost:9092
confluent.topic.replication.factor=1
Load the Splunk S2S Source connector.
confluent local services connect connector load splunk-s2s-source --config splunk-s2s-source.properties
Important
Don’t use the Confluent CLI in production environments.
Confirm the connector is in a RUNNING state.
confluent local services connect connector status splunk-s2s-source
Start a Splunk UF by running the Splunk UF Docker container.
docker run -d -p 9998:9997 -e "SPLUNK_START_ARGS=--accept-license" -e "SPLUNK_PASSWORD=password" --name splunk-uf splunk/universalforwarder:8.1.2
Create a splunk-s2s-test.log file with the following sample log events:
log event 1
log event 2
log event 3
Copy the splunk-s2s-test.log file to the Splunk UF Docker container using the following command:
docker cp splunk-s2s-test.log splunk-uf:/opt/splunkforwarder/splunk-s2s-test.log
Configure the UF to monitor the splunk-s2s-test.log file:
docker exec -it splunk-uf sudo ./bin/splunk add monitor -source /opt/splunkforwarder/splunk-s2s-test.log -auth admin:password
Configure the UF to connect to the Splunk S2S Source connector:
For Mac/Windows systems:
docker exec -it splunk-uf sudo ./bin/splunk add forward-server host.docker.internal:9997
For Linux systems:
docker exec -it splunk-uf sudo ./bin/splunk add forward-server 172.17.0.1:9997
Verify the data was ingested into the Kafka topic.
To look for events from the monitored file (splunk-s2s-test.log) in the Kafka topic, run the following command:
kafka-console-consumer --bootstrap-server localhost:9092 --topic splunk-s2s-events --from-beginning | grep 'log event'
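Based on the message format shown earlier, you should see records similar to the following; the time, host, and sourcetype values shown here are illustrative and will differ in your environment:
{"event":"log event 1","time":1623175216,"host":"sample host","source":"/opt/splunkforwarder/splunk-s2s-test.log","index":"default","sourcetype":"splunk-s2s-test-too_small"}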
Note
When you run the previous command without grep, you will see many Splunk internal events in the Kafka topic, because the Splunk UF sends internal Splunk log events to the connector by default.
Shut down Confluent Platform.
confluent local destroy
Shut down the Docker container.
docker stop splunk-uf
docker rm splunk-uf
Suggested Reading¶
Blog post: Reduce Your Data Infrastructure TCO with Confluent’s New Splunk S2S Source Premium Connector