AppDynamics Metrics Sink Connector for Confluent Platform¶
The Kafka Connect AppDynamics metrics sink connector is used to export metrics from an Apache Kafka® topic to AppDynamics via the AppDynamics Machine Agent. The connector accepts Struct and schemaless JSON as a Kafka record’s value. The name and values fields are required. The values field refers to a metric’s values and is expected to be a Struct object when the Kafka record’s value is of type Struct, and a nested JSON object when the Kafka record’s value is schemaless JSON.
The input Struct or schemaless JSON object used as the record’s value should resemble the following:
{
  "name": string,
  "type": string,
  "timestamp": long,
  "dimensions": {
    "aggregatorType": string,
    ...
  },
  "values": {
    "doubleValue": double
  }
}
Note
The qualifier value AVERAGE is used by default if the aggregatorType property is not present in the dimensions struct. The possible values for aggregatorType are AVERAGE, SUM, and OBSERVATION. Refer to the AppDynamics documentation for details.
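For illustration, a schemaless JSON record value matching this structure might look like the following (the metric name and values are hypothetical):
{
  "name": "Custom Metrics|Tier-1|CPU-Usage",
  "type": "gauge",
  "timestamp": 1596703200000,
  "dimensions": {
    "aggregatorType": "SUM"
  },
  "values": {
    "doubleValue": 42.0
  }
}
If the dimensions object omitted aggregatorType, the metric would be reported with the default AVERAGE qualifier.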
This connector can start with one task that exports data to AppDynamics. The connector can scale by adding more tasks. Note that as more tasks are added, connector performance may be limited by AppDynamics transaction processing.
Prerequisites¶
The following are required to run the Kafka Connect AppDynamics connector:
- Kafka Broker: Confluent Platform 3.3.0 or above
- Connect: Confluent Platform 4.1.0 or above
- Java 1.8
- AppDynamics APM Pro, APM Advanced, or APM Peak account.
- An AppDynamics machine agent (v4.5 or higher) configured to send data to the AppDynamics controller. For details, see Configure the Standalone Machine Agent.
- The AppDynamics machine agent HTTP listener must be enabled, because the connector sends metrics to it. For details, see Standalone Machine Agent HTTP Listener.
Features¶
The AppDynamics Metrics Sink connector offers the following features:
- Supported types for Kafka record value: The connector accepts Kafka record values as Struct type, schemaless JSON type, and JSON string type.
- Exactly Once Delivery: The connector ensures exactly once delivery of metrics to the AppDynamics machine agent. However, exactly once delivery is not ensured if the machine agent fails while sending metrics to the AppDynamics Controller.
Supported Metrics and Schemas¶
The connector supports metrics of type Gauge. Kafka topics that contain these metrics must have records that adhere to the following schema.
Gauge schema¶
{
  "doubleValue": double
}
Record Mapping¶
Each Kafka record is converted to an AppDynamics metric object. The example below shows a record in its original form:
{
  "name": "sample_metric",
  "type": "gauge",
  "timestamp": 23480239402348234,
  "dimensions": {
    "aggregatorType": "AVERAGE"
  },
  "values": {
    "doubleValue": 28945
  }
}
The example below shows the converted AppDynamics metric object:
{
  "metricName": "sample_metric",
  "aggregatorType": "AVERAGE",
  "value": 28945
}
Install the AppDynamics Metrics Connector¶
You can install this connector by using the Confluent Hub client (recommended) or you can manually download the ZIP file.
Install the connector using Confluent Hub¶
- Prerequisite: The Confluent Hub Client must be installed. This is installed by default with Confluent Enterprise.
Navigate to your Confluent Platform installation directory and run the following command to install the latest (latest) connector version. The connector must be installed on every machine where Connect will run.
confluent-hub install confluentinc/kafka-connect-appdynamics-metrics:latest
You can install a specific version by replacing latest with a version number. For example:
confluent-hub install confluentinc/kafka-connect-appdynamics-metrics:1.1.2
Install the connector manually¶
Download and extract the ZIP file for your connector and then follow the manual connector installation instructions.
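For example, a minimal manual installation might look like the following (the ZIP file name and the plugin directory are hypothetical; adjust them to match your download and the plugin.path setting of your Connect worker):
# Extract the connector archive
unzip confluentinc-kafka-connect-appdynamics-metrics-1.1.2.zip
# Move it to a directory on the Connect worker's plugin.path
mv confluentinc-kafka-connect-appdynamics-metrics-1.1.2 /usr/local/share/kafka/plugins/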
License¶
You can use this connector for a 30-day trial period without a license key.
After 30 days, this connector is available under a Confluent enterprise license. Confluent issues enterprise license keys to subscribers, along with providing enterprise-level support for Confluent Platform and your connectors. If you are a subscriber, please contact Confluent Support at support@confluent.io for more information.
See Confluent Platform license for license properties and License topic configuration for information about the license topic.
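For example, a subscriber would add the license key to the connector configuration; if the property is omitted, the connector runs in trial mode for 30 days (the key shown is a placeholder):
confluent.license=<license-key>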
Configuration Properties¶
For a complete list of configuration properties for this connector, see AppDynamics Metrics Sink Connector Configuration Properties.
Quick Start¶
Complete the following steps to get the connector up and running.
Note
For an example of how to get Kafka Connect connected to Confluent Cloud, see Distributed Cluster in Connect Kafka Connect to Confluent Cloud.
Preliminary setup¶
Prior to running the connector, set up the AppDynamics account and the Controller. Once these are configured, install and configure the Machine Agent using the following documentation:
- Install the Standalone Machine Agent
- Configure the Standalone Machine Agent
- Standalone Machine Agent HTTP Listener
Set the following properties in the machine agent controller-info.xml file. Use the information from the AppDynamics account and the Controller configurations.
<controller-info>
  <controller-host></controller-host>
  <controller-port></controller-port>
  <controller-ssl-enabled></controller-ssl-enabled>
  <enable-orchestration></enable-orchestration>
  <account-access-key></account-access-key>
  <account-name></account-name>
  <sim-enabled></sim-enabled>
  <application-name></application-name>
  <tier-name></tier-name>
  <node-name></node-name>
</controller-info>
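For reference, a filled-in file might resemble the following (all values are hypothetical; use the values from your own AppDynamics account and Controller):
<controller-info>
  <controller-host>mycompany.saas.appdynamics.com</controller-host>
  <controller-port>443</controller-port>
  <controller-ssl-enabled>true</controller-ssl-enabled>
  <enable-orchestration>false</enable-orchestration>
  <account-access-key>my-access-key</account-access-key>
  <account-name>mycompany</account-name>
  <sim-enabled>false</sim-enabled>
  <application-name>my-application</application-name>
  <tier-name>my-tier</tier-name>
  <node-name>my-node</node-name>
</controller-info>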
To add the new connector plugin, you must restart Connect. Use the following Confluent CLI command to restart Connect:
Tip
The command syntax for the Confluent CLI development commands changed in 5.3.0. These commands have been moved to confluent local. For example, the syntax for confluent start is now confluent local start. For more information, see confluent local.
confluent local stop connect && confluent local start connect
Your output should resemble:
Using CONFLUENT_CURRENT: /Users/username/Sandbox/confluent-snapshots/var/confluent.NuZHxXfq
Starting zookeeper
zookeeper is [UP]
Starting kafka
kafka is [UP]
Starting schema-registry
schema-registry is [UP]
Starting kafka-rest
kafka-rest is [UP]
Starting connect
connect is [UP]
Verify that the AppDynamics plugin has been installed correctly and recognized by the plugin loader:
curl -sS localhost:8083/connector-plugins | jq '.[].class' | grep appdynamics
Example output:
"io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector"
Sink Connector Configuration¶
If not running, start Confluent Platform:
confluent local start
Create a configuration file named appdynamics-metrics-sink-config.json with the following contents:
{
  "name": "appdynamics-metrics-sink",
  "config": {
    "topics": "appdynamics-metrics-topic",
    "connector.class": "io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector",
    "tasks.max": "1",
    "machine.agent.host": "<host>",
    "machine.agent.port": "<port>",
    "behavior.on.error": "fail",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "reporter.bootstrap.servers": "localhost:9092",
    "reporter.result.topic.replication.factor": "1",
    "reporter.error.topic.replication.factor": "1",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
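Optionally, you can check the configuration against the Connect worker before loading it. The validation endpoint takes only the inner config map, not the wrapper object above (a sketch, assuming the worker runs at localhost:8083):
curl -sS -X PUT -H 'Content-Type: application/json' \
  --data '{"connector.class": "io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector", "topics": "appdynamics-metrics-topic", "machine.agent.host": "<host>", "machine.agent.port": "<port>", "behavior.on.error": "fail"}' \
  http://localhost:8083/connector-plugins/AppDynamicsMetricsSinkConnector/config/validate | jq '.error_count'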
Note
For details about using this connector with Kafka Connect Reporter, see Connect Reporter.
Enter the following command to start the AppDynamics Metrics sink connector:
Caution
You must include a double dash (--) between the connector name and your flag. For more information, see this post.
confluent local load appdynamics-metrics-sink -- -d appdynamics-metrics-sink-config.json
Verify that the connector started by viewing the Connect worker log. Enter the following command:
confluent local log connect
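You can also confirm that the connector and its task are in the RUNNING state through the Connect REST API:
curl -sS http://localhost:8083/connectors/appdynamics-metrics-sink/status | jq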
Produce test data to the appdynamics-metrics-topic topic in Kafka using the kafka-avro-console-producer command shown below.
kafka-avro-console-producer \
--broker-list localhost:9092 --topic appdynamics-metrics-topic \
--property value.schema='{"name": "metric","type": "record","fields": [{"name": "name","type": "string"},{"name": "dimensions", "type": {"name": "dimensions", "type": "record", "fields": [{"name": "aggregatorType", "type":"string"}]}},{"name": "values","type": {"name": "values","type": "record","fields": [{"name":"doubleValue", "type": "double"}]}}]}'
{"name":"Custom Metrics|Tier-1|CPU-Usage", "dimensions":{"aggregatorType":"AVERAGE"}, "values":{"doubleValue":5.639623848362502}}
You can view the metrics being produced using an AppDynamics Dashboard. You can produce Avro, schemaless JSON, and JSON string data to the Kafka topic.
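For example, if the connector were configured with value.converter set to org.apache.kafka.connect.json.JsonConverter and value.converter.schemas.enable=false instead of the Avro converter above, schemaless JSON records could be produced with the plain console producer (a sketch):
kafka-console-producer \
  --broker-list localhost:9092 --topic appdynamics-metrics-topic

{"name":"Custom Metrics|Tier-1|CPU-Usage", "dimensions":{"aggregatorType":"AVERAGE"}, "values":{"doubleValue":5.639623848362502}}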
When you are ready, stop Confluent services using the following command:
confluent local stop
Examples¶
Property-based example¶
Create a configuration file for the connector. This file is included with the connector in etc/kafka-connect-appdynamics-metrics/appdynamics-metrics-sink-connector.properties. This configuration is typically used for standalone workers.
name=appdynamics-metrics-sink
topics=appdynamics-metrics-topic
connector.class=io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector
tasks.max=1
machine.agent.host=<host>
machine.agent.port=<port>
behavior.on.error=fail
confluent.topic.bootstrap.servers=localhost:9092
confluent.topic.replication.factor=1
reporter.bootstrap.servers=localhost:9092
reporter.result.topic.replication.factor=1
reporter.error.topic.replication.factor=1
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
Note
Before starting the connector:
- Make sure to supply the machine.agent.host, machine.agent.port, and behavior.on.error properties.
- Make sure that the machine agent is set up and the controller configurations in the <machine-agent-path>/conf/controller-info.xml file are properly set. See Preliminary setup for additional information.
Tip
For details about using this connector with Kafka Connect Reporter, see Connect Reporter.
Enter the following command to load the configuration and start the connector:
Caution
You must include a double dash (--) between the connector name and your flag. For more information, see this post.
confluent local load appdynamics-metrics-sink -- -d appdynamics-metrics-sink-connector.properties
Example output:
{
  "name": "appdynamics-metrics-sink",
  "config": {
    "connector.class": "io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector",
    "tasks.max": "1",
    "topics": "appdynamics-metrics-topic",
    "machine.agent.host": "<host>",
    "machine.agent.port": "<port>",
    "behavior.on.error": "fail",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "reporter.bootstrap.servers": "localhost:9092",
    "reporter.result.topic.replication.factor": "1",
    "reporter.error.topic.replication.factor": "1",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  },
  "tasks": []
}
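Alternatively, because this is a standalone-style properties file, it could also be run directly with a standalone Connect worker (a sketch, assuming the Avro-enabled worker configuration that ships with Confluent Platform; adjust paths to your installation):
./bin/connect-standalone \
  etc/schema-registry/connect-avro-standalone.properties \
  etc/kafka-connect-appdynamics-metrics/appdynamics-metrics-sink-connector.properties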
REST-based example¶
This configuration is typically used with distributed workers. Create a JSON file named connector.json and enter all the required properties. An example of the JSON to use is provided below:
{
  "name": "appdynamics-metrics-sink",
  "config": {
    "connector.class": "io.confluent.connect.appdynamics.metrics.AppDynamicsMetricsSinkConnector",
    "tasks.max": "1",
    "topics": "appdynamics-metrics-topic",
    "machine.agent.host": "<host>",
    "machine.agent.port": "<port>",
    "behavior.on.error": "fail",
    "confluent.topic.bootstrap.servers": "localhost:9092",
    "confluent.topic.replication.factor": "1",
    "reporter.bootstrap.servers": "localhost:9092",
    "reporter.result.topic.replication.factor": "1",
    "reporter.error.topic.replication.factor": "1",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
Note
For details about using this connector with Kafka Connect Reporter, see Connect Reporter.
Use curl to post the configuration to one of the Connect workers. Change http://localhost:8083/ to the endpoint of the Connect worker.
curl -sS -X POST -H 'Content-Type: application/json' --data @connector.json http://localhost:8083/connectors
For more information, see the Kafka Connect REST API.
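When you are done testing, you can check or remove the connector through the same REST API:
# View the connector status
curl -sS http://localhost:8083/connectors/appdynamics-metrics-sink/status

# Delete the connector
curl -sS -X DELETE http://localhost:8083/connectors/appdynamics-metrics-sink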