Installing Clients¶
Confluent Platform includes client libraries for multiple languages that provide both low-level access to Kafka and higher-level stream processing. These libraries are available through the native packaging systems for each language.
Java¶
All JARs included in the packages are also available in the Confluent Maven repository. Here’s a sample POM file showing how to add this repository:
<repositories>
    <repository>
        <id>confluent</id>
        <url>https://packages.confluent.io/maven/</url>
    </repository>
    <!-- further repository entries here -->
</repositories>
The Confluent Maven repository includes compiled versions of Kafka. To reference the Kafka version 1.1.1-cp2 that is included with Confluent Platform 4.1.3, use the following in your pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.11</artifactId>
        <version>1.1.1-cp2</version>
    </dependency>
    <!-- further dependency entries here -->
</dependencies>
Note
Version names of Kafka in Apache vs. Kafka in Confluent Platform:
Confluent always contributes patches back to the Apache Kafka open source project. However, the exact versions (and version names) included in Confluent Platform may differ from the Apache artifacts when Confluent Platform and Apache Kafka releases do not align. If they differ, Confluent keeps the groupId and artifactId identical but appends the suffix -cpX (with X being a digit) to the version identifier of the Confluent Platform version to distinguish these artifacts from the Apache artifacts.
You can reference artifacts for all Java libraries that are included with Confluent Platform. For example, to use Confluent’s open source serializers that integrate with the rest of the Confluent Platform, you would include the following in your pom.xml:
<dependencies>
    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <!-- For Confluent Platform 4.1.3 -->
        <version>4.1.3</version>
    </dependency>
    <!-- further dependency entries here -->
</dependencies>
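With that dependency in place, a producer can be configured to use the Avro serializer together with Schema Registry. The snippet below is only a minimal sketch; the broker address, Schema Registry URL, topic name, and schema are illustrative assumptions rather than values from this documentation:

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker address
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // assumed Schema Registry address

        // A toy schema and record for illustration; the Avro serializer registers the
        // schema with Schema Registry and encodes the value in Avro binary format.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Payment\","
            + "\"fields\":[{\"name\":\"amount\",\"type\":\"double\"}]}");
        GenericRecord payment = new GenericData.Record(schema);
        payment.put("amount", 10.0);

        try (KafkaProducer<String, Object> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("payments", "payment-1", payment)); // assumed topic
            producer.flush();
        }
    }
}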
C/C++¶
The C/C++ client, called librdkafka, is available in source form and as precompiled binaries for Debian- and Red Hat-based Linux distributions, and macOS. Most users will want to use the precompiled binaries.
For Linux distributions, follow the instructions for Debian or Red Hat distributions to set up the repositories, then use yum or apt-get to install the appropriate packages. For example, a developer building a C application on a Red Hat-based distribution would use the librdkafka-devel package:
sudo yum install librdkafka-devel
And on a Debian-based distribution they would use the librdkafka-dev package:
sudo apt-get install librdkafka-dev
On macOS, the latest release is available via Homebrew:
brew install librdkafka
The source code is also available in the ZIP and TAR archives under the directory src/.
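Once the library and its development headers are installed, applications can be compiled directly against librdkafka. The following is a minimal producer sketch (the broker address and topic name are assumptions, and error handling is kept to a bare minimum); it can be built with something like gcc producer.c -o producer -lrdkafka:

#include <stdio.h>
#include <string.h>
#include <librdkafka/rdkafka.h>

int main(void) {
    char errstr[512];

    /* Create the client configuration and point it at the (assumed) broker. */
    rd_kafka_conf_t *conf = rd_kafka_conf_new();
    if (rd_kafka_conf_set(conf, "bootstrap.servers", "localhost:9092",
                          errstr, sizeof(errstr)) != RD_KAFKA_CONF_OK) {
        fprintf(stderr, "%s\n", errstr);
        return 1;
    }

    /* Create the producer instance; it takes ownership of conf. */
    rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr));
    if (!rk) {
        fprintf(stderr, "%s\n", errstr);
        return 1;
    }

    /* Produce a single message to the (assumed) topic "test". */
    const char *payload = "hello from librdkafka";
    rd_kafka_resp_err_t err = rd_kafka_producev(
        rk,
        RD_KAFKA_V_TOPIC("test"),
        RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
        RD_KAFKA_V_END);
    if (err)
        fprintf(stderr, "produce failed: %s\n", rd_kafka_err2str(err));

    /* Wait up to 10 seconds for delivery, then clean up. */
    rd_kafka_flush(rk, 10 * 1000);
    rd_kafka_destroy(rk);
    return 0;
}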
JMS¶
The JMS client is a library that you use from within your Java applications.
To reference kafka-jms-client in a Maven based project, first add the Confluent Maven repository to your pom.xml:
<repositories>
    <repository>
        <id>confluent</id>
        <url>http://packages.confluent.io/maven/</url>
    </repository>
</repositories>
Then add a dependency on the Confluent JMS client as well as the JMS API specification (note: replace the text [version] with 4.1.3):
<dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-jms-client</artifactId>
    <version>[version]</version>
</dependency>
<dependency>
    <groupId>org.apache.geronimo.specs</groupId>
    <artifactId>geronimo-jms_1.1_spec</artifactId>
    <version>1.1</version>
</dependency>
If you don’t use Maven, you can download the JMS Client JAR file directly by navigating to the following URL (note: replace the text [version] with 4.1.3).
If you require a JAR that includes the JMS Client and all of its dependencies, see Appendix 1 - Creating a Shaded Fat .jar.
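Because kafka-jms-client implements the standard JMS 1.1 interfaces, publishing a message looks like ordinary JMS code once you have obtained a ConnectionFactory from the library. The sketch below uses only the javax.jms API; the connectionFactory parameter and the queue name are placeholders for illustration, so consult the JMS client documentation for how to construct the factory itself:

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

public class JmsSendExample {
    // connectionFactory is assumed to come from the Confluent JMS client; see the
    // kafka-jms-client documentation for how to construct it.
    public static void send(ConnectionFactory connectionFactory) throws Exception {
        Connection connection = connectionFactory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("test-queue"); // placeholder destination name
            MessageProducer producer = session.createProducer(queue);
            TextMessage message = session.createTextMessage("hello from kafka-jms-client");
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}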
Python¶
The Python client, called confluent-kafka-python, is available on PyPI. The Python client uses librdkafka, the C client, internally. To install the Python client, first install the C client including its development package, then install the library with pip (for both Linux and macOS):
pip install confluent-kafka
Note that this will install the package globally for your Python environment. You may also use a virtualenv to install it only for your project.
Then in Python you can import and use the library:
from confluent_kafka import Producer

conf = {'bootstrap.servers': 'localhost:9092', 'client.id': 'test', 'default.topic.config': {'acks': 'all'}}
producer = Producer(conf)
producer.produce('my-topic', key='key', value='value')  # replace 'my-topic' with an existing topic name
producer.flush()  # wait for any outstanding messages to be delivered
See the clients documentation for more examples.
The source code is also available in the ZIP and TAR archives under the directory src/.
Go¶
The Go client, called confluent-kafka-go, is distributed via GitHub and gopkg.in to pin to specific versions. The Go client uses librdkafka, the C client, internally and exposes it as a Go library using cgo. To install the Go client, first install the C client including its development package, as well as a C build toolchain including pkg-config. On Red Hat-based Linux distributions, install the following packages in addition to librdkafka:
sudo yum groupinstall "Development Tools"
On Debian-based distributions, install the following in addition to librdkafka:
sudo apt-get install build-essential pkg-config git
On macOS using Homebrew, install the following:
brew install pkg-config git
Next, use go get to install the library:
go get gopkg.in/confluentinc/confluent-kafka-go.v0/kafka
Your Go code can now import and use the client. You can also build and run a small command line utility, go-kafkacat, to ensure the installation was successful:
go get gopkg.in/confluentinc/confluent-kafka-go.v0/examples/go-kafkacat
$GOPATH/bin/go-kafkacat --help
If you would like to statically link librdkafka, add the flag -tags static to the go get commands. This will statically link librdkafka itself so its dynamic library will not be required on the target deployment system. Note, however, that librdkafka’s dependencies that are linked statically (such as ssl, sasl2, lz4, etc.) will still be linked dynamically and required on the target system. An experimental option for creating a completely statically linked binary is available as well: use the flag -tags static_all. This requires all dependencies to be available as static libraries (e.g., libsasl2.a). Static libraries are typically not installed by default but are available in the corresponding -dev or -devel packages (e.g., libsasl2-dev).
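As a quick check that cgo and librdkafka are wired up correctly, you can also produce a message from your own Go code. The following is a minimal sketch; the broker address and topic name are assumptions:

package main

import (
	"fmt"

	"gopkg.in/confluentinc/confluent-kafka-go.v0/kafka"
)

func main() {
	// "localhost:9092" is an assumed broker address.
	p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
	if err != nil {
		panic(err)
	}
	defer p.Close()

	// "test" is an assumed topic name.
	topic := "test"
	err = p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("hello from confluent-kafka-go"),
	}, nil)
	if err != nil {
		panic(err)
	}

	// Wait up to 15 seconds for delivery before exiting.
	remaining := p.Flush(15 * 1000)
	fmt.Printf("%d message(s) were not delivered\n", remaining)
}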
See the clients documentation for code examples showing how to use the library.
The source code is also available in the ZIP and TAR archives under the directory src/.
.NET¶
The .NET client, called confluent-kafka-dotnet, is available on NuGet. Internally, the .NET client uses librdkafka, the C client. Precompiled binaries for librdkafka are provided via the dependent librdkafka.redist NuGet package for a number of popular platforms (win-x64, win-x86, debian-x64, rhel-x64 and osx).
To reference confluent-kafka-dotnet from within a Visual Studio project, run the following command in the Package Manager Console:
PM> Install-Package Confluent.Kafka
Note
The dependent librdkafka.redist package will be installed automatically.
To reference confluent-kafka-dotnet in a .NET Core project.json file, include the following reference in the dependencies section:
"dependencies": {
...
"Confluent.Kafka": "0.9.4"
...
}
and then execute the dotnet restore command to restore project dependencies via NuGet.
confluent-kafka-dotnet targets frameworks net451 and netstandard1.3 and is supported on .NET Framework 4.5.1 and higher and on .NET Core 1.0 and higher (on Windows, Linux, and Mac). We do not support confluent-kafka-dotnet on Mono.