.. _clients:

==================
Installing Clients
==================

|CP| includes client libraries for multiple languages that provide both low-level access to Kafka and higher-level stream processing. These libraries are available through the native packaging systems for each language.

.. include:: ../includes/clients.rst

.. _installation_maven:

Java
^^^^

All JARs included in the packages are also available in the Confluent Maven
repository. Here's a sample POM file showing how to add this repository:

.. sourcecode:: xml

    <repositories>
        <repository>
            <id>confluent</id>
            <url>https://packages.confluent.io/maven/</url>
        </repository>
    </repositories>
The Confluent Maven repository includes compiled versions of Kafka.
To reference the Kafka version |kafka-version| that is included with |CP| |release|,
use the following in your ``pom.xml``:

.. codewithvars:: xml

    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_|scala_version|</artifactId>
        <version>|kafka_release|</version>
    </dependency>

.. note::

    **Version names of Kafka in Apache vs. Kafka in Confluent Platform:**
    Confluent always contributes patches back to the Apache Kafka open source project.
    However, the exact versions (and version names) included in |CP| may differ
    from the Apache artifacts when |CP| and Apache Kafka releases do not align.
    If the versions differ, Confluent keeps the ``groupId`` and ``artifactId``
    identical, but appends the suffix ``-cpX`` (where ``X`` is a digit) to the
    version identifier of the |CP| version to distinguish these artifacts from
    the Apache artifacts.
You can reference artifacts for all Java libraries that are included with |CP|. For example, to use Confluent's
open source serializers that integrate with the rest of |CP|, include the following in your ``pom.xml``:

.. codewithvars:: xml

    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>|release|</version>
    </dependency>
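Once the dependency is on the classpath, the Avro serializer plugs into a standard Java producer. The following is only an illustrative sketch, not code shipped with |CP|: the broker address, Schema Registry URL, topic name, and schema are placeholder values.

.. sourcecode:: java

    import java.util.Properties;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");          // placeholder broker address
            props.put("schema.registry.url", "http://localhost:8081"); // placeholder Schema Registry address
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

            // A tiny Avro schema defined inline for illustration.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
            GenericRecord user = new GenericData.Record(schema);
            user.put("name", "alice");

            // Send one record; close() flushes any outstanding messages.
            try (Producer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("users", "key1", user));
            }
        }
    }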
.. _cpp_client:

C/C++
^^^^^

The C/C++ client, called librdkafka, is available in source form and as precompiled binaries for Debian-based and
Red Hat-based Linux distributions, and for macOS. Most users will want to use the precompiled binaries.

For Linux distributions, follow the instructions for :ref:`Debian ` or :ref:`Red Hat ` distributions to set up the
repositories, then use ``yum`` or ``apt-get`` to install the appropriate :ref:`packages `. For example, a developer
building a C application on a Red Hat-based distribution would use the ``librdkafka-devel`` package:

.. sourcecode:: bash

    sudo yum install librdkafka-devel
And on a Debian-based distribution they would use the ``librdkafka-dev`` package:

.. sourcecode:: bash

    sudo apt-get install librdkafka-dev

On macOS, the latest release is available via `Homebrew <https://brew.sh/>`_:

.. sourcecode:: bash

    brew install librdkafka
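To verify that the headers and library are installed where the build toolchain can find them, you can compile a small test program such as the following (an illustrative sketch, not part of the packages):

.. sourcecode:: c

    /* rdkafka_version.c: print the librdkafka version to confirm that the
     * development headers and the library are installed and linkable. */
    #include <stdio.h>
    #include <librdkafka/rdkafka.h>

    int main(void) {
        printf("librdkafka %s\n", rd_kafka_version_str());
        return 0;
    }

Build and run it with ``pkg-config`` supplying the compiler and linker flags:

.. sourcecode:: bash

    cc -o rdkafka_version rdkafka_version.c $(pkg-config --cflags --libs rdkafka)
    ./rdkafka_version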
The source code is also available in the :ref:`ZIP and TAR archives ` under the directory ``src/``.

.. _jms_client:

JMS
^^^

The JMS client is a library that you use from within your Java applications.
To reference ``kafka-jms-client`` in a Maven-based project, first add the Confluent Maven repository to your ``pom.xml``::

    <repositories>
        <repository>
            <id>confluent</id>
            <url>http://packages.confluent.io/maven/</url>
        </repository>
    </repositories>
Then add a dependency on the Confluent JMS client as well as the JMS API specification (note:
replace the text ``[version]`` with |release|)::

    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-jms-client</artifactId>
        <version>[version]</version>
    </dependency>

    <dependency>
        <groupId>org.apache.geronimo.specs</groupId>
        <artifactId>geronimo-jms_1.1_spec</artifactId>
        <version>1.1</version>
    </dependency>
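With the dependencies in place, your application code uses the standard JMS API. The sketch below shows only the provider-independent part; how you obtain the ``ConnectionFactory`` from kafka-jms-client (and which configuration settings it needs) is described in the JMS client documentation, so the factory is simply passed in as a parameter here, and the queue name is a placeholder.

.. sourcecode:: java

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Destination;
    import javax.jms.JMSException;
    import javax.jms.MessageProducer;
    import javax.jms.Session;

    public class JmsSendSketch {

        // The ConnectionFactory is created by kafka-jms-client; see its
        // documentation for the factory class and required settings.
        public static void send(ConnectionFactory factory) throws JMSException {
            Connection connection = factory.createConnection();
            try {
                connection.start();
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                Destination destination = session.createQueue("test-queue"); // placeholder name
                MessageProducer producer = session.createProducer(destination);
                producer.send(session.createTextMessage("hello from JMS"));
            } finally {
                connection.close();
            }
        }
    }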
If you don't use Maven, you can download the JMS Client JAR file directly by navigating to the
following URL (note: replace the text ``[version]`` with |release|):

.. codewithvars::

    http://packages.confluent.io/maven/io/confluent/kafka-jms-client/[version]/kafka-jms-client-[version].jar

If you require a JAR that includes the JMS Client and all of its dependencies, see :ref:`appendix_1`.

.. _python_client:

Python
^^^^^^

The Python client, called confluent-kafka-python, is available on `PyPI <https://pypi.org/>`_. The
Python client uses librdkafka, the C client, internally. To install the Python client, first install
:ref:`the C client ` including its development package, then install the library with ``pip`` (for both Linux and macOS):

.. sourcecode:: bash

    pip install confluent-kafka

Note that this will install the package globally for your Python environment. You may also use a `virtualenv
<https://virtualenv.pypa.io/>`_ to install it only for your project.
Then in Python you can import and use the library:

.. sourcecode:: python

    from confluent_kafka import Producer

    # Connect to a local broker and wait for acknowledgment from all in-sync replicas.
    conf = {'bootstrap.servers': 'localhost:9092', 'client.id': 'test', 'default.topic.config': {'acks': 'all'}}
    producer = Producer(conf)

    # Produce a single message to the 'test' topic, then wait for delivery to complete.
    producer.produce('test', key='key', value='value')
    producer.flush()
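A matching consumer sketch (the broker address, group id, and topic name are placeholders):

.. sourcecode:: python

    from confluent_kafka import Consumer

    conf = {'bootstrap.servers': 'localhost:9092',
            'group.id': 'example-group',
            'default.topic.config': {'auto.offset.reset': 'earliest'}}
    consumer = Consumer(conf)
    consumer.subscribe(['test'])

    # Poll once for a message; a real application would loop here.
    msg = consumer.poll(10.0)
    if msg is None:
        print('no message received within the timeout')
    elif msg.error():
        print('consumer error: {}'.format(msg.error()))
    else:
        print('received: {}'.format(msg.value()))

    consumer.close()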
See the `clients documentation `_ for more examples.

The source code is also available in the :ref:`ZIP and TAR archives ` under the directory ``src/``.

.. _go_client:

Go
^^

The Go client, called confluent-kafka-go, is distributed via `GitHub <https://github.com/confluentinc/confluent-kafka-go>`_
and `gopkg.in <https://gopkg.in/confluentinc/confluent-kafka-go.v0>`_ to pin to specific versions. The Go client uses
librdkafka, the C client, internally and exposes it as a Go library using `cgo <https://golang.org/cmd/cgo/>`_. To install
the Go client, first install :ref:`the C client ` including its development package, as well as a C build toolchain
including ``pkg-config``. On Red Hat-based Linux distributions, install the following packages in addition to librdkafka:

.. sourcecode:: bash

    sudo yum groupinstall "Development Tools"

On Debian-based distributions, install the following in addition to librdkafka:

.. sourcecode:: bash

    sudo apt-get install build-essential pkg-config git

On macOS using `Homebrew <https://brew.sh/>`_, install the following:

.. sourcecode:: bash

    brew install pkg-config git

Next, use ``go get`` to install the library:

.. sourcecode:: bash

    go get gopkg.in/confluentinc/confluent-kafka-go.v0/kafka
Your Go code can now import and use the client. You can also build and run a small command line utility, ``go-kafkacat``,
to ensure the installation was successful:

.. sourcecode:: bash

    go get gopkg.in/confluentinc/confluent-kafka-go.v0/examples/go-kafkacat
    $GOPATH/bin/go-kafkacat --help
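As a minimal sketch of using the client from your own code (the broker address and topic name are placeholders):

.. sourcecode:: go

    package main

    import (
        "fmt"

        "gopkg.in/confluentinc/confluent-kafka-go.v0/kafka"
    )

    func main() {
        // Create a producer connected to a local broker (adjust the address as needed).
        p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
        if err != nil {
            panic(err)
        }
        defer p.Close()

        // Produce a single message to the "test" topic.
        topic := "test"
        err = p.Produce(&kafka.Message{
            TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
            Value:          []byte("hello from Go"),
        }, nil)
        if err != nil {
            panic(err)
        }

        // Wait up to 15 seconds for outstanding deliveries to complete.
        remaining := p.Flush(15 * 1000)
        fmt.Printf("flush complete, %d message(s) still outstanding\n", remaining)
    }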
If you would like to statically link librdkafka, add the flag ``-tags static`` to the ``go get`` commands. This
statically links librdkafka itself, so its dynamic library will not be required on the target deployment system. Note,
however, that librdkafka's own dependencies (such as ssl, sasl2, and lz4) will still be linked dynamically and required
on the target system. An experimental option for creating a completely statically linked binary is available as well:
use the flag ``-tags static_all``. This requires all dependencies to be available as static libraries (for example,
``libsasl2.a``). Static libraries are typically not installed by default but are available in the corresponding
``-dev`` or ``-devel`` packages (for example, ``libsasl2-dev``).

See the `clients documentation `_ for code examples showing how to use the library.

The source code is also available in the :ref:`ZIP and TAR archives ` under the directory ``src/``.

.. _dotnet_client:

.NET
^^^^

The .NET client, called confluent-kafka-dotnet, is available on `NuGet <https://www.nuget.org/>`_.
Internally, the .NET client uses librdkafka, the C client. Precompiled binaries for librdkafka are provided via the
dependent `librdkafka.redist <https://www.nuget.org/packages/librdkafka.redist/>`_ NuGet package for a number
of popular platforms (win-x64, win-x86, debian-x64, rhel-x64, and osx).

To reference confluent-kafka-dotnet from within a Visual Studio project, run the following command in the
Package Manager Console:

.. sourcecode:: bash

    PM> Install-Package Confluent.Kafka
.. note::

    The dependent librdkafka.redist package will be installed automatically.

To reference confluent-kafka-dotnet in a .NET Core ``project.json`` file, include the following reference in the
``dependencies`` section:

.. sourcecode:: json

    "dependencies": {
        ...
        "Confluent.Kafka": "0.9.4"
        ...
    }

Then execute the ``dotnet restore`` command to restore project dependencies via NuGet.

confluent-kafka-dotnet targets the net451 and netstandard1.3 frameworks. It is supported on .NET Framework
version 4.5.1 and higher and on .NET Core 1.0 and higher (on Windows, Linux, and Mac). We do not support
confluent-kafka-dotnet on Mono.