In the 0.9.0.0 release, the Kafka community added a number of features that can be used, together or separately, to secure a Kafka cluster. The following security measures are currently supported:
- Authentication of connections to brokers from clients (producers and consumers), other brokers, and tools, using either SSL or SASL. SASL supports the Kerberos (GSSAPI) mechanism; SASL/PLAIN can also be used from release 0.10.0.0 onwards
- Authentication of connections from brokers to ZooKeeper
- Encryption of data transferred between brokers and clients, between brokers, or between brokers and tools, using SSL (note that enabling SSL incurs a performance cost whose magnitude depends on the CPU type and the JVM implementation)
- Authorization of read/write operations by clients
- Pluggable authorization, with support for integration with external authorization services
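As a sketch of what client-level authorization looks like in practice, the command below grants a principal read and write access to a topic using the `kafka-acls.sh` tool shipped with Kafka. The ZooKeeper address, principal name, and topic name are placeholders for illustration; adapt them to your cluster.

```
bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:alice \
  --operation Read --operation Write \
  --topic test-topic
```

Running the same tool with `--list --topic test-topic` shows the ACLs currently in effect for that topic.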
It’s worth noting that security is optional: non-secured clusters are supported, as is a mix of authenticated, unauthenticated, encrypted, and non-encrypted clients.
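One way such a mix is typically achieved is by configuring multiple listeners on each broker, one per security protocol, so that different client populations connect over different ports. The fragment below is a minimal, illustrative `server.properties` sketch; the hostnames, ports, and file paths are placeholders, not values from this document.

```
# Illustrative broker configuration: one listener per security protocol.
# Hostname, ports, and keystore paths below are placeholder values.
listeners=PLAINTEXT://kafka1:9092,SSL://kafka1:9093,SASL_SSL://kafka1:9094

# Use SSL for traffic between brokers.
security.inter.broker.protocol=SSL

# SSL key and trust stores (placeholder paths and passwords).
ssl.keystore.location=/var/private/ssl/kafka.server.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/var/private/ssl/kafka.server.truststore.jks
ssl.truststore.password=changeit
```

With this layout, legacy clients can keep using the plaintext port while new clients connect over SSL or SASL_SSL, which makes an incremental migration to a fully secured cluster possible.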
See operating a secure cluster in the Confluent Platform for suggestions on using other Confluent Platform components with a secured Kafka cluster.
The guides below explain how to configure and use the security features in both clients and brokers.
- Encryption and Authentication using SSL
- Authentication using SASL
  - SASL configuration for Kafka brokers
  - SASL configuration for Kafka Clients
  - Authentication using SASL/Kerberos
  - Authentication using SASL/PLAIN
  - Enabling multiple SASL mechanisms in a broker
  - Modifying SASL mechanisms in a Running Cluster
  - Enabling Logging for SASL
- Authorization and ACLs
- Adding Security to a Running Cluster
- ZooKeeper Authentication
- Kafka Security & the Confluent Platform