
confluent local consume

Description

Consume data from topics. By default, this command consumes binary data from the Apache Kafka® cluster on localhost.

Important

The confluent local commands are intended for a single-node development environment and are not suitable for a production environment. The data that are produced are transient and intended to be temporary. For production-ready workflows, see Install and Upgrade Confluent Platform.

confluent local consume <topicname> -- [flags] --value-format <format> --path <path-to-confluent>

Caution

You must include a double dash (--) between the topic name and the command flags.
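
For example, here is a minimal sketch; the topic name mytopic is a placeholder, and --from-beginning is documented in the flags table below:

    # The double dash separates the topic name from the flags that follow it
    confluent local consume mytopic -- --from-beginning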

Flags

Tip

You must either specify the path with each Confluent CLI confluent local command invocation, export the path as an environment variable for each terminal session, or set the path to your Confluent Platform installation in your shell profile. For example:

cat ~/.bash_profile
export CONFLUENT_HOME=<path-to-confluent>
export PATH="${CONFLUENT_HOME}/bin:$PATH"
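
The other two options look like the following sketch; <path-to-confluent> is a placeholder for your installation path:

    # Export the path for the current terminal session only
    export CONFLUENT_HOME=<path-to-confluent>
    export PATH="${CONFLUENT_HOME}/bin:$PATH"

    # Or pass the path with a single command invocation using the --path flag
    confluent local consume <topicname> -- --path <path-to-confluent>
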
Name, shorthand               Default          Description
--value-format <format>                        Format of the topic data: AVRO, PROTOBUF, or JSONSCHEMA.
--cloud                                        Connect to Confluent Cloud, using a user-created configuration file located by default at $HOME/.ccloud/config.
--config <path-to-file>                        Specify an alternate location of the user-created configuration file. See the example configuration file below.
--from-beginning              latest           Consume from the earliest message in the topic log.
--bootstrap-server            localhost:9092   Kafka brokers to connect to.
--path <path-to-confluent>                     Path to the Confluent Platform install directory.
-h, --help                                     Print command information.

Positional arguments

Name, shorthand               Default          Description
<topicname>                                    Kafka topic to consume messages from.
<path-to-confluent>                            The relative path to Confluent Platform. You can also define this as an environment variable named CONFLUENT_HOME.
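
To illustrate how the flags and positional arguments compose, here is a minimal sketch; the topic name orders is hypothetical, and the broker address is the documented default:

    confluent local consume orders -- --bootstrap-server localhost:9092 --from-beginning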

Examples

Tip

For reference on how to use the Confluent CLI to produce to a topic, see confluent local produce.

  • Consume Avro data from the beginning of a topic called mytopic1 on a development Kafka cluster on localhost. Assumes Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flag.

    confluent local consume mytopic1 -- --value-format avro --from-beginning
    
  • Consume newly arriving non-Avro data from a topic called mytopic2 on a development Kafka cluster on localhost. Note the double dash (--) between the topic name and your flag.

    confluent local consume mytopic2
    

Examples for Confluent Cloud

  • Create a Confluent Cloud configuration file with connection details for the Confluent Cloud cluster, using the format shown in the following example, and save it as /tmp/myconfig.properties. You can specify the file location using --config <filename>.

    bootstrap.servers=<broker endpoint>
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="<api-key>" \
    password="<api-secret>";
    basic.auth.credentials.source=USER_INFO
    schema.registry.basic.auth.user.info=<username:password>
    schema.registry.url=<sr endpoint>
    
  • Consume non-Avro data from the beginning of a topic named mytopic3 in Confluent Cloud, using a user-specified Confluent Cloud configuration file at /tmp/myconfig.properties. Note the double dash (--) between the topic name and your flag.

    confluent local consume mytopic3 -- --cloud --config /tmp/myconfig.properties --from-beginning
    
  • Consume messages with keys and non-Avro values from the beginning of a topic called mytopic4 in Confluent Cloud, using a user-specified Confluent Cloud configuration file at /tmp/myconfig.properties. Note the double dash (--) between the topic name and your flag. See the sample Confluent Cloud configuration file above; a key-separator variation of this command is sketched after these examples.

    confluent local consume mytopic4 -- --cloud --config /tmp/myconfig.properties --from-beginning --property print.key=true
    
  • Consume Avro data from a topic called mytopic5 in Confluent Cloud. Assumes Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flag.

    confluent local consume mytopic5 -- --cloud --config /tmp/myconfig.properties --value-format avro \
    --from-beginning --property schema.registry.url=http://localhost:8081
    
  • Consume Avro data from a topic called mytopic6 in Confluent Cloud. Assumes you are using Confluent Cloud Schema Registry. Note the double dash (--) between the topic name and your flag.

    confluent local consume mytopic6 -- --cloud --config /tmp/myconfig.properties --value-format avro \
    --from-beginning --property schema.registry.url=https://<SR ENDPOINT> \
    --property basic.auth.credentials.source=USER_INFO \
    --property schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
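
Expanding on the print.key example above, the following sketch also sets a key separator. It assumes that --property values are forwarded to the underlying console consumer formatter, which supports key.separator; treat key.separator as an assumption rather than a documented confluent local flag.

    # Sketch: print keys and values separated by a comma (assumes key.separator
    # is forwarded to the console consumer formatter)
    confluent local consume mytopic4 -- --cloud --config /tmp/myconfig.properties --from-beginning \
    --property print.key=true --property key.separator=","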
    

Tip

To easily try out the Confluent CLI functionality in your Confluent Cloud cluster, see the Confluent CLI demo.