confluent local produce¶
Description¶
Produce data to topics. By default, this command produces non-Avro data to the Apache Kafka® cluster on localhost.
Important
The confluent local commands are intended for a single-node development environment and are not suitable for a production environment. The data produced are transient and intended to be temporary. For production-ready workflows, see Install and Upgrade Confluent Platform.
confluent local produce <topicname> [--value-format <format> --property value.schema=<schema>] [--cloud]
[other optional args] --path <path-to-confluent>
Caution
You must include a double dash (--) between the topic name and your flag. For more information, see this post.
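For example, with a hypothetical topic named mytopic, all flags follow the double dash:
confluent local produce mytopic -- --broker-list localhost:9092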
Flags¶
Tip
You must either specify the path for each Confluent CLI confluent local command invocation, export the path as an environment variable for each terminal session, or set the path to your Confluent Platform installation in your shell profile. For example:
cat ~/.bash_profile
export CONFLUENT_HOME=<path-to-confluent>
export PATH="${CONFLUENT_HOME}/bin:$PATH"
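Alternatively, following the synopsis above, the path can be supplied per invocation with the --path flag (the topic name and install directory below are placeholders):
confluent local produce mytopic --path <path-to-confluent>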
Name, shorthand | Default | Description
---|---|---
--path <path-to-confluent> | | Path to Confluent Platform install directory.
-h, --help | | Print command information.
--value-format <format> | | Specify the topic data as AVRO, PROTOBUF, or JSONSCHEMA.
--property value.schema=<schema> | | Provide the schema for the data.
--cloud | | Connect to Confluent Cloud using a user-created configuration file located at $HOME/.ccloud/config.
--config <path-to-file> | | Specify the location of a user-created configuration file. See the example configuration file.
--broker-list | localhost:9092 | Kafka brokers to connect to.
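As a sketch of the non-Avro schema formats, and assuming the same schema pass-through behavior with a local Schema Registry at http://localhost:8081, a Protobuf value schema could be supplied in the same way (mytopic is a placeholder):
confluent local produce mytopic -- --value-format protobuf --property value.schema='syntax = "proto3"; message MyRecord { string f1 = 1; }'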
Positional arguments¶
Name, shorthand | Default | Description
---|---|---
<topicname> | | Kafka topic to produce messages to.
<path-to-confluent> | | The relative path to Confluent Platform. You can also define this as an environment variable named CONFLUENT_HOME.
Examples¶
Tip
For reference on how to use the Confluent CLI to consume from a topic, see confluent local consume.
Produce Avro data to a topic called mytopic1 on a development Kafka cluster on localhost. Assumes Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flag.
confluent local produce mytopic1 -- --value-format avro --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
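After the producer starts, each line of input is encoded with the supplied schema; a record for the schema above could be entered as, for example:
{"f1": "value1"}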
Produce non-Avro data to a topic called mytopic2 on a development Kafka cluster on localhost:
confluent local produce mytopic2
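The producer then reads messages from standard input, one message per line; for example:
my first message
my second message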
Confluent Cloud Examples¶
Create a customized Confluent Cloud configuration file with connection details for the Confluent Cloud cluster using the format shown in this example, and save as /tmp/myconfig.properties. You can specify the file location using --config <filename>.
bootstrap.servers=<broker endpoint>
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="<api-key>" \
password="<api-secret>";
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<username:password>
schema.registry.url=<sr endpoint>
Produce non-Avro data to a topic called mytopic3 in Confluent Cloud. Assumes topic has already been created. Note the double dash (--) between the topic name and your flag.
confluent local produce mytopic3 -- --cloud --config /tmp/myconfig.properties
Produce messages with keys and non-Avro values to a topic called mytopic4 in Confluent Cloud, using a user-specified Confluent Cloud configuration file at /tmp/myconfig.properties. Assumes topic has already been created. Note the double dash (--) between the topic name and your flag.
confluent local produce mytopic4 -- --cloud --config /tmp/myconfig.properties --property parse.key=true --property key.separator=,
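With parse.key=true and a comma as key.separator, each input line is split at the separator into a message key and value; for example:
key1,my first message
key2,my second message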
Produce Avro data to a topic called mytopic5 in Confluent Cloud. Assumes topic has already been created, and Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flag.
confluent local produce mytopic5 -- --cloud --config /tmp/myconfig.properties --value-format avro --property \
value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
--property schema.registry.url=http://localhost:8081
Produce Avro data to a topic called mytopic6 in Confluent Cloud. Assumes topic has already been created and you are using Confluent Cloud Schema Registry. Note the double dash (--) between the topic name and your flag.
confluent local produce mytopic6 -- --cloud --config /tmp/myconfig.properties --value-format avro --property \
value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
--property schema.registry.url=https://<SR ENDPOINT> \
--property basic.auth.credentials.source=USER_INFO \
--property schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
Tip
To easily try out the Confluent CLI functionality in your Confluent Cloud cluster, see the Confluent CLI demo.