confluent local produce¶
Description¶
Produce data to topics. By default, this command produces non-Avro data to the Apache Kafka® cluster on localhost.
Important
The confluent local commands are intended for a single-node development environment and are not suitable for a production environment. The data produced are transient and intended to be temporary. For production-ready workflows, see Install and Upgrade.
confluent local produce <topicname> [--value-format avro --property value.schema=<schema>] [--cloud]
[other optional args] --path <path-to-confluent>
Caution
You must include a double dash (--) between the topic name and your flags. For more information, see this post.
Flags¶
Tip
You must either specify the path for each Confluent CLI confluent local command invocation, export the path as an environment variable for each terminal session, or set the path to your Confluent Platform installation in your shell profile. For example:
cat ~/.bash_profile
export CONFLUENT_HOME=<path-to-confluent>
export PATH="${CONFLUENT_HOME}/bin:$PATH"
| Name, shorthand | Default | Description |
|---|---|---|
| --path <path-to-confluent> | | Path to the Confluent Platform install directory. |
| -h, --help | | Print command information. |
| --value-format avro | | Specify the topic data as Avro data. |
| --property value.schema=<schema> | | Provide the schema for the Avro data. |
| --cloud | | Connect to Confluent Cloud using a configuration file. The configuration file is specified using --config. |
| --config <path-to-file> | | Specify the configuration file for --cloud. See the example configuration file. |
| --broker-list | localhost:9092 | Kafka brokers to connect to. |
Positional arguments¶
| Name, shorthand | Default | Description |
|---|---|---|
| <topicname> | | Kafka topic to produce messages to. |
| <path-to-confluent> | | The relative path to Confluent Platform. You can also define this as an environment variable named CONFLUENT_HOME. |
Examples¶
Tip
For reference on how to use the Confluent CLI to consume from a topic, see confluent local consume.
Produce Avro data to a topic called mytopic1 on a development Kafka cluster on localhost. This assumes Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flags.
confluent local produce mytopic1 -- --value-format avro --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
Produce non-Avro data to a topic called mytopic2 on a development Kafka cluster on localhost:
confluent local produce mytopic2
Examples for Confluent Cloud¶
Create a Confluent Cloud configuration file with information on connecting to your Confluent Cloud cluster. By default, the CLI expects this file at $HOME/.ccloud/config, but if you name it something else, e.g. /tmp/config, you can specify the file location with --config <filename>.
bootstrap.servers=<broker endpoint>
sasl.username=<api-key-id>
sasl.password=<secret-access-key>
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<username:password>
schema.registry.url=<sr endpoint>
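Before passing the file to --config, it can help to verify that every key the produce command relies on is present. A minimal sketch of such a check, assuming the sample layout above (the temp-file setup exists only to make the example self-contained):

```shell
# Sketch (not from the official docs): write a sample Cloud config to a temp
# file, then confirm each required key is defined before using --config.
CONFIG=$(mktemp)
cat > "$CONFIG" <<'EOF'
bootstrap.servers=<broker endpoint>
sasl.username=<api-key-id>
sasl.password=<secret-access-key>
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<username:password>
schema.registry.url=<sr endpoint>
EOF
missing=0
for key in bootstrap.servers sasl.username sasl.password \
           basic.auth.credentials.source schema.registry.url; do
  # grep -q returns non-zero when the key=... line is absent.
  grep -q "^${key}=" "$CONFIG" || { echo "missing: $key"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "config OK"
rm -f "$CONFIG"
```

In a real workflow you would point CONFIG at $HOME/.ccloud/config (or wherever you keep the file) instead of writing a sample.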
Produce non-Avro data to a topic called mytopic3 in Confluent Cloud. This assumes the topic has already been created. Note the double dash (--) between the topic name and your flags.
confluent local produce mytopic3 -- --cloud --config /tmp/config
Produce messages with keys and non-Avro values to a topic called mytopic4 in Confluent Cloud, using a user-specified Confluent Cloud configuration file at /tmp/config. This assumes the topic has already been created. Note the double dash (--) between the topic name and your flags.
confluent local produce mytopic4 -- --cloud --config /tmp/config --property parse.key=true --property key.separator=,
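With parse.key=true and key.separator=',' the producer splits each input line at the first separator into a record key and value. A sketch of how that split behaves on sample input lines (plain shell standing in for the producer, with hypothetical keys and values):

```shell
# Illustration only: mimic how lines are split at the first comma into
# key and value, the same shape the console producer expects on stdin.
printf '%s\n' 'alice,{"count":1}' 'bob,{"count":2}' |
while IFS=, read -r key value; do
  echo "key=$key value=$value"
done
# → key=alice value={"count":1}
# → key=bob value={"count":2}
```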
Produce Avro data to a topic called mytopic5 in Confluent Cloud. This assumes the topic has already been created and Confluent Schema Registry is listening at http://localhost:8081. Note the double dash (--) between the topic name and your flags.
confluent local produce mytopic5 -- --cloud --config /tmp/config --value-format avro \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
  --property schema.registry.url=http://localhost:8081
Produce Avro data to a topic called mytopic6 in Confluent Cloud. This assumes the topic has already been created and that you are using Confluent Cloud Schema Registry. Note the double dash (--) between the topic name and your flags.
confluent local produce mytopic6 -- --cloud --config /tmp/config --value-format avro \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}' \
  --property schema.registry.url=https://<SR ENDPOINT> \
  --property basic.auth.credentials.source=USER_INFO \
  --property schema.registry.basic.auth.user.info=<SR API KEY>:<SR API SECRET>
Tip
To easily try out the Confluent CLI functionality in your Confluent Cloud cluster, see the Confluent CLI demo.