Monitor Confluent Platform topics

This page describes how to monitor all Kafka topics in your connected Confluent Platform cluster from a centralized view in the Confluent Cloud Console.

An Apache Kafka® topic is a category or feed that stores messages. Producers write data to topics, and consumers read data from topics. Topics are grouped by cluster within an environment. The Topics page lets you monitor the activity and high-level details of all your topics at a glance.

Note

From the Confluent Cloud Console, you can only monitor Confluent Platform topics. To create, delete, or manage these topics, you must use the Confluent Control Center for your self-managed cluster.
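For readers newer to Kafka, the following is a minimal sketch of the produce and consume flow described above, using the confluent-kafka Python client. The broker address (localhost:9092), the topic name (orders), and the consumer group ID are placeholder assumptions, not values from your cluster.

```python
from confluent_kafka import Producer, Consumer

conf = {"bootstrap.servers": "localhost:9092"}  # assumed broker address

# Producers write messages to a topic.
producer = Producer(conf)
producer.produce("orders", key="order-1", value='{"amount": 42.0}')
producer.flush()

# Consumers read messages from a topic, typically as part of a consumer group.
consumer = Consumer({**conf, "group.id": "orders-readers", "auto.offset.reset": "earliest"})
consumer.subscribe(["orders"])
msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```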

View the topics list

  1. In the Confluent Cloud Console, navigate to Environments and select your environment.
  2. In the navigation menu, click Clusters and select a Confluent Platform cluster.
  3. In the navigation menu, click Topics.
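If you also want to inspect the same cluster from a script, the following is a minimal sketch that lists topics and their partition counts with the confluent-kafka AdminClient. The broker address is an assumption; the Console steps above remain the supported way to browse topics visually.

```python
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumed broker address

# ClusterMetadata.topics maps each topic name to its TopicMetadata.
metadata = admin.list_topics(timeout=10)
for name, topic in sorted(metadata.topics.items()):
    print(f"{name}: {len(topic.partitions)} partitions")
```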

Understand the topics list

The Topics page provides a centralized view of all topics in the selected cluster. The list includes the following fields for each topic (a programmatic sketch that retrieves comparable per-partition details appears after the list):

  • Topic name: The unique identifier for the topic.
  • Partitions: The number of partitions configured for the topic.
  • Production: The current rate of data written to the topic (bytes/sec).
  • Consumption: The current rate of data read from the topic (bytes/sec).
  • Retained bytes: The total size of data stored in the topic’s logs.
  • Consumer: The number of active consumer groups that read from the topic.
  • Data contract: A link to view the schema and data rules associated with the topic.
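The throughput and retention figures above come from the Console's built-in monitoring. As a rough programmatic analogue, the sketch below reports per-partition low and high watermarks for a topic, which yields retained message counts rather than bytes. The broker address, consumer group ID, and topic name are assumptions.

```python
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "watermark-probe",          # throwaway group ID for metadata queries
})

topic = "orders"  # assumed topic name
metadata = consumer.list_topics(topic, timeout=10)
for partition_id in metadata.topics[topic].partitions:
    low, high = consumer.get_watermark_offsets(TopicPartition(topic, partition_id), timeout=10)
    print(f"partition {partition_id}: offsets {low}..{high} ({high - low} retained messages)")

consumer.close()
```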

View individual topic details

To inspect the metrics, data contract, and configuration for a specific topic, select it from the topics list. This opens a detailed view with three tabs: Monitor, Data Contract, and Configuration.

Monitor tab

The Monitor tab provides a real-time summary of the message flow for the selected topic.

  • Production: The total rate of data produced (written) to this topic.
  • Consumption: The total rate of data consumed (read) from this topic.

The sidebar provides additional context, such as the topic’s retention time, partition count, and cleanup policy. You can also add metadata such as a description, tags, and owners to the topic’s entry in the Confluent Cloud Console to help with organization.

Data contract tab

The Data Contract tab shows the data contract for this topic. A data contract defines the schema and rules for the data published to the topic, which helps ensure data quality and compatibility. The tab displays the schema, references, metadata, and any rules applied to the data.

From the Data Contract tab, you can also explore Stream Lineage, which provides a visual representation of the topic’s data flow. Clicking any element in the lineage diagram provides additional details about that component. For more information, see Track Data with Stream Lineage on Confluent Cloud.

If no data contract is present, click Create Data Contract to add one. For instructions, see Create a topic schema.
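As a hedged, programmatic alternative to the Console workflow, the sketch below registers an Avro value schema for a hypothetical orders topic through the confluent-kafka Schema Registry client. The Schema Registry URL, the example schema, and the orders-value subject (the common <topic>-value naming strategy) are all assumptions.

```python
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

sr = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed Schema Registry endpoint

# Illustrative Avro schema for the hypothetical "orders" topic.
schema_str = """
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
"""

schema_id = sr.register_schema("orders-value", Schema(schema_str, schema_type="AVRO"))
print(f"Registered schema id: {schema_id}")
```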

The Data Contract tab displays the following specifications:

  • Schema: The JSON schema of the topic’s messages, which defines the structure and data types.
  • References: A list of any external schema references.
  • Metadata: Additional metadata about the schema.
  • Rules: Any rules or validations applied to the data.
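To inspect the same specifications from a script, the following is a minimal sketch that fetches the latest registered schema for a topic's value subject with the confluent-kafka Schema Registry client. The Schema Registry URL and the orders-value subject are assumptions.

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

sr = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed Schema Registry endpoint

# Value schemas are conventionally registered under the "<topic>-value" subject.
latest = sr.get_latest_version("orders-value")
print("Schema ID: ", latest.schema_id)
print("Version:   ", latest.version)
print("Schema:    ", latest.schema.schema_str)
print("References:", latest.schema.references)
```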

Use the following steps to view a topic's data contract.

  1. In the Confluent Cloud Console, navigate to Environments and select your environment.

  2. In the navigation menu, click Clusters and select a Confluent Platform cluster.

  3. In the navigation menu, click Topics.

  4. Select a topic from the list of topics.

  5. Click View Data Contract.

    The Data Contract tab displays the schema and rules for the topic.

Configuration tab

The Configuration tab displays the topic’s configuration parameters. These settings are read-only in the Confluent Cloud Console.

  • General settings: Shows the topic name and number of partitions.
  • Show full config: Click to view a detailed list of all topic-level configuration settings, such as cleanup.policy and retention.ms.
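For a scripted look at the same settings, the sketch below reads a topic's full configuration with the confluent-kafka AdminClient and prints a few of the entries surfaced under Show full config. The broker address and topic name are assumptions.

```python
from confluent_kafka.admin import AdminClient, ConfigResource

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumed broker address

resource = ConfigResource(ConfigResource.Type.TOPIC, "orders")  # assumed topic name
for res, future in admin.describe_configs([resource]).items():
    config = future.result()  # dict of config name -> ConfigEntry
    for key in ("cleanup.policy", "retention.ms", "max.message.bytes"):
        entry = config.get(key)
        if entry is not None:
            print(f"{key} = {entry.value}")
```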