Notable Limitations in Public Preview¶
Confluent Cloud for Apache Flink®️ supports many features of open-source Flink during the Preview. Some features are limited, and others are not supported. This topic describes the status of Flink SQL in the Preview.
Confluent Cloud for Apache Flink®️ is currently available for Preview. A Preview feature is a Confluent Cloud component that is being introduced to gain early feedback from developers. Preview features can be used for evaluation and non-production testing purposes or to provide feedback to Confluent. The warranty, SLA, and Support Services provisions of your agreement with Confluent do not apply to Preview features. Confluent may discontinue providing Preview releases of the Preview features at any time in Confluent’s sole discretion. Check out Getting Help for questions, feedback and requests.
Supported features in Preview¶
The following features are supported in the Preview for Flink SQL in Confluent Cloud.
- CREATE TABLE (without the AS, PARTITION BY, and LIKE keywords)
- ALTER TABLE (only for ADD/MODIFY WATERMARK; ADD COLUMN, DROP COLUMN, and other alterations aren’t supported)
- DESCRIBE EXTENDED
- INSERT INTO (persistent queries)
- EXECUTE STATEMENT SET
- SHOW CATALOG / DATABASE / TABLE
- USE / USE CATALOG
- SHOW CREATE TABLE
- Regular Joins
- Interval Joins
- Temporal Table Join between a non-compacted and compacted Kafka topic
- Star Schema Denormalization (N-Way Join), as long as temporary tables are not used
- Lateral Table Join, as long as temporary views are not used
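To illustrate several of these supported features together, the following sketch creates two tables with watermark declarations and combines them with an interval join. All table, topic, and column names are hypothetical, and the `INSERT INTO` assumes a `shipped_orders` table already exists; adapt the example to your own topics and schemas.

```sql
-- Hypothetical tables; each CREATE TABLE maps to a Kafka topic.
CREATE TABLE orders (
  order_id STRING,
  customer_id STRING,
  order_time TIMESTAMP(3),
  WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
);

CREATE TABLE shipments (
  order_id STRING,
  ship_time TIMESTAMP(3),
  WATERMARK FOR ship_time AS ship_time - INTERVAL '5' SECOND
);

-- Interval join: match each order with a shipment that occurs
-- within 24 hours of the order, then persist the result.
INSERT INTO shipped_orders
SELECT o.order_id, o.customer_id, s.ship_time
FROM orders o, shipments s
WHERE o.order_id = s.order_id
  AND s.ship_time BETWEEN o.order_time
                      AND o.order_time + INTERVAL '24' HOUR;
```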
Flink SQL is supported on AWS in these regions:
We will soon be adding support for the following regions:
Limitations for Preview¶
The following limitations apply to the Preview for Flink SQL in Confluent Cloud.
- Exactly-once semantics are not guaranteed during the Preview. In some cases, duplicate or missing results may appear to be a bug in your query. The mitigation is to retry the query.
- Flink SQL does not support some Avro data types, for example, enums, union types, timestamp with time zone, and namespaced records. For more information, see Avro known limitations.
- Schema Registry must be enabled in your environment, and queries on schemaless topics are not allowed during Preview.
- The supported schema evolution mode is FULL_TRANSITIVE. Be aware that the Confluent Schema Registry default compatibility type is BACKWARD.
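Because the Schema Registry default is BACKWARD, you may need to change a subject's compatibility mode to FULL_TRANSITIVE explicitly. One way is the Schema Registry REST API's per-subject config endpoint; in this sketch, the endpoint host, API key/secret, and subject name (`orders-value`) are all placeholders.

```shell
# Set one subject's compatibility mode to FULL_TRANSITIVE.
# <sr-endpoint> and <api-key>:<api-secret> are placeholders.
curl -X PUT \
  -u "<api-key>:<api-secret>" \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "FULL_TRANSITIVE"}' \
  "https://<sr-endpoint>/config/orders-value"
```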
- The result size limit for SHOW commands, for example, SHOW JOBS, is 350,000 bytes. If the result size is larger than this limit, the result is truncated.
- Any ENUM field is mapped to a STRING. You can't create a table with an ENUM field from Flink SQL; you must define the field in Schema Registry.
- If you need to run cross-environment queries, the EnvironmentAdmin role in the Flink environment doesn't grant cross-environment access for Kafka and Schema Registry, so you must get additional grants for data access roles in the other environments.
- The key.fields option is not supported. To declare key columns, use the PARTITIONED BY clause instead.
- Time attribute columns have these limitations:
  - Must be top-level columns; nested fields aren't supported.
  - Must have a TIMESTAMP type with a precision of 3 or less.
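A sketch of a time attribute declaration that satisfies these constraints: the event-time column is top-level and has type TIMESTAMP(3). Table and column names are hypothetical; a nested field such as `payload.ts` could not be used as the time attribute.

```sql
CREATE TABLE clicks (
  user_id STRING,
  url STRING,
  event_time TIMESTAMP(3),  -- top-level column, precision 3
  WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
);
```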
- Only public networking is supported for processing data in your Kafka clusters. Private networking is currently unsupported; support for it is in active development.
- User-defined functions (UDFs) are not supported.
- DROP TABLE is not supported. To delete a table, delete the underlying Kafka topic from Confluent Cloud. For more information see Delete a topic.
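Because DROP TABLE is unavailable, removing a table means deleting its backing topic. One way to do this is with the Confluent CLI, assuming you are logged in and have a cluster selected; the topic name here is a placeholder.

```shell
# Deleting the underlying Kafka topic removes the corresponding Flink table.
# "orders" is a placeholder topic name.
confluent kafka topic delete orders
```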
- Renaming tables is not supported.
- VIEWs are not supported.
- Encapsulating logic with temporary views is not supported.
- CREATE TABLE LIKE and CREATE TABLE AS are not supported.
- CREATE / DROP / ALTER Catalog / Database commands are not supported.
- Temporary tables are not supported.
- Schema Contexts are not supported.
- GROUP BY windows are not supported, which means that SESSION windows are also not supported.
- UNION is not supported for the AVRO format.
The following statements are not supported by Confluent Cloud for Apache Flink during the Preview.
- ADD JAR / REMOVE JAR / SHOW JARS
- ALTER FUNCTION
- ANALYZE TABLE
- CREATE [TEMPORARY] [SYSTEM] FUNCTION
- CREATE CATALOG
- CREATE TABLE AS
- DROP [TEMPORARY] [SYSTEM] FUNCTION
- DROP CATALOG
- DROP DATABASE
- EXECUTE PLAN
- LOAD MODULE / UNLOAD MODULE / USE MODULE / SHOW MODULES