Feature Request / Improvement
Currently, if the INSERT clause of a MERGE INTO statement specifies fewer columns than the target table has, the following exception is thrown:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot find column 'col_1' of the target table among the INSERT columns: col_2, col_3. INSERT clauses must provide values for all columns of the target table.
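For illustration, a minimal sketch of a statement that currently fails this way; the table and column names (`target`, `source`, `col_1`..`col_3`) are hypothetical:

```sql
-- Hypothetical tables: target(col_1, col_2, col_3) and a source with matching columns.
MERGE INTO target t
USING source s
ON t.col_2 = s.col_2
WHEN NOT MATCHED THEN
  -- col_1 is omitted from the INSERT column list, which currently
  -- raises the AnalysisException quoted above.
  INSERT (col_2, col_3) VALUES (s.col_2, s.col_3);
```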
For a wide table with 1000 columns, the user is forced to list every column and supply an explicit NULL for each one not being set, just to avoid this exception. Can we support partial inserts in the MERGE INTO command (defaulting unspecified columns to NULL) so developers can keep their SQL statements clean? A sketch of the current workaround next to the requested form follows the link below.
For example, the Delta Lake MERGE INTO command already supports this:
https://docs.databricks.com/sql/language-manual/delta-merge-into.html
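To make the request concrete, here is a sketch of today's workaround next to the requested partial-insert behavior, using the same hypothetical tables as above:

```sql
-- Today: the INSERT list must cover every target column, so columns
-- not being set need explicit NULLs. With 1000 columns this is unwieldy.
MERGE INTO target t
USING source s
ON t.col_2 = s.col_2
WHEN NOT MATCHED THEN
  INSERT (col_1, col_2, col_3) VALUES (NULL, s.col_2, s.col_3);

-- Requested: omit col_1 from the column list and have it default to
-- NULL, as the Delta Lake MERGE INTO command already allows.
MERGE INTO target t
USING source s
ON t.col_2 = s.col_2
WHEN NOT MATCHED THEN
  INSERT (col_2, col_3) VALUES (s.col_2, s.col_3);
```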
Query engine
Spark