Describe the feature
Currently, the dbt-spark Docker image is configured to include only the Apache Hudi jars and Hudi Spark extensions.
It would be valuable to add support for Delta Lake and Iceberg so the community can use those formats as well.
Proposed solution: add the Delta and Iceberg jars and Spark extensions to the image's Spark configuration, and make sure the tests pass against them.
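A minimal sketch of the Spark configuration change this would involve, e.g. in `spark-defaults.conf`. The package coordinates, versions, catalog name, and warehouse path below are illustrative assumptions, not the final choices:

```properties
# Illustrative additions to spark-defaults.conf (coordinates/versions are assumptions)

# Pull Delta and Iceberg jars at startup
spark.jars.packages                  io.delta:delta-core_2.12:1.0.0,org.apache.iceberg:iceberg-spark-runtime-3.1_2.12:0.13.2

# Register both SQL extensions alongside any existing Hudi extension
spark.sql.extensions                 io.delta.sql.DeltaSparkSessionExtension,org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

# Delta takes over the default session catalog
spark.sql.catalog.spark_catalog      org.apache.spark.sql.delta.catalog.DeltaCatalog

# Iceberg gets its own named catalog (hypothetical name and local warehouse path)
spark.sql.catalog.iceberg            org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.iceberg.type       hadoop
spark.sql.catalog.iceberg.warehouse  /tmp/iceberg-warehouse
```

Exact coordinates would need to match the Spark and Scala versions baked into the image, and the catalog layout may need adjusting so Delta, Iceberg, and Hudi can coexist in the same session.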
Who will this benefit?
Users who wish to use Delta and Iceberg without having to add the binaries and extensions separately themselves.
Are you interested in contributing this feature?
I can add a PR to resolve this.
Thanks for opening @nssalian! Re-labeling as tech_debt since the core functionality for OSS Delta + Hudi support is in place (with Iceberg available via extension, as described in #294). We want the ability to actually test that functionality in CI, which requires a properly configured Docker image.
Fleid changed the title from "[CT-797] Add Iceberg, delta support onto dbt-spark" to "[CT-797] Add Iceberg, delta support onto dbt-spark (docker images)" on Dec 9, 2022.
This issue has been marked as Stale because it has been open for 180 days with no activity. If you would like the issue to remain open, please remove the stale label or comment on the issue, or it will be closed in 7 days.