Releases · dbt-labs/dbt-external-tables
dbt-external-tables v0.8.3
What's Changed
- (Spark) Correct handling of partitions in Spark and add a sample source by @pgoslatara in #161
- Add aws context by @dave-connors-3 in #179
- Use CircleCI contexts for environment variables by @dbeatty10 in #180
- Create schema for external tables when it does not exist, by @guillesd in #167
New Contributors
- @pgoslatara made their first contribution in #161
- @dave-connors-3 made their first contribution in #179
- @dbeatty10 made their first contribution in #180
- @guillesd made their first contribution in #167
Full Changelog: 0.8.2...0.8.3
dbt-external-tables v0.8.2
Fixes
- (BigQuery) Fix `column_quoted` set outside the columns loop (#166)
Full Changelog: 0.8.1...0.8.2
dbt-external-tables v0.8.1
Features
- (Snowflake) Add support for Delta lake external tables (#138)
- (Snowflake) Add configurable error integration for snowpipe (#154)
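For illustration, a snowpipe source using the new error integration might be configured as below. This is a sketch, not taken from the release notes: the source, table, stage, and integration names are all placeholders.

```yaml
sources:
  - name: snowpipe_example        # placeholder source name
    tables:
      - name: raw_events          # placeholder table name
        external:
          location: "@my_stage"   # assumed Snowflake stage
          file_format: "( type = json )"
          snowpipe:
            auto_ingest: true
            # Added in #154: route load errors to a Snowflake
            # notification integration (name is a placeholder)
            error_integration: my_error_integration
        columns:
          - name: id
            data_type: integer
```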
Fixes
- (Snowflake) Fix column quoting (#135)
- (BigQuery) Fix column quoting (#148)
- (Spark) Fix `recover_partitions` macro (#163)
Contributors
- @aPeterHeise (#135, #138)
- @scarrucciu (#148)
- @NiallRees (#154)
- @lassebenni (#163)
dbt-external-tables v0.8.0
This release supports any minor and patch version of dbt-core v1, which means far less need for compatibility releases in the future.
Features
- (Snowflake) Support for regex `pattern` in snowpipes (#111, #122)
- (Apache Spark) Real support for partitioned external tables. Previously, external sources with `partitions` defined were implicitly skipped. Going forward, sources with partitions defined (excluding those with `using: delta`) will run `alter table ... recover partitions` (#116)
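As a sketch of the new partition behavior, a partitioned Spark source might look like the following (all names and paths below are placeholders, not from the release notes):

```yaml
sources:
  - name: spark_example           # placeholder source name
    tables:
      - name: daily_events        # placeholder table name
        external:
          location: "/mnt/data/events"   # assumed storage path
          using: parquet          # not delta, so partition recovery applies
          partitions:
            - name: event_date
              data_type: date
```

With `partitions` defined and a non-delta format, staging this source now runs `alter table ... recover partitions` instead of being skipped.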
Under the hood
- Use standard logging, thereby removing the dependency on `dbt_utils` (#119)
- Remove `synapse__`-prefixed "passthrough" macros, now that `dbt-synapse` can use `sqlserver__`-prefixed macros instead (#110)
Contributors
- @JCZuurmond (#116)
- @stumelius (#111)
- @swanderz (#110)
dbt-external-tables v0.7.3
Fixes
- Hard code printer width for backwards compatibility with older versions of dbt Core (#120)
dbt-external-tables v0.7.2
🚨 This is a compatibility release in preparation for dbt-core v1.0.0 (🎉). Projects using this version with dbt-core v1.0.x can expect to see a deprecation warning. This will be resolved in the next minor release.
Fixes
- (BigQuery) Fix creating external tables with multiple partitions, by including a missing comma (#114)
- (Snowflake) Fix `auto_refresh` when not specified / `False` (#117)
dbt-external-tables v0.7.1
Under the hood
- Explicitly declare `auto_refresh` so that Snowflake-GCS integrations work (#104)
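A sketch of what that explicit declaration looks like for a Snowflake external table over GCS (the source, stage, and integration names are placeholders, not from the release notes):

```yaml
sources:
  - name: gcs_example             # placeholder source name
    tables:
      - name: my_table            # placeholder table name
        external:
          location: "@my_gcs_stage"          # assumed GCS-backed stage
          file_format: "( type = parquet )"
          integration: my_gcs_integration    # assumed notification integration
          auto_refresh: true      # must be declared explicitly (#104)
```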
dbt-external-tables v0.7.0
🚨 Breaking changes
- This package now requires dbt v0.20.0 and dbt-utils v0.7.0. dbt v0.20.0rc2 is currently available as a release candidate. If you are not ready to upgrade, consider using a previous version of this package.
- This package depends on `dbt-labs/dbt_utils`. If the latest version of another installed package depends on `fishtown-analytics/dbt_utils`, you'll need to wait to upgrade. See discourse for details.
Under the hood
- Use new `adapter.dispatch` syntax (#99)
dbt-external-tables v0.6.4
This is a bugfix release.
Fixes
- Snowflake: Explicitly wrap DML statements in `begin` + `commit`; revert previous attempt at transactional logic (#98 @jasonzondor, #101, #103)
- BigQuery: Wrap `options` values in single quotes (#93 @davesgonechina)
dbt-external-tables v0.6.3
This is a bugfix release.
Fixes
- Avoid numeric type coercion for Redshift partition values (#70)
- Aggressively commit transactions on Snowflake to avoid fruitless DML (#47)
- Handle JSON `'null'` values in Snowflake columns (#76)
- Add `integration` parameter for Snowflake external tables on Azure (#87)
- Fix whitespace for Spark external tables with no columns specified (#92)