Releases: dbt-labs/dbt-external-tables
dbt-external-tables 0.6.2
This is a patch release that adds support for dbt v0.19.0, with backwards compatibility for v0.18.x.
Fixes
- Use `response` (v0.19) or `status` (v0.18) for CLI logging (#64)
- Quote columns with `quote: true` on Snowflake (#67)
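The quoting fix applies to column definitions like the following sketch of an external source spec (source, stage, and column names here are hypothetical):

```yaml
# Illustrative source spec: `quote: true` asks the adapter to quote the
# column name in generated DDL, now honored on Snowflake (#67).
sources:
  - name: raw_events            # hypothetical source name
    tables:
      - name: event
        external:
          location: "@my_stage/events"        # hypothetical stage
          file_format: "(type = parquet)"
        columns:
          - name: "Event-Type"   # mixed case / special characters need quoting
            data_type: varchar
            quote: true
```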
Quality of life
- Print a log line if no external sources are selected by `--args '{"select": "..."}'` (#68)
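Selection is passed through `dbt run-operation` arguments; a typical invocation looks like the following (the source name is hypothetical):

```
# Stage only the selected source; as of #68, a log line is printed
# if the selection matches no external sources.
dbt run-operation stage_external_sources --args '{"select": "raw_events"}'
```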
dbt-external-tables 0.6.1
dbt-external-tables 0.6.0
This is a significant release with no known breaking changes.
Features
Fixes
- Generating Redshift partition paths with 3+ partition columns
Quality of life
Contributors
Thanks to those who submitted code changes for this package:
And to everyone else who proposed, tested, or otherwise quality-checked the changes in this release: thank you!
dbt-external-tables 0.5.0
🚨 Breaking change — this package now requires dbt v0.18.0
Quality of life:
- Use 0.18.0 functionality, in particular the `adapter.dispatch` macro (#40)
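`adapter.dispatch` resolves an adapter-specific implementation of a macro at runtime, replacing older hand-rolled dispatch patterns. A minimal sketch of the pattern as it worked in dbt 0.18 (macro names here are illustrative, not the package's actual macros):

```sql
{% macro create_external_table(source_node) %}
    {# dbt 0.18 dispatch: looks for <adapter>__create_external_table #}
    {{ adapter.dispatch('create_external_table', packages=['dbt_external_tables'])(source_node) }}
{% endmacro %}

{% macro default__create_external_table(source_node) %}
    {# fallback implementation #}
{% endmacro %}

{% macro snowflake__create_external_table(source_node) %}
    {# Snowflake-specific DDL goes here #}
{% endmacro %}
```

On Snowflake, the call resolves to `snowflake__create_external_table`; adapters without a specific implementation fall back to `default__`.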
dbt-external-tables 0.4.0
🚨 There is a breaking change in this release — the lower bound of dbt-utils is now `0.4.0`.
This won't affect most users, since a version of dbt-utils in this range is required to achieve 0.17.0 compatibility.
Quality of life:
- Change dbt-utils dependencies to `[>=0.4.0, <0.6.0]` (#37)
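Inside this package's `packages.yml`, the new bound takes roughly this shape (a sketch; the dbt-utils package was published under the fishtown-analytics namespace at the time):

```yaml
packages:
  - package: fishtown-analytics/dbt_utils
    version: [">=0.4.0", "<0.6.0"]   # new lower bound for 0.17.0 compatibility
```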
dbt-external-tables 0.3.2
dbt-external-tables 0.3.1
dbt-external-tables 0.3.0
This is a minor release that updates to `config-version: 2` of `dbt_project.yml`.
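Upgrading a consuming project means setting the new key at the top of `dbt_project.yml`; a minimal sketch (the project name is hypothetical):

```yaml
name: my_project          # hypothetical project name
version: '1.0.0'
config-version: 2         # required by dbt >= 0.17.0
```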
Breaking changes
- Requires dbt >= v0.17.0
- Access source information from the new `graph.sources` object, instead of `graph.nodes`
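In dbt >= 0.17.0, sources live in their own `graph.sources` dict rather than being mixed into `graph.nodes`. Iterating them in a macro looks roughly like this sketch (the macro name and `external` check are illustrative):

```sql
{% macro list_external_sources() %}
    {# previously: graph.nodes, filtered on resource_type == 'source' #}
    {% for node in graph.sources.values() %}
        {% if node.external %}
            {{ log(node.schema ~ '.' ~ node.identifier, info=true) }}
        {% endif %}
    {% endfor %}
{% endmacro %}
```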
dbt-external-tables 0.2.0
This is a minor release that significantly changes the behavior of the `stage_external_sources` macro to improve its flexibility and performance. Several of the features described below reflect breaking changes compared with the 0.1.x versions of this package.
Features
- On Snowflake, add support for staging external sources via snowpipes.
  - If `external.snowpipe` is configured, the `stage_external_sources` macro will create an empty table with all specified `columns` plus metadata fields, run a full historical backfill via `copy` statement, and finally create a pipe that wraps the same `copy` statement.
  - If no `columns` are specified, the pipe will instead pull all data into a single variant column. (This behavior does not work for CSV files.)
- During standard runs, the `stage_external_sources` macro will attempt to "partially refresh" external assets. It will not drop, replace, or change any existing external tables or pipe targets. To fully rebuild all external assets, supply the CLI variable `ext_full_refresh: true`.
- The `stage_external_sources` macro now accepts a `select` argument to stage only specific nodes, using similar syntax to `dbt source snapshot-freshness`.
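A snowpipe-enabled source spec follows roughly this shape (the names, location, file format, and option values below are hypothetical; consult the package README for the exact keys supported):

```yaml
sources:
  - name: snowplow                  # hypothetical source name
    tables:
      - name: event
        external:
          location: "@raw.snowplow.stage"             # existing Snowflake stage
          file_format: "{{ target.schema }}.my_json"  # named file format
          snowpipe:
            auto_ingest: true       # pipe picks up newly landed files
        columns:                    # omit this block to load one variant column
          - name: collector_tstamp
            data_type: timestamp
```

To force a full rebuild of all external assets rather than a partial refresh, the release notes above describe passing the CLI variable, e.g. `dbt run-operation stage_external_sources --vars 'ext_full_refresh: true'`.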
Breaking changes
- Properties of the `external` source config which previously accepted string or boolean values, such as `auto_refresh`, now expect boolean values, so that `refresh_external_table` can infer whether a table should be refreshed or ignored during partial refresh runs.
- The `refresh_external_table` macro now returns a Jinja list (`[]`) instead of a string.
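Concretely, `auto_refresh` should now be written as a YAML boolean rather than a string (a sketch; the stage location is hypothetical):

```yaml
external:
  location: "@my_stage"     # hypothetical stage
  auto_refresh: true        # boolean true, not the string "true"
```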
Quality of life
- Improved info logging of the DDL/DML that the `stage_external_sources` macro is running. This is intended to provide more visibility into multistep operations, such as staging snowpipes.
- Add a sample analysis that, when copied to the root project's `analysis` folder, will print out a "dry run" version of all the SQL which `stage_external_sources` would run as an operation.
- Add GitHub templates and codeowners
dbt-external-tables 0.1.3
Fixes
- The `stage_external_sources` macro logs the source it is staging as `{schema}.{identifier}` (#17)