Is this a new bug?

I have searched the existing issues, and I could not find an existing issue for this bug.
Current Behavior
When writing a dbt microbatch model without an alias on the ref:

```sql
-- no alias
select * from {{ ref('my_ref') }}
```

the code is compiled to:

```sql
-- no alias compiling
select * from (
    select * from "my_database"."my_schema"."my_ref"
    where event_time_col >= '2024-12-10' and event_time_col < '2024-12-11'
) _some_dbt_temp_alias_here
```
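Conceptually, the microbatch materialization replaces each `ref()` with an event-time-filtered subquery that already carries an alias of dbt's own. The sketch below is illustrative only (the function name, alias, and string templating are hypothetical, not dbt internals), but it shows why any alias the model author writes after the ref becomes a second alias on the same subquery:

```python
# Hypothetical sketch of microbatch ref wrapping; names are illustrative,
# not actual dbt internals.
def wrap_microbatch_ref(relation: str, event_time_col: str,
                        start: str, end: str) -> str:
    """Wrap a relation in the batch's event-time filter, adding dbt's own alias."""
    return (
        f"(select * from {relation} "
        f"where {event_time_col} >= '{start}' and {event_time_col} < '{end}') "
        f"_dbt_et_filter_subq"
    )

# The author's model SQL, with a user-supplied alias after the ref.
model_sql = "select * from {ref} as my_table"

compiled = model_sql.format(
    ref=wrap_microbatch_ref(
        '"my_database"."my_schema"."my_ref"',
        "event_time_col", "2024-12-10", "2024-12-11",
    )
)
print(compiled)
# ends with: ) _dbt_et_filter_subq as my_table  <- two aliases on one subquery
```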
The problem is that whenever the ref in a microbatch model is given an alias, it throws an error, because two aliases end up being applied:

```sql
-- with alias
select * from {{ ref('my_ref') }} as my_table
```

the code is compiled to:

```sql
-- with alias compiling
-- this model will throw an error saying: "syntax error at or near 'as'"
-- or "syntax error at or near 'my_table'" if 'as' is not used
select * from (
    select * from "my_database"."my_schema"."my_ref"
    where event_time_col >= '2024-12-10' and event_time_col < '2024-12-11'
) _some_dbt_temp_alias_here as my_table
```
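The double-alias failure can be reproduced outside of dbt: a derived table in SQL may carry only one alias. This minimal demo uses Python's stdlib sqlite3 (the same grammar rule applies in PostgreSQL, which raised the errors above); the alias names here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# One alias on the subquery: fine.
ok = conn.execute(
    "select * from (select 1 as x) _dbt_tmp_alias"
).fetchall()
print(ok)  # [(1,)]

# Two aliases, as in the compiled microbatch SQL: syntax error.
err = None
try:
    conn.execute(
        "select * from (select 1 as x) _dbt_tmp_alias as my_table"
    )
except sqlite3.OperationalError as e:
    err = str(e)
print("error:", err)

# The workaround wraps the relation in one more subquery, so each
# level carries exactly one alias: fine again.
ok2 = conn.execute(
    "select * from (select * from (select 1 as x) _dbt_tmp_alias) as my_table"
).fetchall()
print(ok2)  # [(1,)]
```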
A workaround I found to bypass this is wrapping the ref in a subquery, so the user alias applies to the subquery rather than to dbt's already-aliased relation:

```sql
-- with alias workaround
select * from (select * from {{ ref('my_ref') }}) as my_table
```

A little ugly, but it works.

Expected Behavior

When using an alias with a microbatch reference, this syntax should not throw an error:

```sql
select * from {{ ref('my_ref') }} as my_table
```

Steps To Reproduce
1. Git clone this repo

```shell
git clone https://github.com/canutog/dbt-postgres-microbatch-issue.git
cd dbt-postgres-microbatch-issue
```
2. Run astro dev start on the terminal

```shell
astro dev start
```
This will start an astro-cli project with Postgres and Airflow. The host (172.18.0.2) set in airflow_settings.yaml and dbt/profiles.yaml is the default when you don't have any other docker containers running.
If you do, you might need to change those settings.
To do that:
```shell
docker network ls
# then grab the desired network and inspect it using
docker network inspect dbt-postgres-microbatch-issue_<some_id>_airflow
```
If you ever need to change settings, run this to rebuild your project:

```shell
astro dev restart
```
3. Run dbt through Airflow or the terminal
Using the Airflow UI

Navigate to localhost:8080 and log in with user admin and password admin.
Run the microbatch_issue or microbatch_issue_full_refresh dag.
Check the logs from the dbt_run_tree_routes_with_alias_error task.
You might also want to check the logs from the tree_routes_no_error task, which shows the workaround working.
Using the terminal

Run this (PowerShell) to set the DBT_HOST var in your env:

```shell
$Env:DBT_HOST='localhost'
```
then:

```shell
cd dbt
dbt run --debug
```
4. Check the errors
21:28:43 Unhandled error while executing
Exception on worker thread. Database Error
syntax error at or near "AS"
LINE 35: ...:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree AS nodes
Relevant log output
Airflow task tree_routes_with_alias_error log:
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_with_alias_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_with_alias_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - create table "*****"."*****"."tree_routes_with_alias_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - as
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - -- depends_on: (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-01 00:00:00+00:00' and date < '2024-12-02 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - WITH RECURSIVE tree_paths AS (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - NodeID::VARCHAR AS route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - Value::VARCHAR AS full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - *
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM (select * from (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-01 00:00:00+00:00' and date < '2024-12-02 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree) as nodes
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - WHERE ParentID IS NULL
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - UNION ALL
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - CONCAT(tree_paths.route, '->', nodes.NodeID::VARCHAR) AS route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - CONCAT(tree_paths.full_path, '->', nodes.Value) AS full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.NodeID,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.ParentID,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.Value,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM tree_paths
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - JOIN (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-01 00:00:00+00:00' and date < '2024-12-02 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree nodes
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ON tree_paths.NodeID = nodes.ParentID
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - and tree_paths.date = nodes.date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - )
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM tree_paths
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ORDER BY route
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - );
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Postgres adapter: Postgres error: syntax error at or near "nodes"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - LINE 35: ... 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree nodes
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ^
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_with_alias_error: ROLLBACK
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Unhandled error while executing
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - Exception on worker thread. Database Error
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - syntax error at or near "nodes"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - LINE 35: ... 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree nodes
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ^
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_with_alias_error: Close
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Batch 1 of 11 ERROR creating batch 2024-12-01 of *****.tree_routes_with_alias_error [ERROR in 0.09s]
Airflow task tree_routes_no_error log:
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Began executing node model.issue.tree_routes_no_error
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_no_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - create temporary table "tree_routes_no_error__dbt_tmp_20241210210418308074"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - as
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - -- depends_on: (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-10 00:00:00+00:00' and date < '2024-12-11 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - WITH RECURSIVE tree_paths AS (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - NodeID::VARCHAR AS route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - Value::VARCHAR AS full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - *
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-10 00:00:00+00:00' and date < '2024-12-11 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - WHERE ParentID IS NULL
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - UNION ALL
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - CONCAT(tree_paths.route, '->', nodes.NodeID::VARCHAR) AS route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - CONCAT(tree_paths.full_path, '->', nodes.Value) AS full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.NodeID,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.ParentID,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.Value,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - nodes.date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM tree_paths
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - JOIN (select * from (select * from "*****"."*****"."stg_encoded_tree" where date >= '2024-12-10 00:00:00+00:00' and date < '2024-12-11 00:00:00+00:00') _dbt_et_filter_subq_stg_encoded_tree) AS nodes
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ON tree_paths.NodeID = nodes.ParentID
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - and tree_paths.date = nodes.date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - )
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - SELECT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - route,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - full_path,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - FROM tree_paths
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ORDER BY route
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - );
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Opening a new connection, currently in state closed
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: SELECT 0 in 0.007 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: BEGIN
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: BEGIN in 0.000 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_no_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - select
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - column_name,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - data_type,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - character_maximum_length,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_precision,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_scale
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - from INFORMATION_SCHEMA.columns
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - where table_name = 'tree_routes_no_error__dbt_tmp_20241210210418308074'
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - order by ordinal_position
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: SELECT 3 in 0.006 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_no_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - select
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - column_name,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - data_type,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - character_maximum_length,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_precision,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_scale
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - from "*****".INFORMATION_SCHEMA.columns
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - where table_name = 'tree_routes_no_error'
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - and table_schema = '*****'
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - order by ordinal_position
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: SELECT 3 in 0.003 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_no_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - select
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - column_name,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - data_type,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - character_maximum_length,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_precision,
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - numeric_scale
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - from "*****".INFORMATION_SCHEMA.columns
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - where table_name = 'tree_routes_no_error'
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - and table_schema = '*****'
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - order by ordinal_position
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: SELECT 3 in 0.002 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Writing runtime sql for node "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: /* {"app": "dbt", "dbt_version": "1.9.0", "profile_name": "issue", "target_name": "dev", "node_id": "model.issue.tree_routes_no_error"} */
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - -- back compat for old kwarg name
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - merge into "*****"."*****"."tree_routes_no_error" as DBT_INTERNAL_DEST
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - using "tree_routes_no_error__dbt_tmp_20241210210418308074" as DBT_INTERNAL_SOURCE
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - on (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - DBT_INTERNAL_SOURCE.route = DBT_INTERNAL_DEST.route
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ) and (
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - DBT_INTERNAL_SOURCE.date = DBT_INTERNAL_DEST.date
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - )
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - when matched then update set
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - "route" = DBT_INTERNAL_SOURCE."route","full_path" = DBT_INTERNAL_SOURCE."full_path","date" = DBT_INTERNAL_SOURCE."date"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - when not matched then insert
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ("route", "full_path", "date")
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - values
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - ("route", "full_path", "date")
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO -
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: MERGE 0 in 0.001 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: COMMIT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Using ***** connection "model.issue.tree_routes_no_error"
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: COMMIT
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 SQL status: COMMIT in 0.001 seconds
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 On model.issue.tree_routes_no_error: Close
[2024-12-11, 18:04:18 -03] {subprocess.py:106} INFO - 21:04:18 Batch 1 of 2 OK created batch 2024-12-10 of *****.tree_routes_no_error ............ [MERGE 0 in 0.18s]
Environment
- OS: Windows 11 22H2 OS build 22621.4460
- Python: 3.12
- dbt-core: 1.9.0
- dbt-postgres: 1.9.0
Additional Context
No response