
[Bug] Limitation of dbt unit tests: introspective queries (run_query, etc.) are not supported #11121

Closed
ashwini-kumar2 opened this issue Dec 10, 2024 · 5 comments
Labels
  • bug: Something isn't working
  • duplicate: This issue or pull request already exists
  • unit tests: Issues related to built-in dbt unit testing functionality
  • user docs [docs.getdbt.com]: Needs better documentation

Comments

@ashwini-kumar2

ashwini-kumar2 commented Dec 10, 2024

Is this a new bug in dbt-core?

  • I believe this is a new bug in dbt-core
  • I have searched the existing issues, and I could not find an existing issue for this bug

Current Behavior

dbt unit tests do not support the use of source('some_schema', 'some_table') inside a variable.

Command to execute the dbt unit test:
dbt test --select test_name

Expected Behavior

dbt tries to query a CTE that does not exist; ideally, dbt should create the CTE before querying it:

select col1, col2 from  __dbt__cte__SOME_TABLE
 where col3 = "abc"

 __dbt__cte__SOME_TABLE does not exist
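For context, a passing unit test renders the model with each mocked source replaced by a CTE that dbt prepends from the fixture rows. A sketch of what the working rendered query would need to look like (column values are illustrative, not from the actual fixture):

```sql
with __dbt__cte__SOME_TABLE as (
    -- rows mocked by the unit test fixture (values illustrative)
    select 1 as col1, 2 as col2, 'abc' as col3
)
select col1, col2 from __dbt__cte__SOME_TABLE
where col3 = 'abc'
```

In the failing case, dbt rewrites the source() reference to __dbt__cte__SOME_TABLE but never emits this "with" block, so the reference dangles.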

Steps To Reproduce

Consider this model, which uses source() inside a variable:

{{
    config(
        schema='ABC',
        alias='dummy_model',
        materialized = 'incremental',
    )
}}


{% set dummy_query %}

 select col1, col2 from {{ source('ABC', 'SOME_TABLE') }}
 where col3 = "abc"
{% endset %}



select * from ({{dummy_query}})

dbt generates the query below, attempting to select from a CTE that doesn't exist:

 select col1, col2 from  __dbt__cte__SOME_TABLE
 where col3 = "abc"

 __dbt__cte__SOME_TABLE does not exist

However, changing the model as follows works well with the unit test:

{{
    config(
        schema='ABC',
        alias='dummy_model',
        materialized = 'incremental',
    )
}}


with  dummy_query as (

 select col1, col2 from {{ source('ABC', 'SOME_TABLE') }}
 where col3 = "abc"
)



select * from dummy_query

Relevant log output

When using source() inside a variable, dbt tries to query a CTE that doesn't exist:

 select col1, col2 from  __dbt__cte__SOME_TABLE
 where col3 = "abc"

 __dbt__cte__SOME_TABLE does not exist

Environment

- OS: Windows
- Python: Python 3.9.6
- dbt:  1.8.8

Which database adapter are you using with dbt?

other (mention it in "Additional Context")

Additional Context

Azure Databricks

@ashwini-kumar2 ashwini-kumar2 added bug Something isn't working triage labels Dec 10, 2024
@dbeatty10 dbeatty10 added the unit tests Issues related to built-in dbt unit testing functionality label Dec 10, 2024
@dbeatty10
Contributor

Thanks for reaching out @ashwini-kumar2 !

I wasn't able to reproduce this. Could you try tweaking the simple example below and see if you're able to trigger the error you reported?

Project files and commands

Create these files:

models/dummy.sql

select 1 as id

models/my_model.sql

{{
    config(
        materialized='incremental',
    )
}}

{% set dummy_query %}

    select * from {{ source("my_source", "dummy") }}

{% endset %}

select * from ({{ dummy_query }})

models/_properties.yml

sources:
  - name: my_source
    database: "{{ target.database }}"  
    schema: "{{ target.schema }}"  
    tables:
      - name: dummy


unit_tests:
  - name: test_name
    model: my_model
    overrides:
      macros:
        is_incremental: true
    given:
      - input: source('my_source', 'dummy')
        rows:
          - {id: 3}
    expect:
        rows:
          - {id: 3}

Run these commands:

dbt run --select dummy
dbt run --select my_model
dbt test --select test_name

@dbeatty10 dbeatty10 changed the title Issue: dbt Unit Test Limitation :: [dbt unit tests do not support the use of source('some_schema', 'some_table') inside a variable.] [Bug] dbt Unit Test Limitation :: dbt unit tests do not support the use of source('some_schema', 'some_table') inside a variable Dec 10, 2024
@ashwini-kumar2
Author

I'm able to reproduce it with this:

models/my_model.sql

{{
    config(
        materialized='incremental',
    )
}}
{{dummy_macro()}}
select 1


macro/dummy_macro.sql



{% macro dummy_macro() %}


{% set dummy_query %}

    select * from {{ source("my_source", "dummy") }}

{% endset %}
  
{% endmacro %}

unit_test/dummy_test.yml


unit_tests:
  - name: dummy_test
    description: "Unit test dummy "
    model: dummy_model
    overrides:
      macros:
        is_incremental: true
    given:
      - input: source('my_source', 'dummy')
        format: sql
        fixture: dummy_fixture
    expect:
      format: sql
      fixture:  dummy_fixture

@dbeatty10
Contributor

Thanks for providing more info @ashwini-kumar2 👍

The example you shared looks like it should have model: my_model instead of model: dummy_model (plus some other changes).

Here's a complete working example that looks like it is closest to what you intended. Could you try it and tweak it to reproduce the issue you saw?

If you are able to reproduce the issue, could you share all of the following?

  1. The files you used
  2. The commands you ran
  3. The full log output you got (including the error message)

Project files and commands

Create these files:

models/dummy.sql

select 1 as id

macros/dummy_macro.sql

{% macro dummy_macro() %}

    select * from {{ source("my_source", "dummy") }}

{% endmacro %}

models/dummy_model.sql

{{
    config(
        materialized='incremental',
    )
}}

{% set dummy_query = dummy_macro() %}

select * from ({{ dummy_query }})

tests/fixtures/dummy_fixture.csv

id
99

models/_properties.yml

sources:
  - name: my_source
    database: "{{ target.database }}"  
    schema: "{{ target.schema }}"  
    tables:
      - name: dummy


unit_tests:
  - name: dummy_test
    description: "Unit test dummy "
    model: dummy_model
    overrides:
      macros:
        is_incremental: true
    given:
      - input: source('my_source', 'dummy')
        format: csv
        fixture: dummy_fixture
    expect:
      format: csv
      fixture:  dummy_fixture

Run these commands:

dbt run --select dummy
dbt run --select dummy_model
dbt test --select dummy_test

Get this output (using dbt-duckdb):

$ dbt test --select dummy_test         
14:18:20  Running with dbt=1.9.0
14:18:20  Registered adapter: duckdb=1.9.1
14:18:20  Found 3 models, 1 source, 423 macros, 1 unit test
14:18:20  
14:18:20  Concurrency: 1 threads (target='duckdb')
14:18:20  
14:18:21  1 of 1 START unit_test dummy_model::dummy_test ................................. [RUN]
14:18:21  1 of 1 PASS dummy_model::dummy_test ............................................ [PASS in 0.16s]
14:18:21  
14:18:21  Finished running 1 unit test in 0 hours 0 minutes and 0.29 seconds (0.29s).
14:18:21  
14:18:21  Completed successfully
14:18:21  
14:18:21  Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1

@ashwini-kumar2
Author

Thank you @dbeatty10 for the support. I think I'm presenting the issue in chunks. Apologies for that.

Here are the complete details to reproduce it:

macros/dummy_macro_which_calls_score.sql

{% macro dummy_macro_which_calls_score() %}

    {% set query %}
    select * from {{ source("my_source", "dummy_score_table") }}
    limit 1
    {% endset %}

    {% set dummy_result = [] %}
    {% if execute %}
        {% set dummy_result = run_query(query) %}
    {% endif %}
    {{ return(dummy_result) }}
{% endmacro %}

models/dummy_score.sql

select 1 as score

models/my_model.sql

{{
    config(
        materialized='incremental',
    )
}}

{% set dummy_result = dummy_macro_which_calls_score() %}

select 1

models/sources/DUMMY/sources.yml

sources:
  - name: my_source
    description: ""
    schema: dbt_test123
    tables:
      - name: dummy_score_table

models/unit_tests/dummy_model_test.yml

unit_tests:
  - name: my_model_dummy_test
    description: "Unit test dummy "
    model: my_model
    overrides:
      macros:
        is_incremental: true
    given:
      - input: source("my_source", "dummy_score_table")
        format: sql
        fixture: dummy_score_table
    expect:
      format: sql
      fixture:  dummy_score_table

tests/fixtures/unit_test_mocked_data/dummy_score_table.sql

select 1 as score
union all
select 24 as score
union all
select 45 as score
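Since unit tests never execute introspective queries, one possible workaround (assuming the standard overrides.macros mechanism that is already used above for is_incremental; untested in this thread) is to mock the return value of the introspective macro itself, instead of mocking the source it queries. A fragment showing only the overrides:

```yaml
unit_tests:
  - name: my_model_dummy_test
    model: my_model
    overrides:
      macros:
        is_incremental: true
        # stub the macro's return value so run_query is never reached
        dummy_macro_which_calls_score: []
```

With the macro stubbed, no given input for source('my_source', 'dummy_score_table') should be needed.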

LOGS



14:03:04.234700 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B695CBB340>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B698762C10>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B698762A30>]}


============================== 14:03:04.254232 | ceccfff2-dfd8-47ef-b3a3-0c114a511421 ==============================
14:03:04.254232 [info ] [MainThread]: Running with dbt=1.8.8
14:03:04.261867 [debug] [MainThread]: running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'warn_error': 'None', 'version_check': 'True', 'profiles_dir': 'C:\\Users\\some_name\\some_gen\\projects\\foor_bar123-some_name', 'log_path': 'C:\\Users\\some_name\\some_gen\\projects\\foor_bar123-some_name\\logs', 'fail_fast': 'False', 'debug': 'False', 'use_colors': 'True', 'use_experimental_parser': 'False', 'empty': 'None', 'quiet': 'False', 'no_print': 'None', 'log_format': 'default', 'static_parser': 'True', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'introspect': 'True', 'target_path': 'None', 'invocation_command': 'dbt test --select my_model_dummy_test', 'send_anonymous_usage_stats': 'True'}
14:03:04.568080 [debug] [MainThread]: Spark adapter: Setting pyhive.hive logging to ERROR
14:03:04.571684 [debug] [MainThread]: Spark adapter: Setting thrift.transport logging to ERROR
14:03:04.571684 [debug] [MainThread]: Spark adapter: Setting thrift.protocol logging to ERROR
14:03:07.527681 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'project_id', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B695FD3E80>]}
14:03:07.640116 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B697CD18B0>]}
14:03:07.640116 [info ] [MainThread]: Registered adapter: databricks=1.8.7
14:03:07.670903 [debug] [MainThread]: checksum: ec53e341a936509427e16df1c646386a84befec52c1957e3f4359fbc36fc93f5, vars: {}, profile: , target: , version: 1.8.8
14:03:10.624792 [debug] [MainThread]: Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
14:03:10.628818 [debug] [MainThread]: Partial parsing enabled, no changes found, skipping parsing
14:03:11.051473 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'load_project', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B6AB48A130>]}
14:03:13.262171 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B6AE564DF0>]}
14:03:13.262171 [info ] [MainThread]: Found 554 models, 156 snapshots, 1753 data tests, 562 sources, 819 macros, 8 unit tests
14:03:13.262171 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B6AE586E50>]}
14:03:13.324028 [info ] [MainThread]: 
14:03:13.328400 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 4888), compute-name=) - Creating connection
14:03:13.328400 [debug] [MainThread]: Acquiring new databricks connection 'master'
14:03:13.330413 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Acquired connection on thread (4056, 4888), using default compute resource
14:03:13.458724 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=None, name=dummy_123, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 30432), compute-name=) - Creating connection
14:03:13.458724 [debug] [ThreadPool]: Acquiring new databricks connection 'dummy_123'
14:03:13.458724 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=None, name=foor_bar123, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 36200), compute-name=) - Creating connection
14:03:13.461859 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=None, name=dummy_123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Acquired connection on thread (4056, 30432), using default compute resource
14:03:13.462868 [debug] [ThreadPool]: Acquiring new databricks connection 'foor_bar123'
14:03:13.462868 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=None, name=foor_bar123, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 8840), compute-name=) - Creating connection
14:03:13.462868 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=None, name=dummy_123, idle-time=0.0010085105895996094s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Checking idleness
14:03:13.467062 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=None, name=foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Acquired connection on thread (4056, 36200), using default compute resource
14:03:13.467062 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=None, name=dummy_123, idle-time=0.0052030086517333984s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Retrieving connection
14:03:13.471011 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=None, name=dummy_123, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 30248), compute-name=) - Creating connection
14:03:13.471011 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=None, name=list_hive_metastore_foor_bar123, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 33844), compute-name=) - Creating connection
14:03:13.471011 [debug] [ThreadPool]: Acquiring new databricks connection 'foor_bar123'
14:03:13.473388 [debug] [ThreadPool]: Using databricks connection "dummy_123"
14:03:13.473388 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=None, name=foor_bar123, idle-time=0.006325721740722656s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Checking idleness
14:03:13.475376 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=None, name=foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Acquired connection on thread (4056, 8840), using default compute resource
14:03:13.475376 [debug] [ThreadPool]: On dummy_123: GetTables(database=hive_metastore, schema=foor_bar123, identifier=None)
14:03:13.476477 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=None, name=foor_bar123, idle-time=0.009415149688720703s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Retrieving connection
14:03:13.476477 [debug] [ThreadPool]: Acquiring new databricks connection 'dummy_123'
14:03:13.476477 [debug] [ThreadPool]: Acquiring new databricks connection 'list_hive_metastore_foor_bar123'
14:03:13.476477 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=None, name=foor_bar123, idle-time=0.0011010169982910156s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Checking idleness
14:03:13.476477 [debug] [ThreadPool]: Opening a new connection, currently in state init
14:03:13.481774 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
14:03:13.482807 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=None, name=dummy_123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Acquired connection on thread (4056, 30248), using default compute resource
14:03:13.483895 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=None, name=list_hive_metastore_foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Acquired connection on thread (4056, 33844), using default compute resource
14:03:13.484441 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=None, name=foor_bar123, idle-time=0.009064674377441406s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Retrieving connection
14:03:13.486083 [debug] [ThreadPool]: On foor_bar123: GetTables(database=hive_metastore, schema=foor_bar123_internal, identifier=None)
14:03:13.487179 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=None, name=dummy_123, idle-time=0.003831624984741211s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Checking idleness
14:03:13.520584 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
14:03:13.520584 [debug] [ThreadPool]: Opening a new connection, currently in state init
14:03:13.522733 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=None, name=dummy_123, idle-time=0.03992605209350586s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Retrieving connection
14:03:13.522733 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=None, name=list_hive_metastore_foor_bar123, idle-time=0.03883838653564453s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Checking idleness
14:03:13.522733 [debug] [ThreadPool]: On foor_bar123: GetTables(database=hive_metastore, schema=foor_bar123_internal, identifier=None)
14:03:13.525785 [debug] [ThreadPool]: Using databricks connection "dummy_123"
14:03:13.554254 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=None, name=list_hive_metastore_foor_bar123, idle-time=0.0703589916229248s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Retrieving connection
14:03:13.562232 [debug] [ThreadPool]: Opening a new connection, currently in state init
14:03:13.582811 [debug] [ThreadPool]: On dummy_123: GetTables(database=hive_metastore, schema=foor_bar123, identifier=None)
14:03:13.582811 [debug] [ThreadPool]: Using databricks connection "list_hive_metastore_foor_bar123"
14:03:13.623996 [debug] [ThreadPool]: Opening a new connection, currently in state init
14:03:13.646293 [debug] [ThreadPool]: On list_hive_metastore_foor_bar123: GetTables(database=hive_metastore, schema=foor_bar123, identifier=None)
14:03:13.672975 [debug] [ThreadPool]: Opening a new connection, currently in state init
14:03:17.021908 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Connection created
14:03:17.021908 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=Unknown) - Created cursor
14:03:17.046779 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Connection created
14:03:17.046779 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=Unknown) - Created cursor
14:03:17.071670 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Connection created
14:03:17.071670 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=Unknown) - Created cursor
14:03:17.143738 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Connection created
14:03:17.143738 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=Unknown) - Created cursor
14:03:18.097394 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Connection created
14:03:18.113002 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=Unknown) - Created cursor
14:03:19.313361 [debug] [ThreadPool]: SQL status: OK in 5.840 seconds
14:03:19.555715 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=01efb863-bc6a-1179-adb0-4b1e77e4b2d4) - Closing cursor
14:03:19.723577 [debug] [ThreadPool]: SQL status: OK in 6.050 seconds
14:03:19.799383 [debug] [ThreadPool]: SQL status: OK in 6.280 seconds
14:03:19.839205 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=2.7675344944000244s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Checking idleness
14:03:19.855406 [debug] [ThreadPool]: SQL status: OK in 6.290 seconds
14:03:19.857204 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=2.785534143447876s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Retrieving connection
14:03:19.860175 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=2.7885046005249023s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Checking idleness
14:03:19.861685 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=2.79001522064209s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Retrieving connection
14:03:19.863037 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
14:03:19.864140 [debug] [ThreadPool]: Using databricks connection "dummy_123"
14:03:19.864140 [debug] [ThreadPool]: On dummy_123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "dummy_123"} */

      select current_catalog()
  
14:03:19.865809 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=Unknown) - Created cursor
14:03:19.896995 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=01efb863-bd05-1b3a-b1ef-7ef24e192f50) - Closing cursor
14:03:19.928683 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=1.79960036277771s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Checking idleness
14:03:19.960027 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=1.8626329898834229s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Retrieving connection
14:03:19.960027 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=1.8626329898834229s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Checking idleness
14:03:19.993102 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=01efb863-bc5e-12e2-b0c8-511c980a43e0) - Closing cursor
14:03:20.007114 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=1.909719705581665s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Retrieving connection
14:03:20.025491 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.0035831928253174s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Checking idleness
14:03:20.070301 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=01efb863-bc72-12a7-a2d5-ff7d4133fc8e) - Closing cursor
14:03:20.070301 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
14:03:20.070301 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.0483932495117188s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Retrieving connection
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=2.9593148231506348s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Checking idleness
14:03:20.103053 [debug] [ThreadPool]: Using databricks connection "list_hive_metastore_foor_bar123"
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=2.9593148231506348s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Retrieving connection
14:03:20.103053 [debug] [ThreadPool]: On list_hive_metastore_foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "list_hive_metastore_foor_bar123"} */

      select current_catalog()
  
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.0811452865600586s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Checking idleness
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=Unknown) - Created cursor
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.0811452865600586s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Retrieving connection
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=2.9593148231506348s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Checking idleness
14:03:20.103053 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=2.9593148231506348s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Retrieving connection
14:03:20.103053 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
14:03:20.103053 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
14:03:20.103053 [debug] [ThreadPool]: On foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "foor_bar123"} */

      select current_catalog()
  
14:03:20.103053 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
14:03:20.103053 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=Unknown) - Created cursor
14:03:20.103053 [debug] [ThreadPool]: On foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "foor_bar123"} */

      select current_catalog()
  
14:03:20.118072 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=Unknown) - Created cursor
14:03:20.403115 [debug] [ThreadPool]: SQL status: OK in 0.540 seconds
14:03:20.403115 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=01efb863-be10-1f72-8e8f-3e7082aa9252) - Closing cursor
14:03:20.420198 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=3.3485283851623535s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Checking idleness
14:03:20.450441 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=3.3787713050842285s, acquire-count=1, language=None, thread-identifier=(4056, 30432), compute-name=) - Retrieving connection
14:03:20.450441 [debug] [ThreadPool]: Using databricks connection "dummy_123"
14:03:20.450441 [debug] [ThreadPool]: On dummy_123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "dummy_123"} */
show views in `hive_metastore`.`foor_bar123`
  
14:03:20.466196 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=Unknown) - Created cursor
14:03:20.723810 [debug] [ThreadPool]: SQL status: OK in 0.620 seconds
14:03:20.726328 [debug] [ThreadPool]: SQL status: OK in 0.620 seconds
14:03:20.729339 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=01efb863-be35-1b53-8a80-b990ee7f31f8) - Closing cursor
14:03:20.730846 [debug] [ThreadPool]: SQL status: OK in 0.610 seconds
14:03:20.732361 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=01efb863-be36-1fdf-8483-f805f9d12f90) - Closing cursor
14:03:20.739522 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.7166054248809814s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Checking idleness
14:03:20.746036 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=3.724128246307373s, acquire-count=1, language=None, thread-identifier=(4056, 36200), compute-name=) - Retrieving connection
14:03:20.747041 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=2.649646520614624s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Checking idleness
14:03:20.747557 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
14:03:20.748569 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=2.651174783706665s, acquire-count=1, language=None, thread-identifier=(4056, 33844), compute-name=) - Retrieving connection
14:03:20.750564 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=01efb863-be36-1414-9dfa-4a7044d55a15) - Closing cursor
14:03:20.751570 [debug] [ThreadPool]: On foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "foor_bar123"} */
show views in `hive_metastore`.`foor_bar123_internal`
  
�[0m14:03:20.752572 [debug] [ThreadPool]: Using databricks connection "list_hive_metastore_foor_bar123"
�[0m14:03:20.758118 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=3.614379405975342s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Checking idleness
�[0m14:03:20.759118 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=Unknown) - Created cursor
�[0m14:03:20.759118 [debug] [ThreadPool]: On list_hive_metastore_foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "list_hive_metastore_foor_bar123"} */
show views in `hive_metastore`.`foor_bar123`
  
�[0m14:03:20.760122 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=3.6163837909698486s, acquire-count=1, language=None, thread-identifier=(4056, 8840), compute-name=) - Retrieving connection
�[0m14:03:20.761120 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=Unknown) - Created cursor
�[0m14:03:20.762123 [debug] [ThreadPool]: Using databricks connection "foor_bar123"
�[0m14:03:20.764157 [debug] [ThreadPool]: On foor_bar123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "foor_bar123"} */
show views in `hive_metastore`.`foor_bar123_internal`
  
�[0m14:03:20.765170 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=Unknown) - Created cursor
�[0m14:03:21.181497 [debug] [ThreadPool]: SQL status: OK in 7.560 seconds
�[0m14:03:21.484337 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=01efb863-bc61-1832-86a5-6e8b3f07025c) - Closing cursor
�[0m14:03:21.520738 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=4.473958969116211s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Checking idleness
�[0m14:03:21.524274 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=4.477494716644287s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Retrieving connection
�[0m14:03:21.525215 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=4.47843599319458s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Checking idleness
�[0m14:03:21.525215 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=4.47843599319458s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Retrieving connection
�[0m14:03:21.527373 [debug] [ThreadPool]: Spark adapter: NotImplemented: add_begin_query
�[0m14:03:21.528457 [debug] [ThreadPool]: Using databricks connection "dummy_123"
�[0m14:03:21.529524 [debug] [ThreadPool]: On dummy_123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "dummy_123"} */

      select current_catalog()
  
�[0m14:03:21.530621 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=Unknown) - Created cursor
�[0m14:03:21.554088 [debug] [ThreadPool]: SQL status: OK in 0.790 seconds
�[0m14:03:21.558652 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_1234567912, command-id=01efb863-be98-1ba7-8663-bf400d200885) - Closing cursor
�[0m14:03:21.568234 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234, session-id=dummy_1234567912, name=list_hive_metastore_foor_bar123, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 33844), compute-name=) - Released connection
�[0m14:03:21.679413 [debug] [ThreadPool]: SQL status: OK in 0.920 seconds
�[0m14:03:21.686016 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_123456, command-id=01efb863-be98-1f2f-8a5d-84964eadf63f) - Closing cursor
�[0m14:03:21.710868 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_12345679, session-id=dummy_123456, name=foor_bar123, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 8840), compute-name=) - Released connection
�[0m14:03:21.932942 [debug] [ThreadPool]: SQL status: OK in 1.170 seconds
�[0m14:03:22.026522 [debug] [ThreadPool]: SQL status: OK in 1.560 seconds
�[0m14:03:22.075455 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=dummy_12345, command-id=01efb863-be98-1e37-85ae-e45112b4f2ee) - Closing cursor
�[0m14:03:22.112479 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=dummy_1234567, session-id=dummy_12345, name=foor_bar123, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 36200), compute-name=) - Released connection
�[0m14:03:22.215025 [debug] [ThreadPool]: SQL status: OK in 0.680 seconds
�[0m14:03:22.365888 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, command-id=01efb863-be6b-13ea-813d-b4c1ea81b8dd) - Closing cursor
�[0m14:03:22.420280 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=01efb863-bf0f-1d4c-96b6-c260b00c377d) - Closing cursor
�[0m14:03:22.470160 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=5.422382354736328s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Checking idleness
�[0m14:03:22.494924 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=5.4481446743011475s, acquire-count=1, language=None, thread-identifier=(4056, 30248), compute-name=) - Retrieving connection
�[0m14:03:22.589596 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846464, session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d, name=dummy_123, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 30432), compute-name=) - Released connection
�[0m14:03:22.590605 [debug] [ThreadPool]: Using databricks connection "dummy_123"
�[0m14:03:22.697699 [debug] [ThreadPool]: On dummy_123: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "connection_name": "dummy_123"} */
show views in `hive_metastore`.`foor_bar123`
  
�[0m14:03:22.712779 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=Unknown) - Created cursor
�[0m14:03:24.248999 [debug] [ThreadPool]: SQL status: OK in 1.540 seconds
�[0m14:03:24.283521 [debug] [ThreadPool]: Databricks adapter: Cursor(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, command-id=01efb863-bfcb-1263-8c35-85b5e15af097) - Closing cursor
�[0m14:03:24.333403 [debug] [ThreadPool]: Databricks adapter: DatabricksDBTConnection(id=1884128846848, session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3, name=dummy_123, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 30248), compute-name=) - Released connection
�[0m14:03:25.004902 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': 'ceccfff2-dfd8-47ef-b3a3-0c114a511421', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B695ACE7F0>]}
�[0m14:03:25.005852 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=11.675438165664673s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Checking idleness
�[0m14:03:25.006856 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=11.676442861557007s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Retrieving connection
�[0m14:03:25.007851 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=11.677438020706177s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Checking idleness
�[0m14:03:25.007851 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=11.677438020706177s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Retrieving connection
�[0m14:03:25.008884 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
�[0m14:03:25.008884 [debug] [MainThread]: Spark adapter: NotImplemented: commit
�[0m14:03:25.009891 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 4888), compute-name=) - Released connection
�[0m14:03:25.010882 [info ] [MainThread]: Concurrency: 5 threads (target='dev')
�[0m14:03:25.010882 [info ] [MainThread]: 
�[0m14:03:25.039013 [debug] [Thread-1  ]: Began running node unit_test.databricks.my_model.my_model_dummy_test
�[0m14:03:25.039983 [info ] [Thread-1  ]: 1 of 1 START unit_test my_model::my_model_dummy_test ........................... [RUN]
�[0m14:03:25.040980 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=None, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0s, acquire-count=0, language=None, thread-identifier=(4056, 25848), compute-name=) - Creating connection
�[0m14:03:25.042030 [debug] [Thread-1  ]: Acquiring new databricks connection 'unit_test.databricks.my_model.my_model_dummy_test'
�[0m14:03:25.042573 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=None, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 25848), compute-name=) - Acquired connection on thread (4056, 25848), using default compute resource for model '[Unknown]'
�[0m14:03:25.042573 [debug] [Thread-1  ]: Began compiling node unit_test.databricks.my_model.my_model_dummy_test
�[0m14:03:25.043612 [debug] [Thread-1  ]: Began executing node unit_test.databricks.my_model.my_model_dummy_test
�[0m14:03:25.098346 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=None, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.05577349662780762s, acquire-count=1, language=None, thread-identifier=(4056, 25848), compute-name=) - Checking idleness
�[0m14:03:25.098857 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=None, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.05628371238708496s, acquire-count=1, language=None, thread-identifier=(4056, 25848), compute-name=) - Retrieving connection
�[0m14:03:25.099871 [debug] [Thread-1  ]: Using databricks connection "unit_test.databricks.my_model.my_model_dummy_test"
�[0m14:03:25.100867 [debug] [Thread-1  ]: On unit_test.databricks.my_model.my_model_dummy_test: /* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "node_id": "unit_test.databricks.my_model.my_model_dummy_test"} */

    

    select * from __dbt__cte__dummy_score_table
    limit 1
    
  
�[0m14:03:25.100867 [debug] [Thread-1  ]: Opening a new connection, currently in state init
�[0m14:03:26.586037 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 25848), compute-name=) - Connection created
�[0m14:03:26.587047 [debug] [Thread-1  ]: Databricks adapter: Cursor(session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab, command-id=Unknown) - Created cursor
�[0m14:03:27.202700 [debug] [Thread-1  ]: Databricks adapter: Cursor(session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab, command-id=Unknown) - Closing cursor
�[0m14:03:27.204230 [debug] [Thread-1  ]: Databricks adapter: Exception while trying to execute query
/* {"app": "dbt", "dbt_version": "1.8.8", "dbt_databricks_version": "1.8.7", "databricks_sql_connector_version": "3.1.2", "profile_name": "data_transformation", "target_name": "dev", "node_id": "unit_test.databricks.my_model.my_model_dummy_test"} */

    

    select * from __dbt__cte__dummy_score_table
    limit 1
    
  
: [TABLE_OR_VIEW_NOT_FOUND] The table or view `__dbt__cte__dummy_score_table` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 5 pos 18
Error properties: diagnostic-info=org.apache.hive.service.cli.HiveSQLException: Error running query: [TABLE_OR_VIEW_NOT_FOUND] org.apache.spark.sql.catalyst.ExtendedAnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `__dbt__cte__dummy_score_table` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 5 pos 18
	at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:49)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:805)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
	at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:641)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$5(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.sql.execution.SQLExecution$.withRootExecution(SQLExecution.scala:711)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
	at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:27)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:27)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:74)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:175)
	at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:729)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:738)
	at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:615)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$12(ThriftLocalProperties.scala:234)
	at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:229)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:89)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:463)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:449)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:499)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: org.apache.spark.sql.catalyst.ExtendedAnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `__dbt__cte__dummy_score_table` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 5 pos 18
	at org.apache.spark.sql.catalyst.ExtendedAnalysisException.copyPlan(ExtendedAnalysisException.scala:91)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:775)
	... 43 more
, operation-idfoor_bar123
�[0m14:03:27.208275 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 25848), compute-name=) - Released connection
�[0m14:03:27.507227 [debug] [Thread-1  ]: Database Error in unit_test my_model_dummy_test (models\unit_tests\dummy_model_test.yml)
  [TABLE_OR_VIEW_NOT_FOUND] The table or view `__dbt__cte__dummy_score_table` cannot be found. Verify the spelling and correctness of the schema and catalog.
  If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
  To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 5 pos 18
�[0m14:03:27.508405 [debug] [Thread-1  ]: Databricks adapter: DatabricksDBTConnection(id=1884120262688, session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab, name=unit_test.databricks.my_model.my_model_dummy_test, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 25848), compute-name=) - Released connection
�[0m14:03:27.509422 [error] [Thread-1  ]: 1 of 1 ERROR my_model::my_model_dummy_test ..................................... [�[31mERROR�[0m in 2.47s]
�[0m14:03:27.511855 [debug] [Thread-1  ]: Finished running node unit_test.databricks.my_model.my_model_dummy_test
�[0m14:03:27.513876 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=2.5039851665496826s, acquire-count=0, language=None, thread-identifier=(4056, 4888), compute-name=) - Checking idleness
�[0m14:03:27.514895 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=2.505004405975342s, acquire-count=0, language=None, thread-identifier=(4056, 4888), compute-name=) - Reusing connection previously named master
�[0m14:03:27.516411 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=2.5065195560455322s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Acquired connection on thread (4056, 4888), using default compute resource
�[0m14:03:27.518431 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=2.50753116607666s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Checking idleness
�[0m14:03:27.519432 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=None, name=master, idle-time=2.5085396766662598s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Retrieving connection
�[0m14:03:27.520429 [debug] [MainThread]: On master: ROLLBACK
�[0m14:03:27.520429 [debug] [MainThread]: Opening a new connection, currently in state init
�[0m14:03:28.998124 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=01efb863-c33a-1513-8d55-aca957f97a26, name=master, idle-time=0.0s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Connection created
�[0m14:03:28.998124 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:28.999164 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=01efb863-c33a-1513-8d55-aca957f97a26, name=master, idle-time=0.001583099365234375s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Checking idleness
�[0m14:03:29.000170 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=01efb863-c33a-1513-8d55-aca957f97a26, name=master, idle-time=0.0025892257690429688s, acquire-count=1, language=None, thread-identifier=(4056, 4888), compute-name=) - Retrieving connection
�[0m14:03:29.000170 [debug] [MainThread]: Spark adapter: NotImplemented: add_begin_query
�[0m14:03:29.001131 [debug] [MainThread]: Spark adapter: NotImplemented: commit
�[0m14:03:29.002135 [debug] [MainThread]: Databricks adapter: DatabricksDBTConnection(id=1884120761392, session-id=01efb863-c33a-1513-8d55-aca957f97a26, name=master, idle-time=0.0s, acquire-count=0, language=None, thread-identifier=(4056, 4888), compute-name=) - Released connection
�[0m14:03:29.003133 [debug] [MainThread]: Connection 'master' was properly closed.
�[0m14:03:29.003133 [debug] [MainThread]: On master: ROLLBACK
�[0m14:03:29.004134 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:29.004636 [debug] [MainThread]: On master: Close
�[0m14:03:29.006266 [debug] [MainThread]: Databricks adapter: Connection(session-id=01efb863-c33a-1513-8d55-aca957f97a26) - Closing connection
�[0m14:03:29.472494 [debug] [MainThread]: Connection 'dummy_123' was properly closed.
�[0m14:03:29.472494 [debug] [MainThread]: On dummy_123: ROLLBACK
�[0m14:03:29.473491 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:29.474491 [debug] [MainThread]: On dummy_123: Close
�[0m14:03:29.474491 [debug] [MainThread]: Databricks adapter: Connection(session-id=01efb863-bc1a-1f3b-bb22-4c93cad8ea7d) - Closing connection
�[0m14:03:29.953749 [debug] [MainThread]: Connection 'foor_bar123' was properly closed.
�[0m14:03:29.954749 [debug] [MainThread]: On foor_bar123: ROLLBACK
�[0m14:03:29.954749 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:29.955748 [debug] [MainThread]: On foor_bar123: Close
�[0m14:03:29.956748 [debug] [MainThread]: Databricks adapter: Connection(session-id=dummy_12345) - Closing connection
�[0m14:03:30.424086 [debug] [MainThread]: Connection 'foor_bar123' was properly closed.
�[0m14:03:30.424086 [debug] [MainThread]: On foor_bar123: ROLLBACK
�[0m14:03:30.425081 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:30.426082 [debug] [MainThread]: On foor_bar123: Close
�[0m14:03:30.426082 [debug] [MainThread]: Databricks adapter: Connection(session-id=dummy_123456) - Closing connection
�[0m14:03:30.905016 [debug] [MainThread]: Connection 'dummy_123' was properly closed.
�[0m14:03:30.906027 [debug] [MainThread]: On dummy_123: ROLLBACK
�[0m14:03:30.907027 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:30.907027 [debug] [MainThread]: On dummy_123: Close
�[0m14:03:30.908027 [debug] [MainThread]: Databricks adapter: Connection(session-id=01efb863-bc1a-1ada-8dc3-285dc6e00cf3) - Closing connection
�[0m14:03:31.372236 [debug] [MainThread]: Connection 'list_hive_metastore_foor_bar123' was properly closed.
�[0m14:03:31.374360 [debug] [MainThread]: On list_hive_metastore_foor_bar123: ROLLBACK
�[0m14:03:31.374360 [debug] [MainThread]: Databricks adapter: NotImplemented: rollback
�[0m14:03:31.375686 [debug] [MainThread]: On list_hive_metastore_foor_bar123: Close
�[0m14:03:31.376696 [debug] [MainThread]: Databricks adapter: Connection(session-id=dummy_1234567912) - Closing connection
�[0m14:03:31.879326 [debug] [MainThread]: Connection 'unit_test.databricks.my_model.my_model_dummy_test' was properly closed.
�[0m14:03:31.879841 [debug] [MainThread]: On unit_test.databricks.my_model.my_model_dummy_test: Close
�[0m14:03:31.880859 [debug] [MainThread]: Databricks adapter: Connection(session-id=01efb863-c1c9-1f08-b644-d0c62fff38ab) - Closing connection
�[0m14:03:32.355781 [info ] [MainThread]: 
�[0m14:03:32.356785 [info ] [MainThread]: Finished running 1 unit test in 0 hours 0 minutes and 19.03 seconds (19.03s).
�[0m14:03:32.357780 [debug] [MainThread]: Command end result
�[0m14:03:34.515462 [info ] [MainThread]: 
�[0m14:03:34.516827 [info ] [MainThread]: �[31mCompleted with 1 error and 0 warnings:�[0m
�[0m14:03:34.519361 [info ] [MainThread]: 
�[0m14:03:34.522361 [error] [MainThread]:   Database Error in unit_test my_model_dummy_test (models\unit_tests\dummy_model_test.yml)
  [TABLE_OR_VIEW_NOT_FOUND] The table or view `__dbt__cte__dummy_score_table` cannot be found. Verify the spelling and correctness of the schema and catalog.
  If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
  To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 5 pos 18
�[0m14:03:34.525356 [info ] [MainThread]: 
�[0m14:03:34.527357 [info ] [MainThread]: Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
�[0m14:03:34.532190 [debug] [MainThread]: Command `dbt test` failed at 14:03:34.532190 after 30.53 seconds
�[0m14:03:34.534194 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B695CBB340>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B6AA0CBFA0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x000001B6AB8BA580>]}
�[0m14:03:34.537266 [debug] [MainThread]: Flushing usage events
�[0m14:03:35.543191 [debug] [MainThread]: Error sending anonymous usage statistics. Disabling tracking for this execution. If you wish to permanently disable tracking, see: https://docs.getdbt.com/reference/global-configs#send-anonymous-usage-stats.

dbeatty10 commented Dec 12, 2024

@ashwini-kumar2 I think we've gotten to the bottom of it! See below for a simplified example that gives the same error as you reported.

TL;DR

You're getting this error because:

  • We currently don't support unit testing models that use introspective queries.

More detail

This is the same underlying limitation reported in #10759, so I'm going to close this as a duplicate of that one.

In the meantime, I've opened dbt-labs/docs.getdbt.com#6652 to clarify this in our product documentation.
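
For context, a rough sketch (my own illustration, not dbt's exact generated SQL) of why the `__dbt__cte__` relation is missing: when a unit test runs, dbt replaces each `ref()`/`source()` with a fixture CTE named `__dbt__cte__<identifier>` inside the single SELECT statement it executes for the test. An introspective query run via `run_query()` is sent to the warehouse as a *separate* statement at compile time, where that CTE is not in scope:

```sql
-- Statement the unit test executes (fixture CTE is in scope here):
with __dbt__cte__queried_introspectively as (
    select 99 as score   -- fixture row supplied by the unit test
)
select 1 as id;

-- Statement run_query() sends on its own at compile time — fails,
-- because the CTE only exists inside the statement above:
select * from __dbt__cte__queried_introspectively;
```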

Reprex

Essentially the same as #10759 (comment)

Create these files:

models/queried_introspectively.sql

select 10 as score

models/my_model.sql

-- depends_on: {{ ref('queried_introspectively') }}
{% if execute %}
    {% set query %}
        select * from {{ ref("queried_introspectively") }}
    {% endset %}

   {% set dummy_result = run_query(query) %}
{% endif %}

select 1 as id

models/unit_tests/dummy_model_test.yml

unit_tests:
  - name: my_model_dummy_test
    model: my_model
    given:
      - input: ref("queried_introspectively")
        rows:
          - {score: 99}
    expect:
        rows:
          - {id: 1}

Run these commands:

dbt run -s +my_model --empty
dbt test --select my_model_dummy_test

Get an error similar to the following:

15:06:13    Runtime Error in unit_test my_model_dummy_test (models/unit_tests/dummy_model_test.yml)
  Catalog Error: Table with name __dbt__cte__queried_introspectively does not exist!
  LINE 4:         select * from __dbt__cte__queried_introspectively

@dbeatty10 dbeatty10 closed this as not planned Won't fix, can't repro, duplicate, stale Dec 12, 2024
@dbeatty10 dbeatty10 added duplicate This issue or pull request already exists and removed triage labels Dec 12, 2024
@dbeatty10 dbeatty10 changed the title [Bug] dbt Unit Test Limitation :: dbt unit tests do not support the use of source('some_schema', 'some_table') inside a variable [Bug] Limitation of dbt uni tests: not support the use of introspective queries (run_query, etc.) Dec 12, 2024
@dbeatty10 dbeatty10 changed the title [Bug] Limitation of dbt uni tests: not support the use of introspective queries (run_query, etc.) [Bug] Limitation of dbt unit tests: not support the use of introspective queries (run_query, etc.) Dec 12, 2024