### Is this a new bug in dbt-spark?

- [x] I believe this is a new bug in dbt-spark
- [x] I have searched the existing issues, and I could not find an existing issue for this bug
### Current Behavior

When I define a model like the following and execute the `dbt run` command, the command will succeed if the `metadata` field is really of type `STRUCT`, but the underlying keys in the struct and their corresponding data types won't be checked.
```yaml
models:
  - name: test_model
    config:
      contract:
        enforced: true
        alias_types: false
    columns:
      - name: id
        data_type: STRING
      - name: metadata
        data_type: STRUCT<property_1 STRING, property_2 STRING>
```
SQL model:

```sql
select
    id,
    CAST(metadata AS STRUCT<property_3 STRING>) as metadata
from {{ source('source', 'source_table') }}
```
I've noticed this when I tried testing and changed my `data_type` from `STRUCT` to `MAP`. In that case, the `dbt run` command returned a contract error, but the `definition_type` and `contract_type` it reports do not take values inside complex/nested structures into account.
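
For reference, a minimal sketch of the column change that triggered that error; the `MAP` key/value types here are illustrative, not the exact ones from my project:

```yaml
columns:
  - name: metadata
    # Changing the top-level type from STRUCT<...> to MAP is caught by the
    # contract check, but only because the outer type name differs; the
    # nested key/value types are never compared.
    data_type: MAP<STRING, STRING>
```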
### Expected Behavior

I would expect dbt to fail if `contract` is enforced and complex data types like `ARRAY`, `MAP`, and `STRUCT` are not fully described, or, even better, for the `contract` configuration to gain a `check_complex_types` boolean field that decides whether or not to check nested structures.
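
A hypothetical sketch of that second option; `check_complex_types` does not exist today, and the name and placement are only a suggestion:

```yaml
models:
  - name: test_model
    config:
      contract:
        enforced: true
        # Proposed flag: when true, also compare the fields and types nested
        # inside STRUCT, MAP, and ARRAY while enforcing the contract.
        check_complex_types: true
    columns:
      - name: metadata
        data_type: STRUCT<property_1 STRING, property_2 STRING>
```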
### Steps To Reproduce

Using the model configuration and a similar SQL model (one that contains a `STRUCT` field) from above, execute the `dbt run` command. The command will succeed no matter what underlying struct definition is set in the `data_type` field (it even works with just `STRUCT<>`).
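
For example, this under-specified contract for the same model still passes:

```yaml
columns:
  - name: metadata
    # Passes contract enforcement even though no struct fields are described.
    data_type: STRUCT<>
```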
### Relevant log output

_No response_
### Environment

- OS:
- Python: 3.11.9
- dbt-core: dbt Cloud CLI - 0.38.15
- dbt-spark:
### Additional Context

_No response_