Proto commits in dbt-labs/dbt-core

These are the commits in which Protocol Buffers files changed (only the last 100 relevant commits are shown):

Commit:e920053
Author:Quigley Malcolm
Committer:GitHub

Initial slate of deprecations for 1.10 (#11544) * Begin basic jsonschema validations of dbt_project.yml (#11505) * Add jsonschema for validating the project file * Add utility for helping to load jsonschema resources Currently things are a bit hard coded. We should probably alter this to be a bit more flexible. * Begin validating the `dbt_project.yml` via jsonschema * Begin emitting deprecation warnings for generic jsonschema violations in dbt_project.yml * Move from `DbtInternalError` to `DbtRuntimeError` to avoid circular imports * Add tests for basic jsonschema validation of `dbt_project.yml` * Add changie doc * Add serialization test for new deprecation events * Alter the project jsonschema to not require things that are optional * Add datafiles to package egg * Update inclusion of project jsonschema in setup.py to get files correctly Using the glob spec returns a list of found files. Our previous spec was raising the error `error: can't copy 'dbt/resources/input_schemas/project/*.json': doesn't exist or not a regular file` * Try another approach of adding jsonschema to egg * Add input_schemas dir to MANIFEST.in spec * Drop jsonschema inclusion spec from setup.py * Begin using importlib.resources.files for loading project jsonschema This doesn't currently work with editable installs :'( * Use relative paths for loading jsonschemas instead of importlib Using "importlib" is the blessed way to do this sort of thing. However, that is failing for us on editable installs. This commit switches us to using relative paths. Technically doing this has edge cases, however this is also what we do for the `start_project` used in `dbt init`. So we're going to do the same, for now. We should revisit this soon. * Drop requirement of `__additional_properties__` specified by project jsonschema * Drop requirement for `pre-hook` and `post-hook` specified by project jsonschema * Reset `active_deprecations` global at the end of tests using `project` fixture * Begin validating the jsonschema of YAML resource files (#11516) * Add jsonschema for resources * Begin jsonschema validating YAML resource files in dbt projects * Drop `tests` and `data_tests` as required properties of `Columns` and `Models` for resources jsonschema * Drop `__additional_properties__` as required for `_Metrics` in resources jsonschema * Drop `post_hook` and `pre_hook` requirement for `__SnapshotsConfig` in resources jsonschema * Update `_error_path_to_string` to handle empty paths * Create + use custom Draft7Validator to ignore datetime and date classes * Break `TestRetry` functional test class into multiple test classes There was some overflow of global state from one test to another which was causing some of the tests to break. * Refactor duplicate instances of `jsonschema_validate` to single definition * Begin testing jsonschema validation of resource YAMLs * Add changie doc * Add Deprecation Warnings for Unexpected Jinja Blocks (#11514) * Add deprecation warnings on unexpected jinja blocks. * Add changelog entry. * Add test event. * Regen proto types. * Fix event test. * Add `UnexpectedJinjaBlockDeprecationSummary` and add file context to `UnexpectedJinjaBlockDeprecation` (#11517) * Add summary event for UnexpectedJinjaBlockDeprecation * Begin including file information in UnexpectedJinjaBlockDeprecation event * Add UnexpectedJinjaBlockDeprecationSummary to test_events.py * Deprecate Custom Top-Level Keys (#11518) * Add specific deprecation for custom top level keys.
* Add changelog entry * Add test events * Add Check for Duplicate YAML Keys (#11510) * Add functionality to check for duplicate yaml keys, working around PyYAML limitation. * Fix up some ancient typing issues. * Ignore typing issue, for now. * Correct unit tests of `checked_load` * Add event and deprecation types for duplicate yaml keys * Begin validating `dbt_project.yml` for duplicate key violations * Begin checking for duplicate key violations in schema files * Add test to check duplicate keys are checked in schema files * Refactor checked_yaml failure handling to reduce duplicate code * Move `checked_load` utilities to separate file to avoid circular imports * Handle yaml `start_mark` correctly for top level key errors * Update changelog * Fix test. --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com> * Fix issue with YAML anchors in new CheckedLoader class. * Deprecate having custom keys in config blocks (#11522) * Add deprecation event for custom keys found in configs * Begin checking schema files for custom keys found in configs * Test new CustomConfigInConfigDeprecation event * Add changie doc * Add custom config key deprecation events to event serialization test * Provide message to ValidationError in `SelectorConfig.from_path` This typing error is unrelated to the changes in this PR. However, it was failing CI, so I figured it'd be simple to just fix it. * Add some extra guards around the ValidationFailure `path` and `instance` * [TIDY-FIRST] Use new `deprecation_tag` (#11524) * Tidy First: Update deprecation events to use the new `deprecation_tag` Note: we did this for a majority of deprecations, but not _all_ deprecations. That is because not all deprecations were following the pattern. As some people do string parsing of our logs with regex, altering the deprecations that weren't doing what `deprecation_tag` does to use `deprecation_tag` would be a _breaking change_ for those events, thus we did not alter those events * Bump minimum dbt-common to `1.22.0` * Fix tests * Begin emitting deprecation events for custom properties found in objects (#11526) * Fix CustomKeyInConfigDeprecationSummary * Add deprecation type for custom properties in YAML objects * Begin emitting deprecation events for custom properties found in objects * Add changie doc * Add `loaded_at_query` property to `_Sources` definition in jsonschema This was breaking the test tests/unit/parser/test_parser.py::SchemaParserSourceTest::test_parse_source_custom_freshness_at_source * Move validating jsonschema of schema files earlier in the process Previously we were validating the jsonschema of schema files in `SchemaParser.parse_file`. However, the file is originally loaded in `yaml_from_file` (which happens before `SchemaParser.parse_file`), and `yaml_from_file` _modifies_ the loaded dictionary to add some additional properties. These additional properties violate the jsonschema unfortunately, and thus we needed to start validating the schema against the jsonschema before any such modifications. * Skip parser tests for `model.freshness` Model freshness never got fully implemented, won't be implemented nor documented for 1.10. As such we're gonna consider the `model.freshness` property an "unknown additional property". This is actually good as some people have "accidentally" defined "freshness" on their models (likely due to copy/paste of a source), and that property isn't doing anything.
* One single DeprecationsSummary event to rule them all (#11540) * Begin emitting singular deprecations summary, instead of summary per deprecation type * Remove concept of deprecation specific summary events in deprecations module * Drop deprecation summary events that have been added to `feature-branch--11335-deprecations` but not `main` These are safe to drop with no notice because they only ever existed on a feature branch, never main. * Correct code numbers for new events on feature-branch that haven't made it to main yet * Kill `PackageRedirectDeprecationSummary` event, and retire its event code * add changie doc * Update jsonschemas to versions 0.0.110 (#11541) * Update jsonschemas to 0.0.110 * Don't allow additional properties in configs * Don't allow additional top level properties on objects * Allow for 'loaded_at_query' on Sources and Tables * Don't allow additional top level properties in schema files --------- Co-authored-by: Peter Webb <peter.webb@dbtlabs.com>
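The project-file validation workflow described in this commit amounts to: load `dbt_project.yml`, validate it against a packaged JSON Schema, and surface each violation as a deprecation warning rather than a hard error. A minimal sketch of that flow, assuming the `PyYAML` and `jsonschema` libraries and hypothetical function names (dbt-core itself emits structured deprecation events instead of Python warnings):

```python
# Illustrative sketch only; function names are hypothetical, and dbt-core fires
# structured deprecation events rather than Python warnings.
import json
import warnings
from pathlib import Path

import yaml
from jsonschema import Draft7Validator


def _error_path_to_string(error) -> str:
    # Turn a path like ['models', 'my_project', '+foo'] into "models.my_project.+foo";
    # an empty path (a top-level violation) becomes "".
    return ".".join(str(part) for part in error.path)


def validate_project_file(project_yml: Path, schema_path: Path) -> None:
    schema = json.loads(schema_path.read_text())
    project = yaml.safe_load(project_yml.read_text()) or {}
    for error in Draft7Validator(schema).iter_errors(project):
        # Surface every violation as a deprecation-style warning, not an error.
        warnings.warn(
            f"dbt_project.yml violates the project schema at "
            f"'{_error_path_to_string(error)}': {error.message}",
            DeprecationWarning,
        )
```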

The documentation is generated from this commit.

Commit:5c70dee
Author:Quigley Malcolm
Committer:Quigley Malcolm

Begin emitting deprecation events for custom properties found in objects (#11526) * Fix CustomKeyInConfigDeprecationSummary * Add deprecation type for custom properties in YAML objects * Begin emitting deprecation events for custom properties found in objects * Add changie doc * Add `loaded_at_query` property to `_Sources` definition in jsonschema This was breaking the test tests/unit/parser/test_parser.py::SchemaParserSourceTest::test_parse_source_custom_freshness_at_source * Move validating jsonschema of schema files earlier in the process Previously we were validating the jsonschema of schema files in `SchemaParser.parse_file`. However, the file is originally loaded in `yaml_from_file` (which happens before `SchemaParser.parse_file`), and `yaml_from_file` _modifies_ the loaded dictionary to add some additional properties. These additional properties violate the jsonschema unfortunately, and thus we needed to start validating the schema against the jsonschema before any such modifications. * Skip parser tests for `model.freshness` Model freshness never got fully implemented, won't be implemented nor documented for 1.10. As such we're gonna consider the `model.freshness` property an "unknown additional property". This is actually good as some people have "accidentally" defined "freshness" on their models (likely due to copy/paste of a source), and that property isn't doing anything.

Commit:f09a96f
Author:Quigley Malcolm
Committer:Quigley Malcolm

Deprecate having custom keys in config blocks (#11522) * Add deprecation event for custom keys found in configs * Begin checking schema files for custom keys found in configs * Test new CustomConfigInConfigDeprecation event * Add changie doc * Add custom config key deprecation events to event serialization test * Provide message to ValidationError in `SelectorConfig.from_path` This typing error is unrelated to the changes in this PR. However, it was failing CI, so I figured it'd be simple to just fix it. * Add some extra guards around the ValidationFailure `path` and `instance`

Commit:d3f01e0
Author:Quigley Malcolm
Committer:GitHub

One single DeprecationsSummary event to rule them all (#11540) * Begin emitting singular deprecations summary, instead of summary per deprecation type * Remove concept of deprecation specific summary events in deprecations module * Drop deprecation summary events that have been added to `feature-branch--11335-deprecations` but not `main` These are safe to drop with no notice because they only ever existed on a feature branch, never main. * Correct code numbers for new events on feature-branch that haven't made it to main yet * Kill `PackageRedirectDeprecationSummary` event, and retire its event code * add changie doc

The documentation is generated from this commit.

Commit:ae6678e
Author:Quigley Malcolm
Committer:Quigley Malcolm

Kill `PackageRedirectDeprecationSummary` event, and retire its event code

The documentation is generated from this commit.

Commit:b042c25
Author:Quigley Malcolm

Correct code numbers for new events on feature-branch that haven't made it to main yet

Commit:538bfeb
Author:Quigley Malcolm

Drop deprecation summary events that have been added to `feature-branch--11335-deprecations` but not `main` These are safe to drop with no notice because they only ever existed on a feature branch, never main.

Commit:184fba9
Author:Quigley Malcolm

Begin emitting singular deprecations summary, instead of summary per deprecation type

Commit:5de6c8f
Author:Quigley Malcolm
Committer:GitHub

Begin emitting deprecation events for custom properties found in objects (#11526) * Fix CustomKeyInConfigDeprecationSummary * Add deprecation type for custom properties in YAML objects * Begin emitting deprecation events for custom properties found in objects * Add changie doc * Add `loaded_at_query` property to `_Sources` definition in jsonschema This was breaking the test tests/unit/parser/test_parser.py::SchemaParserSourceTest::test_parse_source_custom_freshness_at_source * Move validating jsonschema of schema files earlier in the process Previously we were validating the jsonschema of schema files in `SchemaParser.parse_file`. However, the file is originally loaded in `yaml_from_file` (which happens before `SchemaParser.parse_file`), and `yaml_from_file` _modifies_ the loaded dictionary to add some additional properties. These additional properties violate the jsonschema unfortunately, and thus we needed to start validating the schema against the jsonschema before any such modifications. * Skip parser tests for `model.freshness` Model freshness never got fully implemented, won't be implemented nor documented for 1.10. As such we're gonna consider the `model.freshness` property an "unknown additional property". This is actually good as some people have "accidentally" defined "freshness" on their models (likely due to copy/paste of a source), and that property isn't doing anything.

Commit:fde3bc6
Author:Quigley Malcolm
Committer:Quigley Malcolm

Add deprecation type for custom properties in YAML objects

Commit:7245d03
Author:Quigley Malcolm

Add deprecation type for custom properties in YAML objects

Commit:21a4779
Author:Quigley Malcolm
Committer:GitHub

Deprecate having custom keys in config blocks (#11522) * Add deprecation event for custom keys found in configs * Begin checking schema files for custom keys found in configs * Test new CustomConfigInConfigDeprecation event * Add changie doc * Add custom config key deprecation events to event serialization test * Provide message to ValidationError in `SelectorConfig.from_path` This typing error is unrelated to the changes in this PR. However, it was failing CI, so I figured it'd be simple to just fix it. * Add some extra guards around the ValidationFailure `path` and `instance`

Commit:b1c859d
Author:Peter Webb
Committer:GitHub

Add Check for Duplicate YAML Keys (#11510) * Add functionality to check for duplicate yaml keys, working around PyYAML limitation. * Fix up some ancient typing issues. * Ignore typing issue, for now. * Correct unit tests of `checked_load` * Add event and deprecation types for duplicate yaml keys * Begin validating `dbt_project.yml` for duplicate key violations * Begin checking for duplicate key violations in schema files * Add test to check duplicate keys are checked in schema files * Refactor checked_yaml failure handling to reduce duplicate code * Move `checked_load` utilities to separate file to avoid circular imports * Handle yaml `start_mark` correctly for top level key errors * Update changelog * Fix test. --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com>
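The PyYAML limitation mentioned above is that its standard loaders silently keep only the last value when a mapping repeats a key. A rough sketch of the workaround, with `CheckedLoader` and `checked_load` named after the commit message but otherwise simplified (the real implementation records violations as deprecation events rather than raising):

```python
import yaml


class CheckedLoader(yaml.SafeLoader):
    """SafeLoader variant that flags duplicate mapping keys instead of silently
    keeping the last value (PyYAML's default behaviour)."""

    def construct_mapping(self, node, deep=False):
        seen = set()
        for key_node, _ in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in seen:
                raise yaml.constructor.ConstructorError(
                    None, None,
                    f"duplicate key {key!r} in mapping",
                    key_node.start_mark,  # line/column of the offending key
                )
            seen.add(key)
        return super().construct_mapping(node, deep=deep)


def checked_load(text: str):
    return yaml.load(text, Loader=CheckedLoader)


# The duplicate 'columns' key below would normally be collapsed without a peep.
try:
    checked_load("name: my_model\ncolumns: [id]\ncolumns: [user_id]\n")
except yaml.constructor.ConstructorError as exc:
    print(exc)
```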

Commit:caa8cf2
Author:Peter Webb
Committer:GitHub

Deprecate Custom Top-Level Keys (#11518) * Add specific deprecation for custom top level keys. * Add changelog entry * Add test events

Commit:0b9d371
Author:Quigley Malcolm
Committer:GitHub

Add `UnexpectedJinjaBlockDeprecationSummary` and add file context to `UnexpectedJinjaBlockDeprecation` (#11517) * Add summary event for UnexpectedJinjaBlockDeprecation * Begin including file information in UnexpectedJinjaBlockDeprecation event * Add UnexpectedJinjaBlockDeprecationSummary to test_events.py

Commit:3d707bc
Author:Peter Webb
Committer:GitHub

Add Deprecation Warnings for Unexpected Jinja Blocks (#11514) * Add deprecation warnings on unexpected jinja blocks. * Add changelog entry. * Add test event. * Regen proto types. * Fix event test.

Commit:f1bd3f7
Author:Quigley Malcolm
Committer:GitHub

Begin basic jsonschema validations of dbt_project.yml (#11505) * Add jsonschema for validating the project file * Add utility for helping to load jsonschema resources Currently things are a bit hard coded. We should probably alter this to be a bit more flexible. * Begin validating the `dbt_project.yml` via jsonschema * Begin emitting deprecation warnings for generic jsonschema violations in dbt_project.yml * Move from `DbtInternalError` to `DbtRuntimeError` to avoid circular imports * Add tests for basic jsonschema validation of `dbt_project.yml` * Add changie doc * Add serialization test for new deprecation events * Alter the project jsonschema to not require things that are optional * Add datafiles to package egg * Update inclusion of project jsonschema in setup.py to get files correctly Using the glob spec returns a list of found files. Our previous spec was raising the error `error: can't copy 'dbt/resources/input_schemas/project/*.json': doesn't exist or not a regular file` * Try another approach of adding jsonschema to egg * Add input_schemas dir to MANIFEST.in spec * Drop jsonschema inclusion spec from setup.py * Begin using importlib.resources.files for loading project jsonschema This doesn't currently work with editable installs :'( * Use relative paths for loading jsonschemas instead of importlib Using "importlib" is the blessed way to do this sort of thing. However, that is failing for us on editable installs. This commit switches us to using relative paths. Technically doing this has edge cases, however this is also what we do for the `start_project` used in `dbt init`. So we're going to do the same, for now. We should revisit this soon. * Drop requirement of `__additional_properties__` specified by project jsonschema * Drop requirement for `pre-hook` and `post-hook` specified by project jsonschema * Reset `active_deprecations` global at the end of tests using `project` fixture
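The packaging back-and-forth in this commit comes down to how the bundled JSON Schema file gets located at runtime. A sketch of the two strategies it describes, `importlib.resources` first with a relative-path fallback for editable installs; the package and file names here are illustrative, not dbt-core's actual layout:

```python
# Sketch of the two loading strategies discussed above: importlib.resources (the
# "blessed" way) with a relative-path fallback, since editable installs may not
# expose the packaged JSON files. Package and file names are illustrative.
import json
from importlib.resources import files
from pathlib import Path


def load_project_schema() -> dict:
    try:
        resource = files("dbt") / "resources" / "input_schemas" / "project" / "project.json"
        return json.loads(resource.read_text())
    except (ModuleNotFoundError, FileNotFoundError):
        # Fall back to a path relative to this source file, mirroring what
        # `dbt init` does for its starter project.
        fallback = Path(__file__).parent / "resources" / "input_schemas" / "project" / "project.json"
        return json.loads(fallback.read_text())
```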

Commit:e2e86b7
Author:Quigley Malcolm
Committer:GitHub

General Deprecation Warning Improvements (#11466)

Commit:d7c2e83
Author:Quigley Malcolm

Move resource names with spaces deprecation to using new deprecations manager

Commit:300aa09
Author:Chenyu Li
Committer:GitHub

Support artifact upload (#11419) * wip * reorganize * changie * retry * nits * nits * improve retry, adjust error, adjust host name * adjust logic * pr_feedback * Update .changes/unreleased/Features-20250323-151625.yaml Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com> --------- Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com>

Commit:906e07c
Author:Michelle Ark
Committer:GitHub

Add node_checksum to node_info on structured logs (#11368) * update node_info to include node checksum * changelog entry * Discard changes to dev-requirements.txt --------- Co-authored-by: Chenyu Li <chenyulee777@gmail.com> Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com>

Commit:b0ca125
Author:Peter Webb
Committer:GitHub

Macro Annotations and Inference (#11389) * Default macro argument information from original definitions. * Add argument type and count warnings behind behavior flag. * Add changelog entry. * Make flag test more robust. * Use a unique event for macro annotation warnings, per review. * Add event to test list. * Regenerate core_types_pb2 using protoc 5.28.3 --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com>

Commit:31881d2
Author:Kshitij Aranke
Committer:GitHub

Misc fixes for group info in logging (#11218)

Commit:dcc9a0c
Author:Kshitij Aranke
Committer:GitHub

Create `LogNodeResult` event (#11195) * Create LogNodeResult event * add msg directly during object creation

Commit:7fdd92f
Author:github-actions[bot]
Committer:GitHub

Warn if `concurrent_batches` config is set to `True`, but the available adapter doesn't support it (#11145) (#11154) * Begin producing warning when attempting to force concurrent batches without adapter support Batches of microbatch models can be executed sequentially or concurrently. We try to figure out which to do intelligently. As part of that, we implemented an override, the model config `concurrent_batches`, to allow the user to bypass _some_ of our automatic detection. However, a user _cannot_ force batches to run concurrently if the adapter doesn't support concurrent batches (declaring support is opt in). Thus, if an adapter _doesn't_ support running batches concurrently, and a user tries to force concurrent execution via `concurrent_batches`, then we need to warn the user that that isn't happening. * Add custom event type for warning about invalid `concurrent_batches` config * Fire `InvalidConcurrentBatchesConfig` warning via `warn_or_error` so it can be silenced (cherry picked from commit 6c61cb7f7adbdce8edec35a887d6c766a401e403) Co-authored-by: Quigley Malcolm <QMalcolm@users.noreply.github.com>

Commit:6c61cb7
Author:Quigley Malcolm
Committer:GitHub

Warn if `concurrent_batches` config is set to `True`, but the available adapter doesn't support it (#11145) * Begin producing warning when attempting to force concurrent batches without adapter support Batches of microbatch models can be executed sequentially or concurrently. We try to figure out which to do intelligently. As part of that, we implemented an override, the model config `concurrent_batches`, to allow the user to bypass _some_ of our automatic detection. However, a user _cannot_ force batches to run concurrently if the adapter doesn't support concurrent batches (declaring support is opt in). Thus, if an adapter _doesn't_ support running batches concurrently, and a user tries to force concurrent execution via `concurrent_batches`, then we need to warn the user that that isn't happening. * Add custom event type for warning about invalid `concurrent_batches` config * Fire `InvalidConcurrentBatchesConfig` warning via `warn_or_error` so it can be silenced
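A small sketch of the gating this commit describes: a user can set `concurrent_batches: true`, but batches only actually run concurrently when the adapter has opted in, and forcing it without support should produce a warning. Names are illustrative; dbt-core fires an `InvalidConcurrentBatchesConfig` event via `warn_or_error` so it can be silenced, whereas this sketch uses `warnings.warn` as a stand-in:

```python
import warnings
from typing import Optional


def resolve_concurrent_batches(user_setting: Optional[bool],
                               adapter_supports_concurrency: bool) -> bool:
    """Return whether batches should actually run concurrently."""
    if user_setting and not adapter_supports_concurrency:
        # User forced concurrency, but the adapter never opted in: warn and fall back.
        warnings.warn(
            "concurrent_batches is set to True, but the adapter does not support "
            "running batches concurrently; batches will run sequentially."
        )
        return False
    if user_setting is not None:
        return user_setting
    # No explicit setting: fall back to automatic detection, simplified here to
    # "whatever the adapter supports".
    return adapter_supports_concurrency
```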

Commit:9f5f002
Author:github-actions[bot]
Committer:GitHub

Microbatch first last batch serial (#11072) (#11107) * microbatch: split out first and last batch to run in serial * only run pre_hook on first batch, post_hook on last batch * refactor: internalize parallel to RunTask._submit_batch * Add optional `force_sequential` to `_submit_batch` to allow for skipping parallelism check * Force last batch to run sequentially * Force first batch to run sequentially * Remove batch_idx check in `should_run_in_parallel` `should_run_in_parallel` shouldn't, and no longer needs to, take into consideration where the batch exists in a larger context. The first and last batch for a microbatch model are now forced to run sequentially by `handle_microbatch_model` * Begin skipping batches if first batch fails * Write custom `on_skip` for `MicrobatchModelRunner` to better handle when batches are skipped This was necessary specifically because the default on skip set the `X of Y` part of the skipped log using the `node_index` and the `num_nodes`. If there were 2 nodes and we are on the 4th batch of the second node, we'd get a message like `SKIPPED 4 of 2...` which didn't make much sense. We're likely in a future commit going to add a custom event for logging the start, result, and skipping of batches for better readability of the logs. * Add microbatch pre-hook, post-hook, and sequential first/last batch tests * Fix/Add tests around first batch failure vs latter batch failure * Correct MicrobatchModelRunner.on_skip to handle skipping the entire node Previously `MicrobatchModelRunner.on_skip` only handled when a _batch_ of the model was being skipped. However, that method is also used when the entire microbatch model is being skipped due to an upstream node error. Because we previously _weren't_ handling this second case, it'd cause an unhandled runtime exception. Thus, we now need to check whether we're running a batch or not, and if there is no batch, then use the super's on_skip method. * Correct conditional logic for setting pre- and post-hooks for batches Previously we were doing an if+elif for setting pre- and post-hooks for batches, wherein the `if` matched if the batch wasn't the first batch, and the `elif` matched if the batch wasn't the last batch. The issue with this is that if the `if` was hit, the `elif` _wouldn't_ be hit. This caused the first batch to appropriately not run the `post-hook` but then every batch after would run the `post-hook`. * Add two new event types `LogStartBatch` and `LogBatchResult` * Update MicrobatchModelRunner to use new batch specific log events * Fix event testing * Update microbatch integration tests to catch batch specific event types --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com> (cherry picked from commit 03fdb4c1578d74e25615a4f46fb572ecee0685e0) Co-authored-by: Michelle Ark <MichelleArk@users.noreply.github.com>

Commit:03fdb4c
Author:Michelle Ark
Committer:GitHub

Microbatch first last batch serial (#11072) * microbatch: split out first and last batch to run in serial * only run pre_hook on first batch, post_hook on last batch * refactor: internalize parallel to RunTask._submit_batch * Add optional `force_sequential` to `_submit_batch` to allow for skipping parallelism check * Force last batch to run sequentially * Force first batch to run sequentially * Remove batch_idx check in `should_run_in_parallel` `should_run_in_parallel` shouldn't, and no longer needs to, take into consideration where the batch exists in a larger context. The first and last batch for a microbatch model are now forced to run sequentially by `handle_microbatch_model` * Begin skipping batches if first batch fails * Write custom `on_skip` for `MicrobatchModelRunner` to better handle when batches are skipped This was necessary specifically because the default on skip set the `X of Y` part of the skipped log using the `node_index` and the `num_nodes`. If there were 2 nodes and we are on the 4th batch of the second node, we'd get a message like `SKIPPED 4 of 2...` which didn't make much sense. We're likely in a future commit going to add a custom event for logging the start, result, and skipping of batches for better readability of the logs. * Add microbatch pre-hook, post-hook, and sequential first/last batch tests * Fix/Add tests around first batch failure vs latter batch failure * Correct MicrobatchModelRunner.on_skip to handle skipping the entire node Previously `MicrobatchModelRunner.on_skip` only handled when a _batch_ of the model was being skipped. However, that method is also used when the entire microbatch model is being skipped due to an upstream node error. Because we previously _weren't_ handling this second case, it'd cause an unhandled runtime exception. Thus, we now need to check whether we're running a batch or not, and if there is no batch, then use the super's on_skip method. * Correct conditional logic for setting pre- and post-hooks for batches Previously we were doing an if+elif for setting pre- and post-hooks for batches, wherein the `if` matched if the batch wasn't the first batch, and the `elif` matched if the batch wasn't the last batch. The issue with this is that if the `if` was hit, the `elif` _wouldn't_ be hit. This caused the first batch to appropriately not run the `post-hook` but then every batch after would run the `post-hook`. * Add two new event types `LogStartBatch` and `LogBatchResult` * Update MicrobatchModelRunner to use new batch specific log events * Fix event testing * Update microbatch integration tests to catch batch specific event types --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com>
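Two of the fixes above are easy to get wrong, so here is a small sketch of the corrected logic under illustrative names: pre-hooks belong only to the first batch and post-hooks only to the last (two independent checks, not an `if`/`elif` chain), and the first and last batches are always submitted sequentially:

```python
from typing import List, Tuple


def hooks_for_batch(batch_idx: int, total_batches: int,
                    pre_hooks: List[str], post_hooks: List[str]) -> Tuple[List[str], List[str]]:
    # Independent conditions: a single-batch model (index 0 of 1) gets both hooks.
    run_pre = pre_hooks if batch_idx == 0 else []
    run_post = post_hooks if batch_idx == total_batches - 1 else []
    return run_pre, run_post


def force_sequential(batch_idx: int, total_batches: int) -> bool:
    # First and last batches always run serially; only the ones in between are
    # candidates for parallel execution.
    return batch_idx == 0 or batch_idx == total_batches - 1
```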

Commit:fd6ec71
Author:Michelle Ark
Committer:GitHub

Microbatch parallelism (#10958)

Commit:2a75dd4
Author:Doug Beatty
Committer:GitHub

Parseable JSON and text output in quiet mode for `dbt show` and `dbt compile` (#9958) * Allow `dbt show` and `dbt compile` to output JSON without extra logs * Add `quiet` attribute for ShowNode and CompiledNode messages * Output of protoc compiler * Utilize the `quiet` attribute for ShowNode and CompiledNode * Reuse the `dbt list` approach when the `--quiet` flag is used * Use PrintEvent to get to stdout even if the logger is set to ERROR * Functional tests for quiet compile * Functional tests for quiet show * Fire event same way regardless if LOG_FORMAT is json or not * Switch back to firing ShowNode and CompiledNode events * Make `--inline-direct` to be quiet-compatible * Temporarily change to dev branch for dbt-common * Remove extraneous newline * Functional test for `--quiet` for `--inline-direct` flag * Update changelog entry * Update `core_types_pb2.py` * Restore the original branch in `dev-requirements.txt` --------- Co-authored-by: Kshitij Aranke <kshitij.aranke@dbtlabs.com>
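The quiet-mode behaviour described above is essentially routing: when `--quiet` is set, the rendered preview is written straight to stdout so it stays machine-parseable, instead of going through the logger, which may be filtered to `ERROR`. A generic sketch of that idea (not dbt-core's actual event plumbing):

```python
import logging
import sys

logger = logging.getLogger("dbt")


def emit_preview(rendered: str, quiet: bool) -> None:
    if quiet:
        # Bypass the logger entirely so only the parseable payload reaches stdout.
        sys.stdout.write(rendered + "\n")
    else:
        # Normal path: subject to the configured log level and formatting.
        logger.info(rendered)
```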

Commit:89caa33
Author:Michelle Ark
Committer:GitHub

Replace environment variable with a project flag to gate microbatch functionality (#10799) * first pass: replace os env with project flag * Fix `TestMicrobatchMultipleRetries` to not use `os.env` * Turn off microbatch project flag for `TestMicrobatchCustomUserStrategyDefault` as it was prior to a9df50f * Update `BaseMicrobatchTest` to turn on microbatch via project flags * Add changie doc * Fix functional tests after merging in main * Add function that determines whether the new microbatch functionality should be used The new microbatch functionality is, unfortunately, potentially dangerous. That is, it adds a new materialization strategy `microbatch` which an end user could have defined as a custom strategy previously. Additionally we added config keys to nodes, and as `config` is just a Dict[str, Any], it could contain anything, thus meaning people could already be using the configs we're adding for different purposes. Thus we need some intelligent gating. Specifically something that adheres to the following:

cms = Custom Microbatch Strategy
abms = Adapter Builtin Microbatch Strategy
bf = Behavior flag
umb = Use Microbatch Batching
t/f/e = True/False/Error

| cms | abms | bf | umb |
|-----|------|----|-----|
|  t  |  t   | t  |  t  |
|  f  |  t   | t  |  t  |
|  t  |  f   | t  |  t  |
|  f  |  f   | t  |  e  |
|  t  |  t   | f  |  f  |
|  f  |  t   | f  |  t  |
|  t  |  f   | f  |  f  |
|  f  |  f   | f  |  e  |

(The above table assumes that there is a microbatch model present in the project) In order to achieve this we need to check that either the microbatch behavior flag is set to true OR the microbatch materialization being used is the _root_ microbatch materialization (i.e. not custom). The function we added in this commit, `use_microbatch_batches`, does just that. * Gate microbatch functionality by `use_microbatch_batches` manifest function * Rename microbatch behavior flag to `require_batched_execution_for_custom_microbatch_strategy` * Extract logic of `find_macro_by_name` to `find_macro_candidate_by_name` In 0349968c615444de05360509ddeaf6d75d41d826 I had done this for the function `find_materialization_macro_by_name`, but that wasn't the right function to do it to, and will be reverted shortly. `find_materialization_macro_by_name` is used for finding the general materialization macro, whereas `find_macro_by_name` is more general. For the work we're doing, we need to find the microbatch macro, which is not a materialization macro. * Use `find_macro_candidate_by_name` to find the microbatch macro * Fix microbatch macro locality check to search for `core` locality instead of `root` Previously we were checking for a locality of `root`. However, a locality of `root` means it was provided by a `package`. We want to check for locality of `core` which basically means `builtin via dbt-core/adapters`. There is another locality `imported` which I believe means it comes from another package. * Move the evaluation of `use_microbatch_batches` to the last position in boolean checks The method `use_microbatch_batches` is always invoked to evaluate an `if` statement. In most instances, it is part of a logic chain (i.e. there are multiple things being evaluated in the `if` statement). In `if` statements where there are multiple things being evaluated, `use_microbatch_batches` should come _last_ (or as late as possible). This is because it is likely the most costly thing to evaluate in the logic chain, and thus any shortcuts via other evaluations in the if statement failing (and thus avoiding invoking `use_microbatch_batches`) is desirable.
* Drop behavior flag setting for BaseMicrobatchTest tests * Rename 'env_var' to 'project_flag' in test_microbatch.py * Update microbatch tests to assert when we are/aren't running with batches * Update `test_resolve_event_time_filter` to use `use_microbatch_batches` * Fire deprecation warning for custom microbatch macros * Add microbatch deprecation events to test_events.py --------- Co-authored-by: Quigley Malcolm <quigley.malcolm@dbtlabs.com>
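The truth table above reduces to a short rule: error out if no `microbatch` materialization exists at all; otherwise use batched execution when the behavior flag is on, or when the materialization that would run is the built-in one rather than a custom override. A sketch under those assumptions, with illustrative arguments (the real `use_microbatch_batches` lives on the manifest and inspects macro locality):

```python
def use_microbatch_batches(has_custom_strategy: bool,
                           has_builtin_strategy: bool,
                           behavior_flag: bool) -> bool:
    if not has_custom_strategy and not has_builtin_strategy:
        # A microbatch model exists, but no 'microbatch' materialization does.
        raise RuntimeError("no 'microbatch' materialization strategy is available")
    if behavior_flag:
        # Flag opted in: always use batched execution.
        return True
    # Flag off: only use batched execution when the materialization that would
    # run is the built-in one, i.e. no custom override shadows it.
    return not has_custom_strategy
```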

Commit:e26af57
Author:Devon Fulcher
Committer:GitHub

Behavior change cumulative type param (#10909) * Behavior change for mf timespine without yaml config * Flipping behavior flag causes parse error * Added more tests * Appending just one error

Commit:8a17a0d
Author:Devon Fulcher
Committer:GitHub

Behavior change for mf timespine without yaml configuration (#10857)

Commit:8c6bec4
Author:Quigley Malcolm
Committer:GitHub

Emit `ArtifactWritten` event when artifacts are written (#10940) * Add new `ArtifactWritten` event * Emit ArtifactWritten event whenever an artifact is written * Get artifact_type from class name for ArtifactWritten event * Add changie docs * Add test to check that ArtifactWritten events are being emitted * Regenerate core_types_pb2.py using correct protobuf version * Regen core_types_pb2 again, using a more correct protoc version
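Deriving the `artifact_type` from the artifact's class name, as this commit describes, is a one-liner; a minimal illustration (the event and field names follow the commit message, the surrounding code is hypothetical):

```python
from dataclasses import dataclass


@dataclass
class ArtifactWritten:
    artifact_type: str
    artifact_path: str


def artifact_written_event(artifact, path: str) -> ArtifactWritten:
    # e.g. a Manifest instance yields artifact_type == "Manifest"
    return ArtifactWritten(artifact_type=type(artifact).__name__, artifact_path=path)
```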

Commit:6b5db17
Author:Michelle Ark
Committer:GitHub

raise MicrobatchModelNoEventTimeInputs warning when no microbatch input has event_time config (#10929)

Commit:c018e9d
Author:Quigley Malcolm

Add new `ArtifactWritten` event

Commit:73896ca
Author:Michelle Ark

merge

Commit:1076352
Author:Kshitij Aranke
Committer:GitHub

[CORE-388] Add group metadata info to `LogModelResult` and `LogTestResult` (#10775)

Commit:1fd4d2e
Author:Quigley Malcolm
Committer:GitHub

Enable `retry` support for Microbatch models (#10751) * Add `PartialSuccess` status type and use it for microbatch models with mixed results * Handle `PartialSuccess` in `interpret_run_result` * Add `BatchResults` object to `BaseResult` and begin tracking during microbatch runs * Ensure batch_results are propagated to `run_results` artifact * Move `batch_results` from `BaseResult` class to `RunResult` class * Move `BatchResults` and `BatchType` to separate artifacts file to avoid circular imports In our next commit we're gonna modify `dbt/contracts/graph/nodes.py` to import the `BatchType` as part of our work to implement dbt retry for microbatch model nodes. Unfortunately, the import in `nodes.py` creates a circular dependency because `dbt/artifacts/schemas/results.py` imports from `nodes.py` and `dbt/artifacts/schemas/run/v5/run.py` imports from that `results.py`. Thus the new import creates a circular import. Now this _shouldn't_ be necessary as nothing in artifacts should import from the rest of dbt-core. However, we do. We should fix this, but this is also out of scope for this segment of work. * Add `PartialSuccess` as a retry-able status, and use batches to retry microbatch models * Fix BatchType type so that the first datetime is no longer Optional * Ensure `PartialSuccess` causes skipping of downstream nodes * Alter `PartialSuccess` status to be considered an error in `interpret_run_result` * Update schemas and test artifacts to include new batch_results run results key * Add functional test to check that 'dbt retry' retries 'PartialSuccess' models * Update partition failure test to assert downstream models are skipped * Improve `success`/`error`/`partial success` messaging for microbatch models * Include `PartialSuccess` in status that `--fail-fast` counts as a failure * Update `LogModelResult` to handle partial successes * Update `EndOfRunSummary` to handle partial successes * Cleanup TODO comment * Raise a DbtInternalError if we get a batch run result without `batch_results` * When running a microbatch model with supplied batches, force non full-refresh behavior This is necessary because of retry. Say on the initial run the microbatch model succeeds on 97% of its batches. Then on retry it does the last 3%. If the retry of the microbatch model executes in full refresh mode it _might_ blow away the 97% of work that has been done. This edge case seems to be adapter specific. * Only pass batches to retry for microbatch model when there was a PartialSuccess In the previous commit we made it so that retries of microbatch models wouldn't run in full refresh mode when the microbatch model to retry has batches already specified from the prior run. This is only problematic when the run being retried was a full refresh AND all the batches for a given microbatch model failed. In that case WE DO want to do a full refresh for the given microbatch model. To better outline the problem, consider the following: * a microbatch model had a begin of `2020-01-01` and has been running this way for a while * the begin config has changed to `2024-01-01` and dbt run --full-refresh gets run * every batch for a microbatch model fails * on dbt retry the relation is said to exist, and the now out of range data (2020-01-01 through 2023-12-31) is never purged To avoid this, all we have to do is ONLY pass the batch information for partially successful microbatch models. Note: microbatch models only have a partially successful status IFF they have both successful and failed batches.
* Fix test_manifest unit tests to know about model 'batches' key * Add some console output assertions to microbatch functional tests * add batch_results: None to expected_run_results * Add changie doc for microbatch retry functionality * maintain protoc version 5.26.1 * Cleanup extraneous comment in LogModelResult --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>
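The retry rule spelled out above can be summarised as: only a `PartialSuccess` hands its batch information to `dbt retry`; a fully failed model is replanned from scratch so a retried `--full-refresh` can still rebuild the relation. A sketch under those assumptions, with illustrative types:

```python
from datetime import datetime
from enum import Enum
from typing import List, Optional, Tuple


class RunStatus(str, Enum):
    SUCCESS = "success"
    ERROR = "error"
    PARTIAL_SUCCESS = "partial success"


Batch = Tuple[datetime, datetime]  # (batch start, batch end)


def batches_to_retry(status: RunStatus, failed_batches: List[Batch]) -> Optional[List[Batch]]:
    if status == RunStatus.PARTIAL_SUCCESS:
        # Some batches succeeded: retry only the failed ones and keep existing data
        # (i.e. do not run the retried model in full-refresh mode).
        return failed_batches
    # Success needs no retry; a fully failed model returns no pre-supplied batches,
    # so the retried run plans its batches (and any full refresh) from scratch.
    return None
```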

Commit:394b91b
Author:Gerda Shank
Committer:Gerda Shank

Warn if timestamp updated_at field uses incompatible timestamp (#10352) Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:c28cb92
Author:Gerda Shank
Committer:GitHub

Warn if timestamp updated_at field uses incompatible timestamp (#10352) Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:3695698
Author:Kshitij Aranke
Committer:GitHub

[CORE-364] Add `group` info to `RunResultError`, `RunResultFailure`, `RunResultWarning` log lines (#10535)

Commit:a309283
Author:Gerda Shank
Committer:GitHub

Add additional logging information to capture skipped ephemeral model info (#10390)

Commit:3c82a02
Author:Michelle Ark
Committer:GitHub

deprecate materialization overrides from imported packages (#9971) (#10008)

Commit:a36057d
Author:Jeremy Cohen
Committer:GitHub

Consistent behavior change flag names + deprecation warnings (#10063) * Rename flags * Attempt refactor of flag require_resource_names_without_spaces * Add deprecation warning for source_freshness_run_project_hooks * Switch require_explicit_package_overrides_for_builtin_materializations default to True * Add changelog * fix core_types_pb2.py * add playbook on introducing + maintaining behaviour change flags * be less canadian * refactor: use hooks in FreshnessTask.get_hooks_by_type * changelog entry for behaviour change * Update docs link --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:87ac4de
Author:Michelle Ark
Committer:GitHub

[Backport] deprecate materialization overrides from imported packages (#9998)

Commit:0290cf7
Author:Michelle Ark
Committer:GitHub

deprecate materialization overrides from imported packages (#9971)

Commit:f15e128
Author:Quigley Malcolm
Committer:GitHub

Begin warning people about spaces in model names (#9886) * Add event type for deprecation of spaces in model names * Begin emitting deprecation warning for spaces in model names * Only warn on first model name with spaces unless `--debug` is specified For projects with a lot of models that have spaces in their names, the warning about this deprecation would be incredibly annoying. Now we instead only log the first model name issue and then a count of how many models have the issue, unless `--debug` is specified. * Refactor `EventCatcher` so that the event to catch is settable We want to be able to catch more than just `SpacesInModelNameDeprecation` events, and in the next commit we will alter our tests to do so. Thus instead of writing a new catcher for each event type, a slight modification to the existing `EventCatcher` makes this much easier. * Add project flag to control whether spaces are allowed in model names * Log errors and raise exception when `allow_spaces_in_model_names` is `False` * Use custom event for outputting invalid name counts instead of `Note` events Using `Note` events was causing test flakiness when run in a multi worker environment using `pytest -nauto`. This is because the event manager is currently a global. So in a situation where test `A` starts and test `tests_debug_when_spaces_in_name` starts shortly thereafter, the event manager for both tests will have the callbacks set in `tests_debug_when_spaces_in_name`. Then if something in test `A` fired a `Note` event, this would affect the count of `Note` events that `tests_debug_when_spaces_in_name` sees, causing assertion failures. By creating a custom event, `TotalModelNamesWithSpacesDeprecation`, we limit the possible flakiness to only tests that fire the custom event. Thus we didn't _eliminate_ all possibility of flakiness, but realistically only the tests in `test_check_for_spaces_in_model_names.py` can now interfere with each other. Which still isn't great, but to fully resolve the problem we need to work on how the event manager is handled (preferably not globally). * Always log total invalid model names if at least one Previously we only logged out the count of how many invalid model names there were if there were two or more invalid names (and not in debug mode). However this message is important if there is even one invalid model name and regardless of whether you are running debug mode. That is because automated tools might be looking for the event type to track if anything is wrong. A related change in this commit is that we now only output the debug hint if it wasn't run with debug mode. The idea being that if they are already running it in debug mode, the hint could come across as somewhat patronizing. * Reduce duplicate `if` logic in `check_for_spaces_in_model_names` * Improve readability of logs related to problematic model names We want people running dbt to be able to see, at a glance, warnings/errors from running their project. In this case we are focused specifically on errors/warnings in regards to model names containing spaces. Previously we were only ever emitting the `warning_tag` in the message even if the event itself was being emitted at an `ERROR` level. We now properly have `[ERROR]` or `[WARNING]` in the message depending on the level. Unfortunately we couldn't just look at what level the event was being fired at, because that information doesn't exist on the event itself. Additionally, we're using events based off of `DynamicEvents`, which are unfortunately hard coded to `DEBUG`.
Changing this would involve still having a `level` property on the definition in `core_types.proto` and then having `DynamicEvent`s look to `self.level` in the `level_tag` method. Then we could change how firing events works based on an event's `level_tag` return value. This all sounds like a bit of tech debt suited for a PR, possibly multiple, and thus is not being done here. * Alter `TotalModelNamesWithSpacesDeprecation` message to handle singular and plural * Remove duplicate import in `test_graph.py` introduced from merging in main --------- Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com>
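A compact sketch of the warn-or-error behaviour described above, with hypothetical function and flag names: every offending model is logged in debug mode, otherwise only the first plus a total count (which is always emitted so tooling can key off a single message), and the check raises instead of warning when spaces are disallowed:

```python
import logging

logger = logging.getLogger("dbt")


def check_for_spaces_in_model_names(model_names, allow_spaces: bool, debug: bool) -> None:
    offenders = [name for name in model_names if " " in name]
    if not offenders:
        return
    log = logger.warning if allow_spaces else logger.error
    for name in (offenders if debug else offenders[:1]):
        log("Model name '%s' contains spaces", name)
    # Always emit the total when there is at least one offender, so automated
    # tooling can key off a single, predictable message.
    log("%d model name(s) contain spaces", len(offenders))
    if not allow_spaces:
        # Stand-in for dbt's own parsing error type.
        raise RuntimeError("spaces in model names are not allowed")
```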

Commit:cfaacc6
Author:Gerda Shank
Committer:GitHub

Include node_info in various Result events (#9820)

Commit:865b09b
Author:Jeremy Cohen
Committer:GitHub

Clearer no-op logging in stubbed SavedQueryRunner (#9605) * Clearer no-op logging in stubbed SavedQueryRunner * Add changelog entry * Fix unit test * More logging touchups * Fix failing test * Rename flag + refactor per #9629 * Fix failing test * regenerate core_proto_types with libprotoc 25.3 --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:bbedd73
Author:Jeremy Cohen

Include columns in show output

Commit:6517c75
Author:Gerda Shank
Committer:GitHub

[Backport 1.7.latest] Store node_info in GenericExceptionOnRun logging event (#9568) * Add node_info to GenericExceptionOnRun, InternalErrorOnRun & SQLRunnerException * Changie * Formatting

Commit:2411f93
Author:Gerda Shank
Committer:GitHub

Store node_info in GenericExceptionOnRun logging event (#9559)

Commit:2f2e0ce
Author:colin-rogers-dbt
Committer:GitHub

delete dbt/adapters and add dbt-adapters package (#9401) * delete dbt/adapters * update dbt-adapters requirement * fix dev-requirements.txt * update dev-requirements.txt * add changie

Commit:b5a0c4c
Author:Gerda Shank
Committer:GitHub

Unit testing feature branch pull request (#8411) * Initial implementation of unit testing (from pr #2911) Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> * 8295 unit testing artifacts (#8477) * unit test config: tags & meta (#8565) * Add additional functional test for unit testing selection, artifacts, etc (#8639) * Enable inline csv format in unit testing (#8743) * Support unit testing incremental models (#8891) * update unit test key: unit -> unit-tests (#8988) * convert to use unit test name at top level key (#8966) * csv file fixtures (#9044) * Unit test support for `state:modified` and `--defer` (#9032) Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> * Allow use of sources as unit testing inputs (#9059) * Use daff for diff formatting in unit testing (#8984) * Fix #8652: Use seed file from disk for unit testing if rows not specified in YAML config (#9064) Co-authored-by: Michelle Ark <MichelleArk@users.noreply.github.com> Fix #8652: Use seed value if rows not specified * Move unit testing to test and build commands (#9108) * Enable unit testing in non-root packages (#9184) * convert test to data_test (#9201) * Make fixtures files full-fledged members of manifest and enable partial parsing (#9225) * In build command run unit tests before models (#9273) --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> Co-authored-by: Michelle Ark <MichelleArk@users.noreply.github.com> Co-authored-by: Emily Rockman <emily.rockman@dbtlabs.com> Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Kshitij Aranke <kshitij.aranke@dbtlabs.com>

Commit:15704ab
Author:Emily Rockman
Committer:GitHub

remove dbt/common req in favor of dbt-common dependency (#9368) * replace dbt/common with dbt-common * update requirements, remove colorama * remove dbt-common unit tests * WIP * some cleanup * update imports from dbt.common to dbt_common * remove tests/unit/common * changelog entry * remove commented out code * move cache exceptions to dbt/adapters (#9361) * point to dbt-common main * Move the contents of dbt.contracts.results to a new dbt.artifacts directory (#9350) * conflict resolution cleanup * cleanup * add ignoreb --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> Co-authored-by: Michelle Ark <MichelleArk@users.noreply.github.com> Co-authored-by: Gerda Shank <gerda@dbtlabs.com>

Commit:9160738
Author:Gerda Shank
Committer:Gerda Shank

Merge branch 'main' into unit_testing_feature_branch

Commit:a1f78a8
Author:Michelle Ark
Committer:GitHub

Revert "replace dbt/common with dbt-common" (#9365)

Commit:00f4a25
Author:Michelle Ark
Committer:GitHub

replace dbt/common with dbt-common (#9342)

Commit:e4ddbc8
Author:Michelle Ark

replace dbt/common with dbt-common

Commit:e42b7ca
Author:Gerda Shank
Committer:GitHub

Move UserConfig to dbt_project.yml and rename to ProjectFlags (#9289)

Commit:48d9a67
Author:Michelle Ark
Committer:GitHub

move events only used by core from dbt/common/events to dbt/events (#9326)

Commit:56dfb34
Author:Gerda Shank
Committer:Gerda Shank

Merge branch 'main' into unit_testing_feature_branch

Commit:7763212
Author:Michelle Ark
Committer:GitHub

Feature/decouple adapters from core (#8906) * remove dbt.contracts.connection imports from adapter module * Move events to common (#8676) * Move events to common * More Type Annotations (#8536) * Extend use of type annotations in the events module. * Add return type of None to more __init__ definitions. * Still more type annotations adding -> None to __init__ * Tweak per review * Allow adapters to include python package logging in dbt logs (#8643) * add set_package_log_level functionality * set package handler * set package handler * add logging about stting up logging * test event log handler * add event log handler * add event log level * rename package and add unit tests * revert logfile config change * cleanup and add code comments * add changie * swap function for dict * add additional unit tests * fix unit test * update README and protos * fix formatting * update precommit --------- Co-authored-by: Peter Webb <peter.webb@dbtlabs.com> * fix import * move types_pb2.py from events to common/events * move agate_helper into common * Add utils module (#8910) * moving types_pb2.py to common/events * split out utils into core/common/adapters * add changie * remove usage of dbt.config.PartialProject from dbt/adapters (#8909) * remove usage of dbt.config.PartialProject from dbt/adapters * add changie --------- Co-authored-by: Colin <colin.rogers@dbtlabs.com> * move agate_helper unit tests under tests/unit/common * move agate_helper into common (#8911) * move agate_helper into common * add changie --------- Co-authored-by: Colin <colin.rogers@dbtlabs.com> * remove dbt.flags.MP_CONTEXT usage in dbt/adapters (#8931) * remove dbt.flags.LOG_CACHE_EVENTS usage in dbt/adapters (#8933) * Refactor Base Exceptions (#8989) * moving types_pb2.py to common/events * Refactor Base Exceptions * update make_log_dir_if_missing to handle str * move remaining adapters exception imports to common/adapters --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> * Remove usage of dbt.deprecations in dbt/adapters, enable core & adapter-specific (#9051) * Decouple adapter constraints from core (#9054) * Move constraints to dbt.common * Move constraints to contracts folder, per review * Add a changelog entry. * move include/global_project to adapters (#8930) * remove adapter.get_compiler (#9134) * Move adapter logger to adapters (#9165) * moving types_pb2.py to common/events * Move AdapterLogger to adapter folder * add changie * delete accidentally merged types_pb2.py * Move the semver package to common and alter references. (#9166) * Move the semver package to common and alter references. * Alter leftover references to dbt.semver, this time using from syntax. 
--------- Co-authored-by: Mila Page <versusfacit@users.noreply.github.com> * Refactor EventManager setup and interaction (#9180) * moving types_pb2.py to common/events * move event manager setup back to core, remove ref to global EVENT_MANAGER and clean up event manager functions * move invocation_id from events to first class common concept * move lowercase utils to common * move lowercase utils to common * ref CAPTURE_STREAM through method * add changie * first pass: adapter migration script (#9160) * Decouple macro generator from adapters (#9149) * Remove usage of dbt.contracts.relation in dbt/adapters (#9207) * Remove ResultNode usage from connections (#9211) * Add RelationConfig Protocol for use in Relation.create_from (#9210) * move relation contract to dbt.adapters * changelog entry * first pass: clean up relation.create_from * type ignores * type ignore * changelog entry * update RelationConfig variable names * Merge main into feature/decouple-adapters-from-core (#9240) * moving types_pb2.py to common/events * Restore warning on unpinned git packages (#9157) * Support --empty flag for schema-only dry runs (#8971) * Fix ensuring we produce valid jsonschema artifacts for manifest, catalog, sources, and run-results (#9155) * Drop `all_refs=True` from jsonschema-ization build process Passing `all_refs=True` makes it so that Everything is a ref, even the top level schema. In jsonschema land, this essentially makes the produced artifact not a full schema, but a fractal object to be included in a schema. Thus when `$id` is passed in, jsonschema tools blow up because `$id` is for identifying a schema, which we explicitly weren't creating. The alternative was to drop the inclusion of `$id`. Howver, we're intending to create a schema, and having an `$id` is recommended best practice. Additionally since we were intending to create a schema, not a fractal, it seemed best to create to full schema. * Explicity produce jsonschemas using DRAFT_2020_12 dialect Previously were were implicitly using the `DRAFT_2020_12` dialect through mashumaro. It felt wise to begin explicitly specifying this. First, it is closest in available mashumaro provided dialects to what we produced pre 1.7. Secondly, if mashumaro changes its default for whatever reason (say a new dialect is added, and mashumaro moves to that), we don't want to automatically inherit that. * Bump manifest version to v12 Core 1.7 released with manifest v11, and we don't want to be overriding that with 1.8. It'd be weird for 1.7 and 1.8 to both have v11 manifests, but for them to be different, right? * Begin including schema dialect specification in produced jsonschema In jsonschema's documentation they state > It's not always easy to tell which draft a JSON Schema is using. > You can use the $schema keyword to declare which version of the JSON Schema specification the schema is written to. > It's generally good practice to include it, though it is not required. and > For brevity, the $schema keyword isn't included in most of the examples in this book, but it should always be used in the real world. Basically, to know how to parse a schema, it's important to include what schema dialect is being used for the schema specification. The change in this commit ensures we include that information. 
* Create manifest v12 jsonschema specification * Add change documentation for jsonschema schema production fix * Bump run-results version to v6 * Generate new v6 run-results jsonschema * Regenerate catalog v1 and sources v3 with fixed jsonschema production * Update tests to handle bumped versions of manifest and run-results --------- Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Michelle Ark <MichelleArk@users.noreply.github.com> Co-authored-by: Quigley Malcolm <QMalcolm@users.noreply.github.com> * Move BaseConfig to Common (#9224) * moving types_pb2.py to common/events * move BaseConfig and assorted dependencies to common * move ShowBehavior and OnConfigurationChange to common * add changie * Remove manifest from catalog and connection method signatures (#9242) * Add MacroResolverProtocol, remove lazy loading of manifest in adapter.execute_macro (#9243) * remove manifest from adapter.execute_macro, replace with MacroResolver + remove lazy loading * rename to MacroResolverProtocol * pass MacroResolverProtcol in adapter.calculate_freshness_from_metadata * changelog entry * fix adapter.calculate_freshness call * pass context to MacroQueryStringSetter (#9248) * moving types_pb2.py to common/events * remove manifest from adapter.execute_macro, replace with MacroResolver + remove lazy loading * rename to MacroResolverProtocol * pass MacroResolverProtcol in adapter.calculate_freshness_from_metadata * changelog entry * fix adapter.calculate_freshness call * pass context to MacroQueryStringSetter * changelog entry --------- Co-authored-by: Colin <colin.rogers@dbtlabs.com> * add macro_context_generator on adapter (#9251) * moving types_pb2.py to common/events * remove manifest from adapter.execute_macro, replace with MacroResolver + remove lazy loading * rename to MacroResolverProtocol * pass MacroResolverProtcol in adapter.calculate_freshness_from_metadata * changelog entry * fix adapter.calculate_freshness call * add macro_context_generator on adapter * fix adapter test setup * changelog entry * Update parser to support conversion metrics (#9173) * added ConversionTypeParams classes * updated parser for ConversionTypeParams * added step to populate input_measure for conversion metrics * version bump on DSI * comment back manifest generating line * updated v12 schemas * added tests * added changelog * Add typing for macro_context_generator, fix query_header_context --------- Co-authored-by: Colin <colin.rogers@dbtlabs.com> Co-authored-by: William Deng <33618746+WilliamDee@users.noreply.github.com> * Pass mp_context to adapter factory (#9275) * moving types_pb2.py to common/events * require core to pass mp_context to adapter factory * add changie * fix SpawnContext annotation * Fix include for decoupling (#9286) * moving types_pb2.py to common/events * fix include path in MANIFEST.in * Fix include for decoupling (#9288) * moving types_pb2.py to common/events * fix include path in MANIFEST.in * add index.html to in MANIFEST.in * move system client to common (#9294) * moving types_pb2.py to common/events * move system.py to common * add changie update README * remove dbt.utils from semver.py * remove aliasing connection_exception_retry * Update materialized views to use RelationConfigs and remove refs to dbt.utils (#9291) * moving types_pb2.py to common/events * add AdapterRuntimeConfig protocol and clean up dbt-postgress core imports * add changie * remove AdapterRuntimeConfig * update changelog * Add config field to RelationConfig (#9300) * moving types_pb2.py to common/events * 
add config field to RelationConfig * merge main into feature/decouple-adapters-from-core (#9305) * moving types_pb2.py to common/events * Update parser to support conversion metrics (#9173) * added ConversionTypeParams classes * updated parser for ConversionTypeParams * added step to populate input_measure for conversion metrics * version bump on DSI * comment back manifest generating line * updated v12 schemas * added tests * added changelog * Remove `--dry-run` flag from `dbt deps` (#9169) * Rm --dry-run flag for dbt deps * Add changelog entry * Update test * PR feedback * adding clean_up methods to basic and unique_id tests (#9195) * init attempt of adding clean_up methods to basic and unique_id tests * swapping cleanup method drop of test_schema to unique_schema to test breakage on docs_generate test * moving the clean_up method down into class BaseDocsGenerate * remove drop relation for unique_schema * manually define alternate_schema for clean_up as not being seen as part of project_config * add changelog * remove unneeded changelog * uncomment line that generates new manifest and delete manifest our changes created * make sure the manifest test is deleted and readd older version of manifest.json to appease test * manually revert file to previous commit * Revert "manually revert file to previous commit" This reverts commit a755419e8b0ea9bd9ac5fbbcc4b14ba7fc6cec14. --------- Co-authored-by: William Deng <33618746+WilliamDee@users.noreply.github.com> Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Matthew McKnight <91097623+McKnight-42@users.noreply.github.com> * resolve merge conflict on unparsed.py (#9309) * moving types_pb2.py to common/events * Update parser to support conversion metrics (#9173) * added ConversionTypeParams classes * updated parser for ConversionTypeParams * added step to populate input_measure for conversion metrics * version bump on DSI * comment back manifest generating line * updated v12 schemas * added tests * added changelog * Remove `--dry-run` flag from `dbt deps` (#9169) * Rm --dry-run flag for dbt deps * Add changelog entry * Update test * PR feedback * adding clean_up methods to basic and unique_id tests (#9195) * init attempt of adding clean_up methods to basic and unique_id tests * swapping cleanup method drop of test_schema to unique_schema to test breakage on docs_generate test * moving the clean_up method down into class BaseDocsGenerate * remove drop relation for unique_schema * manually define alternate_schema for clean_up as not being seen as part of project_config * add changelog * remove unneeded changelog * uncomment line that generates new manifest and delete manifest our changes created * make sure the manifest test is deleted and readd older version of manifest.json to appease test * manually revert file to previous commit * Revert "manually revert file to previous commit" This reverts commit a755419e8b0ea9bd9ac5fbbcc4b14ba7fc6cec14. 
--------- Co-authored-by: William Deng <33618746+WilliamDee@users.noreply.github.com> Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Matthew McKnight <91097623+McKnight-42@users.noreply.github.com> * Resolve unparsed.py conflict (#9311) * Update parser to support conversion metrics (#9173) * added ConversionTypeParams classes * updated parser for ConversionTypeParams * added step to populate input_measure for conversion metrics * version bump on DSI * comment back manifest generating line * updated v12 schemas * added tests * added changelog * Remove `--dry-run` flag from `dbt deps` (#9169) * Rm --dry-run flag for dbt deps * Add changelog entry * Update test * PR feedback * adding clean_up methods to basic and unique_id tests (#9195) * init attempt of adding clean_up methods to basic and unique_id tests * swapping cleanup method drop of test_schema to unique_schema to test breakage on docs_generate test * moving the clean_up method down into class BaseDocsGenerate * remove drop relation for unique_schema * manually define alternate_schema for clean_up as not being seen as part of project_config * add changelog * remove unneeded changelog * uncomment line that generates new manifest and delete manifest our changes created * make sure the manifest test is deleted and readd older version of manifest.json to appease test * manually revert file to previous commit * Revert "manually revert file to previous commit" This reverts commit a755419e8b0ea9bd9ac5fbbcc4b14ba7fc6cec14. --------- Co-authored-by: William Deng <33618746+WilliamDee@users.noreply.github.com> Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Matthew McKnight <91097623+McKnight-42@users.noreply.github.com> --------- Co-authored-by: colin-rogers-dbt <111200756+colin-rogers-dbt@users.noreply.github.com> Co-authored-by: Peter Webb <peter.webb@dbtlabs.com> Co-authored-by: Colin <colin.rogers@dbtlabs.com> Co-authored-by: Mila Page <67295367+VersusFacit@users.noreply.github.com> Co-authored-by: Mila Page <versusfacit@users.noreply.github.com> Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com> Co-authored-by: Quigley Malcolm <QMalcolm@users.noreply.github.com> Co-authored-by: William Deng <33618746+WilliamDee@users.noreply.github.com> Co-authored-by: Matthew McKnight <91097623+McKnight-42@users.noreply.github.com> Co-authored-by: Chenyu Li <chenyu.li@dbtlabs.com>
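
The decoupling commits above replace direct Manifest access in adapter.execute_macro with a narrow resolver protocol. A minimal sketch of that seam using typing.Protocol follows; the names MacroResolverProtocol and find_macro_by_name come from the commit messages, but the exact fields and signature details here are assumptions rather than the actual dbt-core API.

```python
from typing import Any, Dict, Optional, Protocol


class Macro(Protocol):
    """Stand-in for a parsed macro node; only the attributes the adapter needs."""
    name: str
    package_name: str


class MacroResolverProtocol(Protocol):
    """Narrow interface the adapter can depend on instead of the full Manifest."""

    def find_macro_by_name(
        self, name: str, root_project_name: str, package: Optional[str]
    ) -> Optional[Macro]:
        ...


def execute_macro(
    macro_name: str,
    macro_resolver: MacroResolverProtocol,
    project_name: str,
    kwargs: Optional[Dict[str, Any]] = None,
) -> Any:
    """Resolve a macro through the protocol rather than a lazily-loaded Manifest."""
    macro = macro_resolver.find_macro_by_name(macro_name, project_name, package=None)
    if macro is None:
        raise RuntimeError(f"Macro '{macro_name}' not found")
    # ... render and run the macro with `kwargs` here ...
    return macro
```

The point of the protocol is that an adapter package only needs something that can look up macros, so core can hand it the Manifest (which satisfies the protocol) without the adapter importing core's Manifest type.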

Commit:fe82ef2
Author:Colin

split adapters factory into client and load_adapter

Commit:a570a2c
Author:Emily Rockman
Committer:GitHub

convert test to data_test (#9201) * convert test to data_test * generate proto types * fixing tests * add tests * add more tests * test cleanup * WIP * fix graph * fix testing manifest * set resource type back to test and reset unique id * reset expected run results * cleanup * changie * modify to only look for tests under columns in schema files * stop using dashes
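
The data_test conversion renames the `tests:` key in schema YAML while keeping the old spelling working. A small illustrative sketch of that kind of key normalization when reading a column entry; the helper name and exact behavior are assumptions, not the actual parser code, which also handles the resource-type bookkeeping mentioned above.

```python
from typing import Any, Dict, List


def normalize_column_tests(column: Dict[str, Any]) -> List[Any]:
    """Accept both the legacy `tests:` key and the newer `data_tests:` key.

    Illustrative only: the real parser also warns on the legacy spelling.
    """
    if "data_tests" in column and "tests" in column:
        raise ValueError(
            f"Column {column.get('name', '<unnamed>')!r} sets both 'tests' and 'data_tests'"
        )
    return column.get("data_tests", column.get("tests", []))


# Example: a column entry as it would appear after YAML parsing.
print(normalize_column_tests({"name": "id", "data_tests": ["unique", "not_null"]}))
```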

Commit:09f5bb3
Author:Jeremy Cohen
Committer:GitHub

Backport #9147 to 1.7.latest (#9156) * Fixups for deps lock file (#9147) * Update git revision with commit SHA * Use PackageRenderer for lock file * add unit tests for git and tarball packages * deepcopy unrendered_packages_data before iteration, fix remaining tests * Add functional tests * Add changelog entries * Assert one more --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com> * Restore warning on unpinned git packages --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:12bd1e8
Author:Michelle Ark

Merge branch 'main' into feature/decouple-adapters-from-core

Commit:32fde75
Author:Jeremy Cohen
Committer:GitHub

Fixups for deps lock file (#9147) * Update git revision with commit SHA * Use PackageRenderer for lock file * add unit tests for git and tarball packages * deepcopy unrendered_packages_data before iteration, fix remaining tests * Add functional tests * Add changelog entries * Assert one more --------- Co-authored-by: Michelle Ark <michelle.ark@dbtlabs.com>

Commit:cdb26be
Author:Emily Rockman
Committer:Emily Rockman

update to just pass around data-tests

Commit:e56a5da
Author:Michelle Ark
Committer:GitHub

Remove usage of dbt.deprecations in dbt/adapters, enable core & adapter-specific (#9051)

Commit:c0da090
Author:Doug Beatty

Add new event `TypeCodeNotFound` (E050)
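
`TypeCodeNotFound` follows the usual pattern for proto-backed warning events: a message type added to the .proto file plus a thin Python class that supplies the event code and rendered message. A rough sketch of that shape with the base class simplified; only the event name and the E050 code come from the commit, the field and message text are assumptions.

```python
from dataclasses import dataclass


@dataclass
class WarnLevel:
    """Simplified stand-in for dbt's warning-level event base class."""

    def level_tag(self) -> str:
        return "warn"


@dataclass
class TypeCodeNotFound(WarnLevel):
    """Fired when a cursor returns a type code the adapter cannot map to a name."""

    type_code: int = 0

    def code(self) -> str:
        return "E050"

    def message(self) -> str:
        return f"The type_code {self.type_code} was not recognized"


# Example: render the warning the way a log line might.
event = TypeCodeNotFound(type_code=1043)
print(f"[{event.code()}] {event.message()}")
```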

Commit:c141148
Author:colin-rogers-dbt
Committer:Michelle Ark

Move events to common (#8676) * Move events to common * More Type Annotations (#8536) * Extend use of type annotations in the events module. * Add return type of None to more __init__ definitions. * Still more type annotations adding -> None to __init__ * Tweak per review * Allow adapters to include python package logging in dbt logs (#8643) * add set_package_log_level functionality * set package handler * set package handler * add logging about setting up logging * test event log handler * add event log handler * add event log level * rename package and add unit tests * revert logfile config change * cleanup and add code comments * add changie * swap function for dict * add additional unit tests * fix unit test * update README and protos * fix formatting * update precommit --------- Co-authored-by: Peter Webb <peter.webb@dbtlabs.com>
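
The package-logging change above routes records from third-party Python loggers into dbt's structured event stream. A minimal sketch of that idea, assuming a handler that wraps each stdlib LogRecord and hands it to a fire_event-style callable; the class and function names here are illustrative, not the exact dbt-common API.

```python
import logging
from typing import Callable


class EventLogHandler(logging.Handler):
    """Forward stdlib logging records into a structured-event callback."""

    def __init__(self, fire_event: Callable[[str, str], None], level: int = logging.INFO):
        super().__init__(level=level)
        self._fire_event = fire_event

    def emit(self, record: logging.LogRecord) -> None:
        # In dbt this would construct a proto-backed event; here we just
        # forward the logger name and the formatted message.
        self._fire_event(record.name, self.format(record))


def set_package_log_level(package: str, level: int, fire_event: Callable[[str, str], None]) -> None:
    """Attach the forwarding handler to a third-party package's logger."""
    logger = logging.getLogger(package)
    logger.setLevel(level)
    logger.addHandler(EventLogHandler(fire_event, level=level))


# Example with a hypothetical vendor SDK logger name.
set_package_log_level("some.vendor.sdk", logging.DEBUG, lambda name, msg: print(f"{name}: {msg}"))
logging.getLogger("some.vendor.sdk").debug("connection opened")
```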

Commit:e9f81a9
Author:Colin

merge main

Commit:2e35426
Author:Peter Webb
Committer:GitHub

Add support for getting freshness from DBMS metadata (#8795) * Add support for getting freshness from DBMS metadata * Add changelog entry * Add simple test case * Change parsing error to warning and add new event type for warning * Code review simplification of capability dict. * Revisions to the capability mechanism per review * Move utility function. * Reduce try/except scope * Clean up imports. * Simplify typing per review * Unit test fix
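
The freshness-from-metadata work introduces a capability mechanism so core can ask an adapter whether it supports metadata-based freshness before falling back to the loaded_at_field query. A hedged sketch of that pattern with simplified names; the real capability enum and support levels live in the adapter interface and may differ in detail.

```python
from enum import Enum
from typing import Dict


class Capability(Enum):
    TableLastModifiedMetadata = "TableLastModifiedMetadata"
    SchemaMetadataByRelations = "SchemaMetadataByRelations"


class Support(Enum):
    Unsupported = "Unsupported"
    Full = "Full"


# A capability dict an adapter might declare (illustrative values).
ADAPTER_CAPABILITIES: Dict[Capability, Support] = {
    Capability.TableLastModifiedMetadata: Support.Full,
}


def freshness_strategy(capabilities: Dict[Capability, Support]) -> str:
    """Pick metadata-based freshness when the adapter supports it, else fall back."""
    if capabilities.get(Capability.TableLastModifiedMetadata) is Support.Full:
        return "calculate_freshness_from_metadata"
    return "collect_freshness (loaded_at_field query)"


print(freshness_strategy(ADAPTER_CAPABILITIES))
```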

Commit:549dbf3
Author:Chenyu Li
Committer:GitHub

Deps lock by justbldwn (#8408) * :sparkles: adding installed_packages.json functionality * :white_check_mark: update test_simple_dependency_deps test * :memo: adding changelog for deps feature via changie * :sparkles: restructure deps command, include lock/add * :white_check_mark: add new deps event types to sample_values * :white_check_mark: fix test_simple_dependency_deps test * :bug: attempting to fix cli commands * :bug: convert dbt deps to dbt deps install also leave dbt deps as just a new click group * :white_check_mark: update test_command_mutually_exclusive_option change deps command to deps install * :white_check_mark: update functional tests from deps > deps install * :white_check_mark: change missing deps to deps install * :white_check_mark: convert adapter tests to deps install from deps * move back to deps and merge more with main * fix-unittest * add hash * format yml and update command structure * nits * add new param * nits * nits * nits * fix_tests * pr_feedback * nits * nits * move_check * Update Features-20230125-165933.yaml --------- Co-authored-by: Justin Baldwin <91483530+justbldwn@users.noreply.github.com>
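
The deps lock feature records the resolved packages together with a hash of the requested packages, so later runs can tell whether the lock file is stale. A small sketch of that idea, assuming a sha1 over a canonical dump of the packages config; the file layout and hash scheme shown here are illustrative rather than the exact dbt-core behavior.

```python
import hashlib
import json
from typing import Any, Dict, List


def packages_sha1(packages: List[Dict[str, Any]]) -> str:
    """Hash the requested package specs in a stable, canonical form."""
    canonical = json.dumps(packages, sort_keys=True)
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()


def build_lock_file(requested: List[Dict[str, Any]], resolved: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Pair the resolved (pinned) packages with a hash of what was asked for."""
    return {"packages": resolved, "sha1_hash": packages_sha1(requested)}


requested = [{"package": "dbt-labs/dbt_utils", "version": [">=1.0.0", "<2.0.0"]}]
resolved = [{"package": "dbt-labs/dbt_utils", "version": "1.1.1"}]
print(json.dumps(build_lock_file(requested, resolved), indent=2))
```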

Commit:5e1f0c5
Author:Peter Webb
Committer:GitHub

Report Resource Usage Statistics When a dbt Command Finishes (#8671) * Add performance metrics to the CommandCompleted event. * Add changelog entry. * Add flag for controlling the log level of ResourceReport. * Update changelog entry to reflect changes * Remove outdated attributes * Work around missing resource module on windows * Fix corner case where flags are not set
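
The resource-usage report gathers process statistics when a command finishes; as the commit notes, the stdlib resource module is missing on Windows, so collection has to degrade gracefully. A sketch of that pattern, with the report field names chosen for illustration rather than matching the ResourceReport proto exactly.

```python
import sys
from typing import Any, Dict

try:
    import resource  # not available on Windows
except ImportError:
    resource = None  # type: ignore[assignment]


def collect_resource_report() -> Dict[str, Any]:
    """Gather rusage-style process stats, degrading to a stub report on Windows."""
    if resource is None:
        return {"platform": sys.platform, "available": False}
    usage = resource.getrusage(resource.RUSAGE_SELF)
    return {
        "platform": sys.platform,
        "available": True,
        "utime_seconds": usage.ru_utime,
        "stime_seconds": usage.ru_stime,
        "max_rss": usage.ru_maxrss,  # bytes on macOS, kilobytes on Linux
    }


print(collect_resource_report())
```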

Commit:29f734d
Author:colin-rogers-dbt
Committer:GitHub

Move events to common (#8676) * Move events to common * More Type Annotations (#8536) * Extend use of type annotations in the events module. * Add return type of None to more __init__ definitions. * Still more type annotations adding -> None to __init__ * Tweak per review * Allow adapters to include python package logging in dbt logs (#8643) * add set_package_log_level functionality * set package handler * set package handler * add logging about setting up logging * test event log handler * add event log handler * add event log level * rename package and add unit tests * revert logfile config change * cleanup and add code comments * add changie * swap function for dict * add additional unit tests * fix unit test * update README and protos * fix formatting * update precommit --------- Co-authored-by: Peter Webb <peter.webb@dbtlabs.com>

Commit:f52bd92
Author:Kshitij Aranke
Committer:GitHub

Fix #8160: Warn when --state == --target (#8638)

Commit:845aeb6
Author:Gerda Shank
Committer:GitHub

Fix snapshot success message to display "INSERT 0 1" (for example) instead of success (#8524) (#8532)

Commit:3be943e
Author:Gerda Shank
Committer:GitHub

Fix snapshot success message to display "INSERT 0 1" (for example) instead of success (#8524) (#8531)

Commit:5372157
Author:Gerda Shank
Committer:GitHub

Fix snapshot success message to display "INSERT 0 1" (for example) instead of success (#8524) (#8530)

Commit:d8e8a78
Author:Gerda Shank
Committer:GitHub

Fix snapshot success message to display "INSERT 0 1" (for example) instead of success (#8524)
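
The "INSERT 0 1" fix (and its backports above) is about surfacing the database's own status string, which psycopg2 exposes as cursor.statusmessage, in the snapshot result line instead of a generic "success". A rough sketch of choosing that message with a simplified response object; the actual dbt result plumbing is more involved.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AdapterResponse:
    """Simplified stand-in for the adapter's execute() response."""

    _message: str            # e.g. "INSERT 0 1" reported by the DBMS driver
    rows_affected: Optional[int] = None


def snapshot_status_line(response: Optional[AdapterResponse]) -> str:
    """Prefer the driver-reported status over a generic 'success'."""
    if response is not None and response._message:
        return response._message
    return "success"


print(snapshot_status_line(AdapterResponse(_message="INSERT 0 1", rows_affected=1)))
```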

Commit:0d64bd9
Author:Peter Webb
Committer:GitHub

Backport 8210 (#8500) * Replaced the FirstRunResultError and AfterFirstRunResultError events with RunResultError. * Attempts at reasonable unit tests. * Restore event manager after unit test.

Commit:1afbb87
Author:Emily Rockman
Committer:GitHub

Convert error to conditional warning for unversioned contracted model, fix msg format (#8451) * first pass, tests need updates * update proto defn * fixing tests * more test fixes * finish fixing test file * reformat the message * formatting messages * changelog * add event to unit test * feedback on message structure * WIP * fix up event to take in all fields * fix test

Commit:5814928
Author:Peter Webb
Committer:GitHub

Issue One Event Per Node Failure (#8210) * Replaced the FirstRunResultError and AfterFirstRunResultError events with RunResultError. * Attempts at reasonable unit tests. * Restore event manager after unit test.

Commit:44572e7
Author:Peter Webb
Committer:GitHub

Semantic Model Validation (#8049) * Use dbt-semantic-interfaces validations on semantic models and metrics defined in Core. * Remove empty test, since semantic models don't generate any validation warnings. * Add changelog entry. * Temporarily remove requirement that there must be semantic models defined in order to define metrics

Commit:3e5e693
Author:Jeremy Cohen
Committer:GitHub

fire proper event for inline query error (#7960) (#8021)

Commit:4c44c29
Author:Chenyu Li
Committer:GitHub

fire proper event for inline query error (#7960)

Commit:7a6beda
Author:Michelle Ark
Committer:GitHub

consolidate cross-project ref entrypoint + plugin framework (#7955)

Commit:ecf90d6
Author:Michelle Ark
Committer:GitHub

Refactor/unify public and model nodes (#7891)

Commit:9776e7a
Author:colin-rogers-dbt
Committer:GitHub

backport 7862 to 1.5.latest (#7878) * cherry pick f767943fb22b493a762ff76c4d942eec7b781b2c * Regenerate event proto types --------- Co-authored-by: Jeremy Cohen <jeremy@dbtlabs.com>

Commit:24d61fc
Author:Colin

cherry pick e3498bdaa5cdfeb6a52b2d1959ddbd4260fc7b8d

Commit:f767943
Author:colin-rogers-dbt
Committer:GitHub

Add AdapterRegistered event log message (#7862) * Add AdapterRegistered event log message * Add AdapterRegistered to unit test * make versioning and logging consistent * make versioning and logging consistent * add to_version_string * remove extra equals * format fire_event
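
AdapterRegistered is another proto-backed info-level event; the commit also mentions a to_version_string helper so the adapter version renders consistently in the message. A hedged sketch of what firing such an event can look like, using a local stand-in for fire_event rather than the real dbt plumbing, and an assumed version-string format.

```python
from dataclasses import dataclass


@dataclass
class AdapterRegistered:
    """Info-level event emitted when an adapter plugin is registered."""

    adapter_name: str
    adapter_version: str

    def message(self) -> str:
        return f"Registered adapter: {self.adapter_name}{self.adapter_version}"


def to_version_string(version: str) -> str:
    """Render a raw version consistently, e.g. '1.7.0' -> '=1.7.0' (illustrative)."""
    return f"={version}"


def fire_event(event: AdapterRegistered) -> None:
    """Stand-in for dbt's fire_event; here it just prints the rendered message."""
    print(event.message())


fire_event(AdapterRegistered(adapter_name="postgres", adapter_version=to_version_string("1.7.0")))
```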

Commit:963a38f
Author:Emily Rockman
Committer:GitHub

[BACKPORT] Improve warning for constraints and mat types (#7806) * Improve warnings for constraints and materialization types (#7696) * first pass * debugging * regen proto types * refactor to use warn_supported flag * PR feedback * regen proto files after conflicts * fix problems with conflict resolution

Commit:587bbcb
Author:Emily Rockman
Committer:GitHub

Improve warnings for constraints and materialization types (#7696) * first pass * debugging * regen proto types * refactor to use warn_supported flag * PR feedback

Commit:7c1bd91
Author:Gerda Shank
Committer:GitHub

CT 2590 write pub artifact to log (#7686)

Commit:4a4b7be
Author:Peter Webb
Committer:GitHub

Model Deprecation (#7562) * CT-2461: Work toward model deprecation * CT-2461: Remove unneeded conversions * CT-2461: Fix up unit tests for new fields, correct a couple oversights * CT-2461: Remaining implementation and tests for model/ref deprecation warnings * CT-2461: Changelog entry for deprecation warnings * CT-2461: Refine datetime handling and tests * CT-2461: Fix up unit test data * CT-2461: Fix some more unit test data. * CT-2461: Fix merge issues * CT-2461: Code review items. * CT-2461: Improve version -> str conversion
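
The model deprecation work hinges on comparing a model's deprecation_date against the current time and warning either that deprecation is upcoming or has already passed (with similar warnings for refs to deprecated models). A small sketch of that check; the message wording and timezone handling here are simplified assumptions.

```python
from datetime import datetime, timezone
from typing import Optional


def deprecation_warning(model_name: str, deprecation_date: Optional[datetime]) -> Optional[str]:
    """Return a warning string when a model has a deprecation date, else None."""
    if deprecation_date is None:
        return None
    # Normalize naive dates to UTC so the comparison is well-defined.
    if deprecation_date.tzinfo is None:
        deprecation_date = deprecation_date.replace(tzinfo=timezone.utc)
    now = datetime.now(timezone.utc)
    if deprecation_date <= now:
        return f"Model {model_name} has passed its deprecation date of {deprecation_date:%Y-%m-%d}"
    return f"Model {model_name} has an upcoming deprecation date of {deprecation_date:%Y-%m-%d}"


print(deprecation_warning("dim_customers", datetime(2023, 1, 1)))
```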

Commit:e10f844
Author:Gerda Shank

Create logging event and put pub artifact in log

Commit:2519171
Author:Jeremy Cohen
Committer:GitHub

Back compat for previous return type of `collect_freshness` (#7535) (#7548) * Back compat for previous return type of 'collect_freshness' * Test fixups * PR feedback
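
The back-compat shim accepts both return shapes from the collect_freshness macro: the older form (just the result table) and the newer one (an adapter response paired with the table). A hedged sketch of that normalization; the types here are placeholders for the real adapter response and agate table objects.

```python
from typing import Any, Optional, Tuple, Union

# Either the old shape (just the freshness table) or the new one
# (adapter response, freshness table).
MacroResult = Union[Any, Tuple[Any, Any]]


def normalize_collect_freshness_result(result: MacroResult) -> Tuple[Optional[Any], Any]:
    """Return (adapter_response, table) regardless of which shape the macro produced."""
    if isinstance(result, tuple) and len(result) == 2:
        adapter_response, table = result
        return adapter_response, table
    # Old-style macros returned only the table; there is no response to report.
    return None, result


# Old shape: just the table-ish payload.
print(normalize_collect_freshness_result([("2023-05-01", "2023-05-02")]))
# New shape: (response, table).
print(normalize_collect_freshness_result(({"code": "SUCCESS"}, [("2023-05-01", "2023-05-02")])))
```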