support query_max_size for insert_rows_method chunk
#679
Hey guys,

To fix the issue raised in #675, I added support for `query_max_size` in the `insert_rows_method` `chunk`. This addition is important to prevent errors like `The query is too large. The maximum standard SQL query length is 1024.00K characters` (in BigQuery) when using the `chunk` `insert_rows_method`.
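For context, here is a minimal Python sketch of the chunking idea (the actual change lives in the package's Jinja macros; the function name, the `json.dumps`-based size estimate, and the constant below are illustrative assumptions, not the package's API):

```python
import json
from typing import Iterator, List

# BigQuery's documented standard SQL query length limit (1024K characters).
BIGQUERY_QUERY_MAX_SIZE = 1024 * 1024


def split_to_chunks(
    rows: List[dict],
    chunk_size: int,
    query_max_size: int = BIGQUERY_QUERY_MAX_SIZE,
) -> Iterator[List[dict]]:
    """Yield row chunks bounded by both row count and estimated query size."""
    chunk: List[dict] = []
    chunk_chars = 0
    for row in rows:
        # Rough estimate of this row's footprint in the rendered INSERT.
        row_chars = len(json.dumps(row, default=str))
        # Flush the current chunk if adding this row would break either limit.
        if chunk and (
            len(chunk) >= chunk_size or chunk_chars + row_chars > query_max_size
        ):
            yield chunk
            chunk, chunk_chars = [], 0
        chunk.append(row)
        chunk_chars += row_chars
    if chunk:
        yield chunk
```

Each yielded chunk is then rendered as its own INSERT statement, so no single statement exceeds the database's query-size limit even when individual rows are large.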
Test
I intended to write a unit test for this change, but I encountered errors running the current test cases, even after following the procedure outlined in https://github.com/elementary-data/dbt-data-reliability/blob/master/integration_tests/README.md. I ran into conflicting-dependency errors when attempting to install the required packages. After I added `elementary-data` and `dbt-postgres` to the `requirements.txt`, another error occurred:

[attached: requirements.txt error]
Therefore, instead of writing a unit test, I tested this PR by running `dbt run` in our own environment, and it worked well.
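For anyone who wants to reproduce, this is roughly how we configured the run (a sketch of our `dbt_project.yml` vars; the exact var names and the default value should be checked against this PR's diff):

```yaml
# Illustrative vars only; confirm names against the PR diff.
vars:
  insert_rows_method: chunk   # use the chunked insert path
  query_max_size: 1048576     # cap each INSERT at BigQuery's 1024K-character limit
```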