The following 25 commits are those in which the Protocol Buffers files changed:

| Commit | 87b7bc0 |
|---|---|
| Author | Anthonios Partheniou |
| Committer | GitHub |

test: Fix system test test_append_rows_with_proto3 (#802)

* test: Fix system test test_append_rows_with_proto3
* add license header
* fix build
* add license header
* update license header

The documentation is generated from this commit.

| Commit | 27dbbc2 |
|---|---|
| Author | Lingqing Gan |
| Committer | GitHub |

feat: add stream write samples for range (#780)

* feat: add stream write samples for range
* lint

| Commit | 8ca3f2f |
|---|---|
| Author | Rosie Zou |

moved test files into system test directory

| Commit | 594099a |
|---|---|
| Author | Rosie Zou |

more WIP

| Commit | a53aa1f |
|---|---|
| Author | Rosie Zou |

WIP; storage write API modernization

| Commit | cf5709e |
|---|---|
| Author | Tim Swast |
| Committer | GitHub |

doc: add region tags to `customer_record.proto` so it can be embedded (#391). Added in response to customer feedback from users who had trouble finding the module generated from this file.

| Commit | db51469 |
|---|---|
| Author | Veronica Wasson |
| Committer | GitHub |

docs(samples): Add minimal sample to show Write API in pending mode (#322). This sample is a stripped-down version of the bigquerystorage_append_rows_raw_proto2 sample, intended for embedding in the Write API documentation; the docs would then link to the longer sample, which shows how to format all of the data types, including STRUCT types. A new region tag was registered for this snippet.
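
The pending-mode flow this sample demonstrates (create a write stream of type PENDING, append rows, then finalize and batch-commit the stream) looks roughly like the sketch below; the project, dataset, and table IDs are placeholders and the row-appending step is elided (a sketch of the `append_rows` helper follows the next commit).

```python
# Rough sketch of the pending-mode Write API flow; project, dataset, and
# table IDs are placeholders, and the row-appending step (2) is elided.
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types

write_client = bigquery_storage_v1.BigQueryWriteClient()
parent = write_client.table_path("my-project", "my_dataset", "my_table")

# 1. Create a write stream in PENDING mode: appended rows are buffered
#    and only become visible once the stream is committed.
write_stream = types.WriteStream(type_=types.WriteStream.Type.PENDING)
write_stream = write_client.create_write_stream(
    parent=parent, write_stream=write_stream
)

# 2. ... append rows to write_stream.name via append_rows ...

# 3. Finalize the stream so no further rows can be appended to it.
write_client.finalize_write_stream(name=write_stream.name)

# 4. Commit the finalized stream; its rows become visible atomically.
commit_request = types.BatchCommitWriteStreamsRequest(
    parent=parent, write_streams=[write_stream.name]
)
response = write_client.batch_commit_write_streams(commit_request)
print(response.commit_time)
```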

| Commit | 2461f63 |
|---|---|
| Author | Tim Swast |
| Committer | GitHub |

feat: add `BigQueryWriteClient` where `append_rows` returns a helper for writing rows (#284)

* WIP: write client sample
* add sample with nullable types
* add schema for all supported types
* add complex types to code sample
* refactor sample so that it can be tested
* make test assertions more thorough
* fix lint error
* remove done TODO
* address reviewer comments
* fix tag mismatch
* test on multiple regions
* correct comments about why offset exists
* upgrade g-c-b
* WIP: invert stream using BiDi class
* WIP: attempt to use Future for send instead
* WIP: use futures, populated by background consumer
* make sure stream is actually open before returning from open
* copy close implementation from pub/sub
* support extra metadata
* process exceptions, add open timeout
* sort imports
* WIP: unit tests
* drain futures when stream closes
* update docs
* add callbacks to detect when a stream fails
* add unit tests
* add sleep to loop waiting for RPC to be active
* don't freeze if initial RPC fails
* add needed initializations so done() functions
* fail fast when there is a problem with the initial request
* don't inherit concurrent.futures (it's unnecessary and kept resulting in stuff getting stuck)
* add unit test for open timeout
* 🦉 Updates from OwlBot (see https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md)
* add manual client to docs
* typo in sample comments
* force timeout and metadata to be kwargs
* unify interface for sending row data
* pull stream name from merged request
* require newer proto-plus for copy_from method

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
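
A minimal sketch of the helper introduced here, assuming a hypothetical generated protobuf module `my_record_pb2` whose `MyRecord` message has a string `name` field, plus placeholder project, dataset, and table IDs; the rows are sent to the table's `_default` stream.

```python
# A rough sketch of the append_rows helper; `my_record_pb2` is a hypothetical
# module generated from your own .proto file, and the IDs are placeholders.
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types, writer
from google.protobuf import descriptor_pb2

import my_record_pb2  # hypothetical generated protobuf module

write_client = bigquery_storage_v1.BigQueryWriteClient()
parent = write_client.table_path("my-project", "my_dataset", "my_table")

# Describe the row format with the descriptor of the generated message.
proto_schema = types.ProtoSchema()
proto_descriptor = descriptor_pb2.DescriptorProto()
my_record_pb2.MyRecord.DESCRIPTOR.CopyToProto(proto_descriptor)
proto_schema.proto_descriptor = proto_descriptor

# The template supplies the fields shared by every request on the stream:
# here the table's `_default` stream (a stream from create_write_stream
# works as well) and the writer schema.
request_template = types.AppendRowsRequest()
request_template.write_stream = f"{parent}/streams/_default"
proto_data = types.AppendRowsRequest.ProtoData()
proto_data.writer_schema = proto_schema
request_template.proto_rows = proto_data

append_rows_stream = writer.AppendRowsStream(write_client, request_template)

# Serialize some rows and send them; send() returns a future that resolves
# to the AppendRowsResponse for that request.
proto_rows = types.ProtoRows()
proto_rows.serialized_rows.append(
    my_record_pb2.MyRecord(name="Alice").SerializeToString()
)
request = types.AppendRowsRequest()
request.proto_rows = types.AppendRowsRequest.ProtoData(rows=proto_rows)

future = append_rows_stream.send(request)
print(future.result())

append_rows_stream.close()
```

The helper hides the bidirectional gRPC stream behind futures, which is what most of the WIP entries in the commit message above are working toward.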

| Commit | e761697 |
|---|---|
| Author | Anthonios Partheniou |
| Committer | GitHub |

chore: delete unused protos (#201)
This commit does not contain any .proto files.

| Commit | f941446 |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

feat: Add ZSTD compression as an option for Arrow (#197)

Committer: @emkornfield
PiperOrigin-RevId: 374220891
Source-Author: Google APIs <noreply@google.com>
Source-Date: Mon May 17 10:03:14 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 23efea9fc7bedfe53b24295ed84b5f873606edcb
Source-Link: https://github.com/googleapis/googleapis/commit/23efea9fc7bedfe53b24295ed84b5f873606edcb
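
A minimal sketch of opting into the new ZSTD codec on a read session, assuming a placeholder project and table; `LZ4_FRAME` (added in an earlier commit further down this list) is selected the same way.

```python
# Minimal sketch: ask for ZSTD-compressed Arrow buffers on a read session.
# The project and table names are placeholders.
from google.cloud import bigquery_storage_v1
from google.cloud.bigquery_storage_v1 import types

client = bigquery_storage_v1.BigQueryReadClient()

requested_session = types.ReadSession(
    table="projects/my-project/datasets/my_dataset/tables/my_table",
    data_format=types.DataFormat.ARROW,
)
# Opt into ZSTD compression of the Arrow record-batch buffers
# (LZ4_FRAME is the other available codec).
requested_session.read_options.arrow_serialization_options.buffer_compression = (
    types.ArrowSerializationOptions.CompressionCodec.ZSTD
)

session = client.create_read_session(
    parent="projects/my-project",
    read_session=requested_session,
    max_stream_count=1,
)

# The server sends compressed Arrow batches; decoding them back into rows is
# handled by the reader, so iteration looks the same as without compression.
reader = client.read_rows(session.streams[0].name)
for row in reader.rows(session):
    print(row)
```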

| Commit | a6d6afa |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

feat: new JSON type through BigQuery Write (#178)

Committer: @yirutang
PiperOrigin-RevId: 368275477
Source-Author: Google APIs <noreply@google.com>
Source-Date: Tue Apr 13 12:51:14 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: 95e7f055087a43f638ffd9a0f25ce36dbea87953
Source-Link: https://github.com/googleapis/googleapis/commit/95e7f055087a43f638ffd9a0f25ce36dbea87953

| Commit | bef63fb |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

feat: updates for v1beta2 storage API (#172)

* Updated comments on BatchCommitWriteStreams
* Added newly supported BigQuery types BIGNUMERIC and INTERVAL to TableSchema
* Added read rows schema in ReadRowsResponse
* Misc comment updates

Committer: @yirutang
PiperOrigin-RevId: 366811078
Source-Author: Google APIs <noreply@google.com>
Source-Date: Mon Apr 5 09:19:17 2021 -0700
Source-Repo: googleapis/googleapis
Source-Sha: b1614aa0668564ec41d78b72cf776e0292ffc98c
Source-Link: https://github.com/googleapis/googleapis/commit/b1614aa0668564ec41d78b72cf776e0292ffc98c

| Commit | 1c91a27 |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

feat: add Arrow compression options (only LZ4 for now) (#166)

Also:

* feat: Return schema on first ReadRowsResponse.
* doc: clarify limit on filter string.

This PR was generated using Autosynth. Synth log: https://source.cloud.google.com/results/invocations/72a2a14b-0135-4939-ae4b-93b118a2b3e8/targets

PiperOrigin-RevId: 365759522
Source-Link: https://github.com/googleapis/googleapis/commit/c539b9b08b3366ee00c0ec1950f4df711552a269

| Commit | e5f6198 |
|---|---|
| Author | Tim Swast |
| Committer | GitHub |

feat: add clients for v1beta2 endpoint (#113)

This is 100% autogenerated code. Subsequent PRs will cover manual classes.

TODO:

* [x] docs build successful
* [x] unit tests pass (need to remove `test_append_rows_flattened_*` tests, as there are no flattened arguments for `append_rows`)

| Commit | 0a0eb2e |
|---|---|
| Author | Peter Lamut |
| Committer | GitHub |

chore: transition the library to microgenerator (#62)

* chore: remove old GAPIC code for v1 API
* Regenerate the v1 API with microgenerator
* Adjust dependencies and classifiers in setup.py
* Fix types aggregation in types.py
* Adjust import paths
* Fix and adjust unit tests
* Fix and adjust system tests
* Adjust unit test coverage threshold (not all paths are covered, not even in the generated code, thus the adjustment is necessary)
* Fix docs build
* Adjust quickstart sample
* Adjust sample in client docstring
* Remove beta API code and docs
* Simplify synth replacement rules and regenerate (rules conditionally matching versions other than v1 are not needed anymore)
* Consolidate imports in google.cloud.bigquery.storage
* Use google.cloud.bigquery.storage as import path
* Hide async client from most import paths
* Use GAPIC client mock in ReadRowsStream tests
* Remove redundant installations in nox sessions
* Include manual classes in reference docs
* Add UPGRADING guide
* Add minor CHANGELOG improvements

| Commit | 645e65d |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

feat: add resource path helper methods (#40)

| Commit | 9812244 |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

chore: template updates (via synth)

| Commit | 49b941a |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | GitHub |

[CHANGE ME] Re-generated to pick up changes in the API or client library generator. (#21) Co-authored-by: shollyman <shollyman@google.com>

| Commit | 2ea5ac4 |
|---|---|
| Author | shollyman |
| Committer | GitHub |

feat: update synth to generate v1beta2, v1 endpoints for bigquerystorage (#10)

This PR also includes some work to try to accommodate v1alpha2, but further oddities must be tackled before generation of the alpha client can be fully added here. Also, note that this PR does not include manual client modifications, such as streaming offset resumption, that are present in the v1beta1 client. The intent is to address changes like that in subsequent PRs.

| Commit | 9e09634 |
|---|---|
| Author | Tres Seaver |
| Committer | GitHub |

chore(bigquery_storage): fix up protobuf messages w/o summary docstrings (#9333) Closes #9329.

| Commit | 3b6f76f |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | Tim Swast |

Add sharding strategy, stream splitting, Arrow support (via synth). (#8477)

| Commit | ee330bb |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | Tim Swast |

Add annotations to protocol buffers indicating request parameters (via synth). (#7550) [Internal] This commit should not have any effect on the public interface, or even the behavior of the client. It picks up some additional metadata to indicate resources used in request parameters for the purpose of adding helpers in the generated clients for statically-typed languages.

| Commit | 02f1df1 |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | Tres Seaver |

Copy lintified proto files (via synth). (#7475)

| Commit | c4d3c98 |
|---|---|
| Author | Yoshi Automation Bot |
| Committer | Tres Seaver |

Update proto / docstrings (via synth). (#7461)

| Commit | da1a830 |
|---|---|
| Author | Christopher Wilcox |
| Committer | GitHub |

Add protos as an artifact to library (#7205)