feat: add capability for writing to bigquery #993
Conversation
Codecov Report
✅ All tests successful. No failed tests found.
Additional details and impacted files
@@ Coverage Diff @@
## main #993 +/- ##
==========================================
- Coverage 97.96% 97.74% -0.22%
==========================================
Files 446 454 +8
Lines 36045 36368 +323
==========================================
+ Hits 35311 35548 +237
- Misses 734 820 +86
Flags with carried forward coverage won't be shown.
❌ 1 test failed (see the Test Analytics Dashboard for details).
This is an abstraction to help us use BigQuery for Test Analytics data. It allows us to write data to BigQuery using the Storage Write API and to query it using SQL statements as strings; it also handles creating the connection to BigQuery for us. The use of protocol buffers is a consequence of using the Storage Write API.

feat: create TADriver abstraction

This commit creates the TADriver abstraction, which (for now) exposes a write_testruns method that takes care of persisting a list of testruns to some specific storage. This commit implements the interface for Postgres and for BigQuery.

The idea is that we will start by persisting testruns to both backends; eventually we will add a method to the interface for retrieving the aggregates, at which point it will be safe for our cloud offering to switch over to BigQuery-only storage of TA testruns.
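A minimal sketch of the dual-write idea described above. `TADriver`, `write_testruns`, and the Postgres/BigQuery implementations come from the PR; everything else here (the `Testrun` fields, `InMemoryDriver`, `persist_testruns`) is illustrative and not taken from the actual codebase.

```python
from typing import Protocol


class Testrun:
    """Illustrative testrun record; the real fields are not shown in the PR."""

    def __init__(self, name: str, outcome: str):
        self.name = name
        self.outcome = outcome


class TADriver(Protocol):
    """The write-side interface: persist a list of testruns to some storage."""

    def write_testruns(self, testruns: list) -> None: ...


class InMemoryDriver:
    """Toy stand-in for the Postgres and BigQuery drivers from the PR."""

    def __init__(self):
        self.stored: list = []

    def write_testruns(self, testruns: list) -> None:
        self.stored.extend(testruns)


def persist_testruns(drivers: list, testruns: list) -> None:
    # Dual-write phase: write to every configured backend, so the cloud
    # offering can later switch to BigQuery-only once reads are implemented.
    for driver in drivers:
        driver.write_testruns(testruns)
```

Because `TADriver` is a structural `Protocol`, any object with a matching `write_testruns` method satisfies the interface without explicit inheritance, which keeps the Postgres and BigQuery implementations decoupled.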
Force-pushed from e47e13c to b863e0f (Compare)
batch_commit_write_streams_request.parent = parent
batch_commit_write_streams_request.write_streams = [write_stream.name]

self.write_client.batch_commit_write_streams(batch_commit_write_streams_request)
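For context on why this call looks convoluted: the commit shown above is only the last step of the Storage Write API's pending-stream flow (create a PENDING stream, append rows, finalize, then batch commit). The sketch below models that flow; `FakeWriteClient` and `write_pending_batch` are illustrative stand-ins, and while the real `google.cloud.bigquery_storage_v1.BigQueryWriteClient` does expose methods with these names, it takes request objects rather than the plain arguments used here.

```python
class FakeWriteClient:
    """Records the calls a real BigQueryWriteClient would receive."""

    def __init__(self):
        self.calls = []
        self.committed = []

    def create_write_stream(self, parent, stream_type):
        self.calls.append("create_write_stream")
        return f"{parent}/streams/pending-1"

    def append_rows(self, stream, rows):
        self.calls.append("append_rows")

    def finalize_write_stream(self, name):
        self.calls.append("finalize_write_stream")

    def batch_commit_write_streams(self, parent, write_streams):
        self.calls.append("batch_commit_write_streams")
        self.committed.extend(write_streams)


def write_pending_batch(client, parent, rows):
    # 1. Open a PENDING stream: appended rows are buffered, not yet visible.
    stream = client.create_write_stream(parent, stream_type="PENDING")
    # 2. Append rows (protobuf-serialized in the real Storage Write API).
    client.append_rows(stream, rows)
    # 3. Finalize the stream: no further appends are accepted.
    client.finalize_write_stream(stream)
    # 4. Commit: all finalized streams in the request become visible atomically.
    client.batch_commit_write_streams(parent, write_streams=[stream])
    return stream
```

The reason `batch_commit_write_streams` takes a list is that committing several finalized streams in one request makes the whole batch visible atomically, which is the point of the pending-stream mode.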
Is the complexity specific to batch_commit_write_streams? I mean, holy sh…, this API is convoluted and overly complex.
Can you point to the docs related to inserting / batch-writing to BigQuery?
this PR: