Monitoring - Look for specific messages in retries #2451
waprin wants to merge 5 commits into googleapis:master
Conversation
system_tests/monitoring.py
Outdated
def _unknown_metric(result):
    return ('The provided filter doesn\'t refer to any known '
            'metric.' in result.message)
system_tests/monitoring.py
Outdated
retry_result = RetryResult(_has_timeseries,
                           max_tries=MAX_RETRIES)(client.query)
return RetryErrors(BadRequest, max_tries=MAX_RETRIES)(retry_result)
return RetryErrors(BadRequest, _unknown_metric,
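The composition here wraps the query twice: the inner `RetryResult` re-polls until the result predicate passes, and the outer `RetryErrors` retries when an expected exception escapes. A self-contained sketch of that pattern (simplified illustrative reimplementations, not the project's actual helpers, with no backoff delay):

```python
class RetryResult(object):
    """Re-invoke the wrapped callable until the predicate accepts its result."""

    def __init__(self, result_predicate, max_tries=4):
        self.result_predicate = result_predicate
        self.max_tries = max_tries

    def __call__(self, to_wrap):
        def wrapped(*args, **kwargs):
            for _ in range(self.max_tries):
                result = to_wrap(*args, **kwargs)
                if self.result_predicate(result):
                    break
            return result
        return wrapped


class RetryErrors(object):
    """Re-invoke the wrapped callable when it raises an expected error."""

    def __init__(self, exception, error_predicate=lambda err: True,
                 max_tries=4):
        self.exception = exception
        self.error_predicate = error_predicate
        self.max_tries = max_tries

    def __call__(self, to_wrap):
        def wrapped(*args, **kwargs):
            for attempt in range(self.max_tries):
                try:
                    return to_wrap(*args, **kwargs)
                except self.exception as err:
                    last_attempt = attempt == self.max_tries - 1
                    if last_attempt or not self.error_predicate(err):
                        raise
        return wrapped


# Demo: a query that errors twice, then succeeds.
class FlakyError(Exception):
    pass


calls = {'count': 0}


def flaky_query():
    calls['count'] += 1
    if calls['count'] < 3:
        raise FlakyError('not ready')
    return ['timeseries']


retried = RetryErrors(FlakyError, max_tries=5)(
    RetryResult(bool, max_tries=5)(flaky_query))
print(retried())  # ['timeseries'], after two retried errors
```

The outer decorator absorbs errors raised anywhere inside the inner polling loop, which is why the order of wrapping matters.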
RE: "generic messages": Sometimes the error payload isn't even JSON. Sometimes it's a gRPC error (which is usually full of good info). But usually we can get specific error info from the error response.
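One defensive way to pull a message out of such mixed payloads, as a hedged sketch (`extract_error_message` is an illustrative helper, not code from this PR):

```python
import json


def extract_error_message(payload):
    """Return a human-readable message, tolerating non-JSON error bodies."""
    try:
        info = json.loads(payload)
    except ValueError:
        # Not JSON (e.g. an HTML proxy error page): fall back to the raw body.
        return payload
    if isinstance(info, dict):
        return info.get('error', {}).get('message', payload)
    return payload


print(extract_error_message('{"error": {"message": "Unknown metric."}}'))
print(extract_error_message('<html>502 Bad Gateway</html>'))
```

Catching `ValueError` covers both Python 2's `json` and Python 3's `JSONDecodeError`, which subclasses it.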
Review comments addressed. I am looking at the other retries; some of them are a bit harder to repro, so I'm still playing with it.
system_tests/monitoring.py
Outdated
retry_500 = RetryErrors(InternalServerError)
retry_503 = RetryErrors(ServiceUnavailable)

# Retry predicates
@dhermes I also don't see anything in the
That's why I mentioned
@waprin I can just take over that issue if you like. Wasn't trying to make it an undue burden on you.

@dhermes It's definitely not an undue burden, but you seem to understand what you want better, so I'm happy to punt it to you; if you change your mind, I am more than happy to do it.
@waprin You wrote:
That's incorrect.
I'm not understanding the problem you are running into. Please don't modify the class. It's working as intended.
Yes, I was totally confused and misunderstood the problem I had previously encountered.
Looked at it again and realized this is the issue: by re-creating the Query object I was getting a new
This issue may no longer be relevant due to its age. Feel free to re-open.
This starts to address #2415.
This was the most obvious check to add. The other errors (404s, 500s, and 503s) all provide very generic error messages (and really we need the API team to just fix the 500s).
As far as retry logic goes, I'm not convinced it can be significantly improved: using a base of 3 makes the jumps too big. Maybe we could start from a higher number, but I think that would complicate the retry logic to save at best a few seconds.
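To make the "base of 3" concern concrete, quick arithmetic on a simple exponential schedule (assuming delay = base ** attempt seconds, an illustration rather than the project's exact schedule):

```python
def backoff_delays(base, tries):
    """Per-attempt sleep lengths for simple exponential backoff."""
    return [base ** attempt for attempt in range(tries)]


# Base 3 grows much faster than base 2, so the later waits overshoot.
print(backoff_delays(2, 5), sum(backoff_delays(2, 5)))  # [1, 2, 4, 8, 16] 31
print(backoff_delays(3, 5), sum(backoff_delays(3, 5)))  # [1, 3, 9, 27, 81] 121
```

Five tries at base 3 already commit the test to roughly two minutes of worst-case waiting, versus about half a minute at base 2.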
So I am voting to just close #2415 after this is merged but let me know if you disagree.