[Bug]: BigQuery Storage Write API does not write with no complaint #28168
Closed
Description
What happened?
I wanted to test the Storage Write API with SDK 2.49.0 and tried to write simple data on Dataflow, but the write step does not do anything, and there is no logging there either.
Here is my code snippet.
with beam.Pipeline(options=pipeline_options) as pipeline:
    ...
    # pylint: disable=line-too-long
    result = objects_for_storage | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
        table_spec,
        schema=_SCHEMA,
        method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API)
    _ = (result.failed_rows_with_errors
         | 'Get Errors' >> beam.Map(lambda e: {
             "destination": e[0],
             "row": json.dumps(e[1]),
             "error_message": e[2][0]['message']
         })
         | "LogElements" >> beam.ParDo(LogElements()))
The 'Get Errors' / 'LogElements' step above produces no output either, so the rows are neither written nor reported as failures.
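For what it's worth, the error-formatting lambda itself can be checked in isolation. Below is a minimal sketch, assuming each element of failed_rows_with_errors has the (destination, row, errors) tuple shape that the snippet above relies on; format_error is a hypothetical helper mirroring the lambda, not part of the Beam API.

```python
import json

def format_error(e):
    # Hypothetical helper mirroring the 'Get Errors' lambda in the snippet.
    # Assumes e is a (destination, row_dict, error_list) tuple, where
    # error_list is a list of dicts each carrying a 'message' key.
    return {
        "destination": e[0],
        "row": json.dumps(e[1]),
        "error_message": e[2][0]['message'],
    }

# Example element, assuming the tuple shape described above:
sample = ("project:dataset.table", {"a": 1}, [{"message": "schema mismatch"}])
print(format_error(sample))
```

This only confirms that the formatting logic is sound; it does not exercise the Storage Write API path itself, which is where the pipeline goes silent.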
Issue Priority
Priority: 1
Issue Components
- Component: Python SDK
- Component: Java SDK
- Component: Go SDK
- Component: Typescript SDK
- Component: IO connector
- Component: Beam examples
- Component: Beam playground
- Component: Beam katas
- Component: Website
- Component: Spark Runner
- Component: Flink Runner
- Component: Samza Runner
- Component: Twister2 Runner
- Component: Hazelcast Jet Runner
- Component: Google Cloud Dataflow Runner