[Bug]: BigQuery Storage Write API does not write with no complaint #28168

@onurdialpad

Description

What happened?

I wanted to test the Storage Write API with SDK 2.49.0 and tried to write some simple data on Dataflow, but the "writing" step does not do anything, and there is no logging there either.

Here is my code snippet:

  with beam.Pipeline(options=pipeline_options) as pipeline:
    ...
    result = objects_for_storage | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
        table_spec,
        schema=_SCHEMA,
        method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API)
    _ = (result.failed_rows_with_errors
         | 'Get Errors' >> beam.Map(lambda e: {
              "destination": e[0],
              "row": json.dumps(e[1]),
              "error_message": e[2][0]['message']
            })
         | "LogElements" >> beam.ParDo(LogElements()))

Here, the step does not produce any output:

(Screenshots: Screen Shot 2023-08-25 at 1 58 00 PM, Screen Shot 2023-08-25 at 1 58 11 PM)

Issue Priority

Priority: 1

Issue Components

  • Component: Python SDK
  • Component: Java SDK
  • Component: Go SDK
  • Component: Typescript SDK
  • Component: IO connector
  • Component: Beam examples
  • Component: Beam playground
  • Component: Beam katas
  • Component: Website
  • Component: Spark Runner
  • Component: Flink Runner
  • Component: Samza Runner
  • Component: Twister2 Runner
  • Component: Hazelcast Jet Runner
  • Component: Google Cloud Dataflow Runner
