This repository was archived by the owner on Mar 31, 2026. It is now read-only.

TypeError: execute_streaming_sql() got an unexpected keyword argument 'resume_token' #311

@sebastian-montero

Description


Hello!

I have been using the latest Python Spanner release, which includes the new timeouts and retries.

I am running my code in Dataflow: I generate partitioned reads and then read from those partitions with the DoFn below.

# Imports implied by the snippet
from datetime import timedelta

import apache_beam as beam
from google.cloud import spanner
from google.cloud.spanner_v1.database import BatchSnapshot


class readSpannerPartitions(beam.DoFn):
    def __init__(self, SPANNER_CONFIG):
        self.project = SPANNER_CONFIG['spanner_project']
        self.instance = SPANNER_CONFIG['spanner_instance']
        self.db = SPANNER_CONFIG['spanner_database']
        self.query = SPANNER_CONFIG['query']
        self.exact_staleness = timedelta(hours=6)

    def setup(self):
        # Runs once per worker: build the client and a batch snapshot.
        spanner_client = spanner.Client(self.project)
        spanner_instance = spanner_client.instance(self.instance)
        self.spanner_db = spanner_instance.database(self.db)
        self.snapshot = self.spanner_db.batch_snapshot(exact_staleness=self.exact_staleness)
        self.snapshot_dict = self.snapshot.to_dict()

    def process(self, element):
        # Rebuild the snapshot from the serialized transaction and
        # stream the rows for this element's partition batch.
        self.snapshot = BatchSnapshot.from_dict(
            self.spanner_db, element['transaction_info'])

        read_action = self.snapshot.process_query_batch
        for row in read_action(element['partitions'], timeout=21600):
            yield row

    def teardown(self):
        self.snapshot.close()
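For context, the partitions are produced by an upstream step roughly like the sketch below. This step is not shown in the issue, so `partition_elements` and the exact element keys are my naming, chosen to match what `process` above consumes; it pairs each batch from `BatchSnapshot.generate_query_batches` with the serialized transaction from `to_dict()`:

```python
# Hypothetical upstream step (not shown in the issue): build the
# `element` dicts that readSpannerPartitions.process consumes.
def partition_elements(snapshot, query):
    """Yield one element per partition batch, pairing each batch with
    the serialized transaction so workers can rebuild the snapshot."""
    transaction_info = snapshot.to_dict()
    for batch in snapshot.generate_query_batches(query):
        yield {
            "transaction_info": transaction_info,
            "partitions": batch,
        }
```

In a pipeline this would typically feed a reshuffle so the partition batches fan out across Dataflow workers before the read DoFn runs.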

When running this code I get the stack trace below.

Error message from worker:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 113, in next
    return six.next(self._wrapped)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 416, in __next__
    return self._next()
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 689, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
    status = StatusCode.INTERNAL
    details = "Received RST_STREAM with error code 2"
    debug_error_string = "{"created":"@1618528054.626520550","description":"Error received from peer ipv4:172.217.169.74:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"Received RST_STREAM with error code 2","grpc_status":13}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/snapshot.py", line 56, in _restart_on_unavailable
    for item in iterator:
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 116, in next
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InternalServerError: 500 Received RST_STREAM with error code 2

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "apache_beam/runners/common.py", line 1239, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "<file_name_hidden>", line 184, in process
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/streamed.py", line 146, in __iter__
    self._consume_next()
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/streamed.py", line 118, in _consume_next
    response = six.next(self._response_iterator)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/snapshot.py", line 75, in _restart_on_unavailable
    iterator = restart(resume_token=resume_token)
TypeError: execute_streaming_sql() got an unexpected keyword argument 'resume_token'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.8/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 268, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 719, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1241, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1306, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1239, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1401, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 719, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1241, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1306, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 1239, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1401, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 718, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 719, in apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 1241, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 1321, in apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.8/site-packages/future/utils/__init__.py", line 446, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 1239, in apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 587, in apache_beam.runners.common.SimpleInvoker.invoke_process
  File "apache_beam/runners/common.py", line 1374, in apache_beam.runners.common._OutputProcessor.process_outputs
  File "/home/hal9000/Periodic-Tasks/BeamerySQL/BeamerySQL-US/spanner2bq/app/src/main.py", line 184, in process
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/streamed.py", line 146, in __iter__
    self._consume_next()
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/streamed.py", line 118, in _consume_next
    response = six.next(self._response_iterator)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/spanner_v1/snapshot.py", line 75, in _restart_on_unavailable
    iterator = restart(resume_token=resume_token)
TypeError: execute_streaming_sql() got an unexpected keyword argument 'resume_token' [while running 'Read From Partitions']

Happy to provide further details.

Labels

🚨 This issue needs some love.
api: spanner (Issues related to the googleapis/python-spanner API.)
priority: p1 (Important issue which blocks shipping the next release. Will be fixed prior to next release.)
type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)
