BufferError: Existing exports of data: object cannot be re-sized #2271

Description

@jeffreydwalter

OS: Darwin 17.4.0 Darwin Kernel Version 17.4.0: Sun Dec 17 09:19:54 PST 2017; root:xnu-4570.41.2~1/RELEASE_X86_64 x86_64
Python Version: 3.6.0
tornado==4.5.3

I have a small websocket server that serves video streams, and I am getting this error:

$ python3 test.py 
*** Websocket Server Started at 192.168.119.1***
websocket opened
message received: start data stream
spawning subprocess to generate data stream
ERROR:tornado.application:Uncaught exception GET /ws (127.0.0.1)
HTTPServerRequest(protocol='http', host='localhost:8888', method='GET', uri='/ws', version='HTTP/1.1', remote_ip='127.0.0.1', headers={'Host': 'localhost:8888', 'Connection': 'Upgrade', 'Pragma': 'no-cache', 'Cache-Control': 'no-cache', 'Upgrade': 'websocket', 'Origin': 'http://localhost:8888', 'Sec-Websocket-Version': '13', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.132 Safari/537.36', 'Dnt': '1', 'Accept-Encoding': 'gzip, deflate, br', 'Accept-Language': 'en-US,en;q=0.9', 'Sec-Websocket-Key': 'ZAug6WtdtAp7g1RYx2/m9Q==', 'Sec-Websocket-Extensions': 'permessage-deflate; client_max_window_bits'})
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/tornado/web.py", line 1468, in _stack_context_handle_exception
    raise_exc_info((type, value, traceback))
  File "<string>", line 4, in raise_exc_info
  File "/usr/local/lib/python3.6/site-packages/tornado/stack_context.py", line 316, in wrapped
    ret = fn(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/tornado/websocket.py", line 502, in <lambda>
    self.stream.io_loop.add_future(result, lambda f: f.result())
  File "/usr/local/lib/python3.6/site-packages/tornado/concurrent.py", line 238, in result
    raise_exc_info(self._exc_info)
  File "<string>", line 4, in raise_exc_info
  File "/usr/local/lib/python3.6/site-packages/tornado/gen.py", line 1063, in run
    yielded = self.gen.throw(*exc_info)
  File "test.py", line 87, in on_message
    yield executor.submit(self.stream_data)
  File "/usr/local/lib/python3.6/site-packages/tornado/gen.py", line 1055, in run
    value = future.result()
  File "/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/concurrent/futures/_base.py", line 398, in result
    return self.__get_result()
  File "/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/concurrent/futures/_base.py", line 357, in __get_result
    raise self._exception
  File "/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/concurrent/futures/thread.py", line 55, in run
    result = self.fn(*self.args, **self.kwargs)
  File "test.py", line 79, in stream_data
    self.write_message(data)#, binary=True)
  File "/usr/local/lib/python3.6/site-packages/tornado/websocket.py", line 252, in write_message
    return self.ws_connection.write_message(message, binary=binary)
  File "/usr/local/lib/python3.6/site-packages/tornado/websocket.py", line 793, in write_message
    return self._write_frame(True, opcode, message, flags=flags)
  File "/usr/local/lib/python3.6/site-packages/tornado/websocket.py", line 776, in _write_frame
    return self.stream.write(frame)
  File "/usr/local/lib/python3.6/site-packages/tornado/iostream.py", line 395, in write
    self._write_buffer += data
BufferError: Existing exports of data: object cannot be re-sized
^CTraceback (most recent call last):
  File "test.py", line 105, in <module>
    ioloop.IOLoop.instance().start()
  File "/usr/local/lib/python3.6/site-packages/tornado/ioloop.py", line 863, in start
    event_pairs = self._impl.poll(poll_timeout)
  File "/usr/local/lib/python3.6/site-packages/tornado/platform/kqueue.py", line 66, in poll
    kevents = self._kqueue.control(None, 1000, timeout)

I have put together a little gist which reproduces the problem.

It seems to be related to the amount (or rate?) of data being processed.
If I change line 72 of the gist from
p = subprocess.Popen(['/usr/bin/tr', '-dc', 'A-Za-z0-9'], stdin=dev_rand.stdout, stdout=subprocess.PIPE, env=new_env)
to
p = subprocess.Popen(['/usr/bin/tr', '-dc', 'A-Z'], stdin=dev_rand.stdout, stdout=subprocess.PIPE, env=new_env)
the problem seems to go away. I surmise that this change "fixes" the problem only because it reduces the amount (or rate) of data being generated by the subprocess.
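For what it's worth, the BufferError itself looks like it comes straight from CPython's buffer protocol: a bytearray cannot be resized while a memoryview of it is still exported. My guess (I have not confirmed this in tornado's source) is that the IOLoop thread holds a memoryview over the write buffer while it writes to the socket, and the `self._write_buffer += data` from my executor thread tries to resize it at the same time, which would also explain why it only shows up at higher data rates. The error can be reproduced in isolation:

```python
# Stand-alone reproduction of the same BufferError: resizing a
# bytearray while a memoryview of it is still exported is illegal.
buf = bytearray(b"frame")
view = memoryview(buf)   # an "export" of the buffer's memory

try:
    buf += b"more data"  # resize attempt while exported -> BufferError
except BufferError as e:
    print(e)             # "Existing exports of data: object cannot be re-sized"

view.release()           # drop the export
buf += b"more data"      # now the resize is fine
```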

I'm new to tornado, so any insight into this would be tremendously helpful. If I'm doing it wrong, please let me know the correct way to do what I'm after.
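My current guess at a workaround is that `write_message` should only ever run on the IOLoop thread, so the executor thread should hand each chunk back to the loop instead of writing directly; in tornado that would presumably be something like `IOLoop.instance().add_callback(self.write_message, data)` (using the names from my gist), since `add_callback` is documented as thread-safe. A sketch of that hand-off pattern, written against the standard library's asyncio loop so it runs on its own:

```python
import asyncio
import threading

received = []

async def main():
    loop = asyncio.get_running_loop()
    done = asyncio.Event()

    def blocking_producer():
        # Runs on a worker thread: never touch loop-owned objects
        # directly; schedule the "write" onto the loop thread instead.
        for chunk in (b"frame1", b"frame2"):
            loop.call_soon_threadsafe(received.append, chunk)
        loop.call_soon_threadsafe(done.set)

    t = threading.Thread(target=blocking_producer)
    t.start()
    await done.wait()   # woken from the worker via the loop thread
    t.join()

asyncio.run(main())
print(received)  # both chunks arrived via the loop thread
```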

Thanks!
