RecordingStream context managers are broken when used with generators  #6238

@jleibs

Description

Problem

Example of problematic code:

import rerun as rr

def my_gen_func(stream, name):
    with stream:
        for i in range(10):
            print(f"{name} {i}")
            rr.log("stream", rr.TextLog(f"{name} {i}"))
            yield

rr.init("rerun_example_leak_context")

stream1 = rr.new_recording("rerun_example_stream1")
stream1.save("stream1.rrd")
stream2 = rr.new_recording("rerun_example_stream2")
stream2.save("stream2.rrd")

gen1 = my_gen_func(stream1, "stream1")
gen2 = my_gen_func(stream2, "stream2")

next(gen1)
next(gen2)
rr.log("stream", rr.TextLog("This should go to the global stream"))
next(gen1)
next(gen1)
next(gen1)

Run this and then open stream2.rrd.
You will see that both global messages and messages destined for stream 1 have leaked into stream 2:
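Not part of the original report, but for context: one pattern that sidesteps the leak is to enter the stream's context around each step instead of holding it open across `yield`. The sketch below uses a hypothetical `tracked` context manager in place of a real `RecordingStream`; with rerun itself, the equivalent would be entering `stream` around each loop iteration (or, if your SDK version supports it, passing the stream explicitly via the `recording=` argument to `rr.log` and avoiding the context manager entirely).

```python
import contextlib

@contextlib.contextmanager
def tracked(name, log):
    # Hypothetical stand-in for entering/leaving a RecordingStream context.
    log.append(f"enter {name}")
    try:
        yield
    finally:
        log.append(f"exit {name}")

def safe_gen(name, log):
    # Enter and exit the context around every step, so nothing is left
    # active on the thread while the generator is suspended.
    for i in range(3):
        with tracked(name, log):
            log.append(f"{name} {i}")
        yield i

log = []
gen = safe_gen("stream1", log)
next(gen)

# While the generator is suspended, the context has already been exited:
print(log)  # ['enter stream1', 'stream1 0', 'exit stream1']
```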

Investigation

This happens because our context manager establishes the recording context by calling:

bindings.set_thread_local_data_recording(...)

But the context manager stays active on the thread after the generator yields, because the `with` block is held open by the suspended generator state. As a result, the recording context leaks out of the generator into the calling thread, and conversely, when the generator is resumed, it can pick up whatever context the caller happens to have active at that moment.

Metadata

Labels

sdk-python (Python logging API) · 🪳 bug (Something isn't working)
