Short name describing what triggered the graph break:
    Dynamic shape operator

Values or code snippet captured at the break point:
    str(cause.func)

Explanation of why the graph break was triggered:
    Operator {cause.func}'s output shape depends on input Tensor data.

Hints on how to resolve the graph break:
    Set torch._dynamo.config.capture_dynamic_output_shape_ops = True.

Example code that causes the graph break:
import torch

def fn():
    return torch.nonzero(torch.rand([10, 10]))
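A quick eager-mode sketch of why this breaks: torch.nonzero returns one row per nonzero element, so two tensors with identical shapes can produce outputs of different sizes. The output shape therefore cannot be known at trace time without inspecting the tensor's data.

```python
import torch

# Two tensors with the same shape but different contents.
a = torch.tensor([0.0, 1.0, 0.0, 2.0])
b = torch.tensor([0.0, 0.0, 0.0, 2.0])

# nonzero emits one row per nonzero element, so the output shape
# depends on the tensor's *values*, not just its shape.
print(torch.nonzero(a).shape)  # torch.Size([2, 1])
print(torch.nonzero(b).shape)  # torch.Size([1, 1])
```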
A sample workaround that shows how to fix this:

import torch

def fn():
    return torch.nonzero(torch.rand([10, 10]))

with torch._dynamo.config.patch(capture_dynamic_output_shape_ops=True):
    compiled_fn = torch.compile(fn, backend="eager", fullgraph=True)
    result = compiled_fn()
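For contrast, a sketch (assuming a PyTorch version where capture_dynamic_output_shape_ops defaults to False) showing that with fullgraph=True and the flag left unset, the graph break becomes a hard error rather than a silent fallback:

```python
import torch

def fn():
    return torch.nonzero(torch.rand([10, 10]))

# fullgraph=True turns the graph break into an exception when the
# dynamic-output-shape flag is left at its default (False).
compiled_fn = torch.compile(fn, backend="eager", fullgraph=True)
try:
    compiled_fn()
    raised = False
except Exception:
    raised = True
print("raised:", raised)
```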