🐛 Describe the bug

For this simple model:

```python
import torch
import torch.nn as nn

class DropoutModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(10, 10)
        self.dropout = nn.Dropout(p=0.1)

    def forward(self, x):
        x = self.linear(x)
        x = self.dropout(x)
        return x

    def get_example_inputs(self):
        return (torch.randn(10),)
```

running it through the aot_torchxla_trace_once dynamo backend (requires patch #88449) shows that the test and baseline runs drop out different elements. This may be an issue with the random seed.

Command:

```
USE_FAKE_TENSOR=0 GPU_NUM_DEVICES=1 python benchmarks/dynamo/torchbench.py --randomize-input --performance --trace-on-xla --training --backend=aot_torchxla_trace_once -n1 --only path:/pytorch/myscripts/model_collection.py,class:DropoutModel
```

Versions

..
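The random-seed hypothesis can be checked in isolation: each `nn.Dropout` call consumes RNG state, so two runs only sample the same mask if they start from the same seed. A minimal sketch in plain eager PyTorch (no XLA backend; the tensor size and seed values are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.1)
drop.train()  # dropout is only active in training mode
x = torch.ones(1000)

torch.manual_seed(1234)
a = drop(x)   # consumes RNG state
b = drop(x)   # RNG has advanced -> a different mask
torch.manual_seed(1234)
c = drop(x)   # same seed as the first call -> same mask

print(torch.equal(a, b))  # almost surely False: the masks differ
print(torch.equal(a, c))  # True: reseeding reproduces the mask
```

If the baseline and test paths in the benchmark harness do not restore the same RNG state before each forward pass (or if the XLA trace draws from a separate RNG), their dropout masks will diverge exactly as observed.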
cc @ezyang @gchanan @zou3519 @bdhirsh @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @desertfire