Ok, that's a weird one
Environment info
- `transformers` version: 4.1.1
- Platform: Linux-5.8.0-34-generic-x86_64-with-glibc2.32
- Python version: 3.9.0+
- PyTorch version (GPU?): 1.7.1 (False)
- Tensorflow version (GPU?): not installed (NA)
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no
Steps to reproduce the behavior:

Run the following script:

```python
import torch
import transformers

loss = torch.tensor([1.0], requires_grad=True)
loss.backward()
```
The script runs correctly but exits with

```
[1] 46823 segmentation fault (core dumped) python testcase.py
```

which doesn't happen if `import transformers` is commented out.
This only happens on Python 3.9; on 3.8 it works as expected.
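In case it helps narrow things down: the segfault message alone carries no stack trace, so a minimal sketch (standard library only) for capturing one with `faulthandler` would be:

```python
import faulthandler

# Dump the Python-level traceback to stderr if the process receives a
# fatal signal such as SIGSEGV; must be enabled before the crash occurs.
faulthandler.enable()

# The repro from above would follow here (packages assumed installed):
# import torch
# import transformers
# loss = torch.tensor([1.0], requires_grad=True)
# loss.backward()

print(faulthandler.is_enabled())  # → True
```

Running `testcase.py` with this enabled (or via `python -X faulthandler testcase.py`) should print where the segfault fires, which may show whether it happens during interpreter shutdown.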
Full env:

```
certifi==2020.12.5
chardet==4.0.0
click==7.1.2
filelock==3.0.12
idna==2.10
joblib==1.0.0
numpy==1.19.4
packaging==20.8
pyparsing==2.4.7
regex==2020.11.13
requests==2.25.1
sacremoses==0.0.43
six==1.15.0
tokenizers==0.9.4
torch==1.7.1
tqdm==4.54.1
transformers==4.1.1
typing-extensions==3.7.4.3
urllib3==1.26.2
```