🐛 Bug
The documentation [0] states that to enable sharded training one needs to install the extras packages with `pip install pytorch-lightning["extra"]`. In my case this did not install fairscale; only the second option, `pip install https://github.com/PyTorchLightning/fairscale/archive/pl_1.1.0.zip`, actually installed it.
[0] https://pytorch-lightning.readthedocs.io/en/stable/multi_gpu.html#model-parallelism-beta
To Reproduce
pip install pytorch-lightning["extra"]
Expected behavior
Fairscale gets installed.
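A quick way to check whether fairscale actually landed in the environment after the extras install (a minimal sketch using only the standard library; the helper name `is_installed` is mine, not from the report):

```python
import importlib.util

def is_installed(pkg: str) -> bool:
    # True if the package can be resolved on the current environment's path,
    # without importing it (avoids side effects of a full import)
    return importlib.util.find_spec(pkg) is not None

print(is_installed("fairscale"))  # prints False when the extras install skipped it
```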
Environment
- PyTorch Version: 1.7.0
- OS: Linux
- How you installed PyTorch: pip
- Python version: 3.6.9
- CUDA/cuDNN version: 10.2
- GPU models and configuration: 4x TITAN X