
Conversation

@trajepl (Owner) commented Jul 21, 2022

Merge from the official DeepSpeed repo into my repo.

RezaYazdaniAminabadi and others added 27 commits June 23, 2022 08:56
* Fix the half-precision version of CPU-Adam

* remove unexpected return

* fix the increased width (fp32/fp16)

* support fp16 tests for cpu-adam

* fix the fp16 data-loading

* change unit-test for fp16 check & slight change to parameter size

* fix for numpy error

Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Michael Wyatt <michaelwyatt@microsoft.com>
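The fp16 CPU-Adam commits above add a half-precision code path and tests for it. As a rough illustration (not the PR's actual unit test), the shape of such a check might look like the sketch below, assuming the `cpu_adam` extension builds on your machine:

```python
# Minimal sketch: exercise DeepSpeedCPUAdam with fp16 parameters on CPU.
# Not the PR's actual test; shapes and hyperparameters are illustrative.
import torch
from deepspeed.ops.adam import DeepSpeedCPUAdam

param = torch.nn.Parameter(torch.randn(64, 64, dtype=torch.half))
optimizer = DeepSpeedCPUAdam([param], lr=1e-3)

param.grad = torch.randn_like(param)  # fake fp16 gradient
optimizer.step()                      # the fp16 update path these commits fix
assert param.dtype == torch.half      # parameters stay half-precision
```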
* assert no FP16 with AMD CPUs

* add unit test for AMD assert error

* missing import

* downgrade assert to warning
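The AMD-related commits first assert, then merely warn, when fp16 CPU-Adam is requested on an AMD CPU. A hedged sketch of what such a guard could look like (the vendor detection here is illustrative, not DeepSpeed's exact logic):

```python
# Warn (rather than assert) when fp16 CPU-Adam runs on an AMD CPU.
import logging

def warn_if_fp16_on_amd(dtype, cpuinfo_path="/proc/cpuinfo"):
    if dtype != "fp16":
        return
    try:
        with open(cpuinfo_path) as f:
            is_amd = "AuthenticAMD" in f.read()
    except OSError:
        is_amd = False  # cannot tell the vendor; stay quiet
    if is_amd:
        logging.warning("FP16 CPU-Adam is not validated on AMD CPUs; "
                        "results may be incorrect.")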
* Fix missing import in replace_module.py

* Change import from torch.distributed to deepspeed.comm
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Michael Wyatt <michaelwyatt@microsoft.com>
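The `torch.distributed` to `deepspeed.comm` migration is mostly an import swap, since `deepspeed.comm` mirrors the `torch.distributed` API. A minimal sketch, meant to be run under a distributed launcher with a GPU available:

```python
# deepspeed.comm as a drop-in for torch.distributed: call sites unchanged.
import torch
import deepspeed
import deepspeed.comm as dist  # was: import torch.distributed as dist

deepspeed.init_distributed()   # initialize the comm backend (nccl by default)

t = torch.ones(1).cuda()
dist.all_reduce(t)             # same signature as torch.distributed.all_reduce
if dist.get_rank() == 0:
    print(f"world size = {dist.get_world_size()}, sum = {t.item()}")
```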
…oad (#2059)

Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
* Add github username to CODEOWNERS

* add import torch.distributed to small model debugging test script

* Replace torch.distributed with deepspeed.comm
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Conglong Li <conglong.li@gmail.com>
Co-authored-by: Cheng Li <pistasable@gmail.com>
* Shards expert parameter groups
* Do upscaling, the optimizer step, and deletion of fp32 grads one-by-one on each parameter group in ZeRO-2
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
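The point of the per-group change is peak memory: only one group's fp32 gradients are alive at a time. Illustrative pseudocode only, not DeepSpeed's actual ZeRO-2 implementation; `unscale` and `apply_update` are hypothetical helpers:

```python
# Process parameter groups sequentially so that fp32 gradients for at most
# one group exist at any moment, lowering peak memory versus materializing
# all groups before stepping.
def step_groups_one_by_one(param_groups, unscale, apply_update):
    for group in param_groups:
        fp32_grads = unscale(group)       # up-cast/unscale this group only
        apply_update(group, fp32_grads)   # optimizer step for this group
        del fp32_grads                    # free before touching the next group
```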
…2083)

Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Michael Wyatt <michaelwyatt@microsoft.com>
* [ds-inference] checkpoint loading => tqdm

solves two issues:
- less noise, thanks to a tqdm progress bar
- more informative: tells users how long to wait and how many shards will be loaded

New way:

```
Loading 72 checkpoints:  12%|█▎        | 9/72 [01:12<08:39,  8.25s/it]
```

* write only from one process

* style
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
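A sketch of the pattern these commits describe, not ds-inference's exact code: wrap shard loading in tqdm so users see the shard count and ETA, and emit the bar from one process only so parallel ranks don't interleave output:

```python
# Load checkpoint shards with a progress bar, shown only on rank 0.
import torch
from tqdm import tqdm

def load_shards(paths, rank=0):
    bar = tqdm(paths, desc=f"Loading {len(paths)} checkpoints",
               disable=(rank != 0))  # silence the bar on non-zero ranks
    return [torch.load(p, map_location="cpu") for p in bar]
```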
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: yaozhewei <zheweiy@berkeley.edu>
Co-authored-by: xiaoxiawu <yxiaoxiawu@microsoft.com>
Co-authored-by: Conglong Li <conglong.li@gmail.com>
Co-authored-by: Xiaoxia (Shirley) Wu <94406484+xiaoxiawu-microsoft@users.noreply.github.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Olatunji Ruwase <olruwase@microsoft.com>
Co-authored-by: Michael Wyatt <michaelwyatt@microsoft.com>
* fix hard-coded rocm install path

* added fix for newest torch+rocm install

* added fallback for when ROCm is not detected at all
Thanks a lot for finding this issue and fixing it :)
Co-authored-by: Zhewei Yao <zheweiyao@gmail.com>
Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
Co-authored-by: Xiaoxia (Shirley) Wu <94406484+xiaoxiawu-microsoft@users.noreply.github.com>
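The ROCm commits replace a hard-coded install path with layered detection. A hedged sketch of that idea (illustrative, not DeepSpeed's exact logic): prefer what torch reports, then a conventional install location, then give up gracefully:

```python
# Detect the ROCm version without hard-coding a single install path.
import os
import torch

def rocm_version():
    hip = getattr(torch.version, "hip", None)  # newer torch+rocm exposes this
    if hip:
        return hip
    info = os.path.join(os.environ.get("ROCM_PATH", "/opt/rocm"),
                        ".info", "version")
    if os.path.isfile(info):
        with open(info) as f:
            return f.read().strip()
    return None  # fallback: no ROCm detected at all
```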
* unit test, remove exception, add notes

* Move param_shapes to model files

* Remove hard-coded constants

* Conditioned to zero optimizer

* Add zero checkpoint merging

* Print checkpoint version

* Reshape zero_* ckpt files

* Merge zero* files contraction

* Utils for 3D contraction reshaping

* Remove bogus import

* Support bf16_zero ckpts

* Add param slice mappings

* Load universal checkpoints

* Per group mappings from Stas

* Hack to load bf16 zero files

* Param attributes

* WIP

* Fix api bug

* Update lp with local/remote hp

* Disable vocab padding handling

* Update z2 checkpoint

* Remove debug prints

* Remove debug prints; Rebase unit test

* Add reshape assert

* Padding

* Typo

* Catch nonexistent checkpoint path

* Cleanup

* Restore checkpoint state comparisons

* Add torch version guards

* More precise avoidance of false positives.

Co-authored-by: Jeff Rasley <jerasley@microsoft.com>
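The long series above reshapes and merges sharded zero_* checkpoint files into "universal" checkpoints. As a related, hedged example of the same kind of consolidation, DeepSpeed ships a utility that merges ZeRO shards into a single fp32 state dict, assuming `checkpoints/` holds a ZeRO checkpoint saved by DeepSpeed:

```python
# Merge sharded zero_* files into one consolidated fp32 state dict.
import torch
from deepspeed.utils.zero_to_fp32 import \
    get_fp32_state_dict_from_zero_checkpoint

state_dict = get_fp32_state_dict_from_zero_checkpoint("checkpoints/")
torch.save(state_dict, "consolidated_fp32.pt")
```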
@trajepl trajepl closed this Jul 21, 2022
@trajepl trajepl reopened this Jul 21, 2022
@trajepl trajepl merged commit bbd2bde into trajepl:master Jul 21, 2022