Closed
Labels: bug (Something isn't working)
Description
I'm trying to use the 1-Cycle scheduler, but I get the following error:
TypeError: FP16_DeepSpeedZeroOptimizer is not an Optimizer
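For context, this message matches the type check that PyTorch's LR schedulers run on the optimizer they receive: FP16_DeepSpeedZeroOptimizer apparently does not subclass torch.optim.Optimizer, so any scheduler performing that check rejects it. A minimal sketch of the mechanism (the wrapper class here is a hypothetical stand-in, not DeepSpeed code):

import torch
from torch.optim.lr_scheduler import LambdaLR

class WrappedOptimizer:
    # Hypothetical stand-in for an optimizer wrapper that does NOT
    # subclass torch.optim.Optimizer, as FP16_DeepSpeedZeroOptimizer
    # appears not to.
    param_groups = [{"lr": 3e-05}]

# Raises: TypeError: WrappedOptimizer is not an Optimizer
LambdaLR(WrappedOptimizer(), lr_lambda=lambda step: 1.0)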
Here is my configuration file:
{
    "train_batch_size": 64,
    "train_micro_batch_size_per_gpu": 1,
    "gradient_accumulation_steps": 16,
    "optimizer": {
        "type": "Adam",
        "params": {
            "lr": 3e-05,
            "betas": [0.9, 0.999],
            "eps": 1e-8,
            "weight_decay": 0.01
        }
    },
    "gradient_clipping": 0.1,
    "scheduler": {
        "type": "OneCycle",
        "params": {
            "cycle_first_step_size": 16000,
            "cycle_first_stair_count": 8000,
            "decay_step_size": 16000,
            "cycle_min_lr": 1e-06,
            "cycle_max_lr": 3e-05,
            "decay_lr_rate": 1e-07,
            "cycle_min_mom": 0.85,
            "cycle_max_mom": 0.99,
            "decay_mom_rate": 0.0
        }
    },
    "zero_optimization": true,
    "disable_allgather": true,
    "fp16": {
        "enabled": true,
        "loss_scale": 0,
        "min_loss_scale": 1
    }
}
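For reference, I launch training roughly like this (a minimal sketch; the model and config path are placeholders, and depending on the DeepSpeed version the config may instead be read from args.deepspeed_config):

import torch
import deepspeed

model = torch.nn.Linear(10, 10)  # placeholder model

# "ds_config.json" holds the configuration shown above.
model_engine, optimizer, _, lr_scheduler = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",
)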
When using another scheduler (with FP16), I have no problem.
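As a possible workaround (untested; it assumes a DeepSpeed version where deepspeed.initialize accepts an lr_scheduler callable and exposes OneCycle under deepspeed.runtime.lr_schedules), the scheduler could be built in client code instead of the JSON config, so it is constructed on whatever optimizer object DeepSpeed hands back:

import deepspeed
from deepspeed.runtime.lr_schedules import OneCycle

def build_scheduler(optimizer):
    # Same values as the "scheduler" block in the config above.
    return OneCycle(
        optimizer,
        cycle_min_lr=1e-06,
        cycle_max_lr=3e-05,
        decay_lr_rate=1e-07,
        cycle_first_step_size=16000,
        cycle_first_stair_count=8000,
        decay_step_size=16000,
        cycle_min_mom=0.85,
        cycle_max_mom=0.99,
        decay_mom_rate=0.0,
    )

model_engine, optimizer, _, lr_scheduler = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",  # with the "scheduler" section removed
    lr_scheduler=build_scheduler,
)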