Closed
Labels: bug (Something isn't working)
Description
A bug is encountered when using a client optimizer (Adam) together with ZeRO optimization:
```python
# Imports assumed from context; the DeepSpeedLight import path may differ
# depending on the DeepSpeed version in use.
from torch.optim import Adam
from torchvision.models import wide_resnet50_2

class Args:
    pass

model = wide_resnet50_2(num_classes=100).to(device)
optimizer = Adam(model.parameters(), lr=0.001)

ds_args = Args()
ds_args.local_rank = 0
ds_args.deepspeed_config = None

ds_config_params = {
    "train_batch_size": batch_size,
    "steps_per_print": len(train_loader),
    "fp16": {
        "enabled": True,
    },
    "zero_optimization": {
        "stage": 2,
    },
    # "zero_allow_untested_optimizer": True
}

model_engine = DeepSpeedLight(ds_args, model, optimizer=optimizer,
                              config_params=ds_config_params)
```

In this case `self.optimizer_name()` is `None` and the following check is incorrect:
```
AssertionError: You are using an untested ZeRO Optimizer. Please add <"zero_allow_untested_optimizer": true> in the configuration file to use it.
```
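To illustrate the problem, here is a minimal sketch of the kind of guard involved. The names (`check_zero_optimizer`, `ZERO_SUPPORTED_OPTIMIZERS`) are assumptions for illustration, not the actual DeepSpeed source: when an optimizer *instance* is passed by the client rather than named in the config, the optimizer name is `None`, so a name-based whitelist check cannot succeed and the assertion fires even for a well-tested optimizer such as Adam.

```python
# Hypothetical sketch (names assumed, not DeepSpeed's real code).
# ZERO_SUPPORTED_OPTIMIZERS stands in for the list of optimizer names
# ZeRO has been validated against.
ZERO_SUPPORTED_OPTIMIZERS = ["adam"]

def check_zero_optimizer(optimizer_name, zero_allow_untested_optimizer):
    """Return True if ZeRO may wrap this optimizer.

    optimizer_name is None when the caller passed an optimizer instance
    (a "client" optimizer) instead of naming one in the config, so the
    name-based whitelist check cannot apply in that case.
    """
    if optimizer_name is not None and optimizer_name in ZERO_SUPPORTED_OPTIMIZERS:
        return True
    # Client optimizers (optimizer_name is None) and unlisted names both
    # fall through to the explicit override flag.
    return zero_allow_untested_optimizer

# A named, tested optimizer passes without the override flag.
assert check_zero_optimizer("adam", False) is True
# A client optimizer (no name) only passes with the override enabled.
assert check_zero_optimizer(None, True) is True
assert check_zero_optimizer(None, False) is False
```

Under this reading, passing a client Adam instance reaches the `None` branch and requires `"zero_allow_untested_optimizer": true`, which is what the assertion message above reports.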