This proposed feature allows freezing selected parameters when finetuning from a checkpoint. Frozen parameters do not require gradients and are not updated during finetuning, which makes finetuning faster and more memory-efficient and helps avoid catastrophic forgetting.
This requires one new configuration key:
finetune_frozen_params: a list of parameter name prefixes to freeze during finetuning.
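A minimal sketch of how such prefix-based freezing could be applied in PyTorch is shown below. The helper name freeze_params_by_prefix and the way the prefixes reach the training code are assumptions for illustration; only the general requires_grad mechanism is standard PyTorch.

```python
import torch
from torch import nn


def freeze_params_by_prefix(model: nn.Module, frozen_prefixes: list[str]) -> None:
    """Disable gradients for parameters whose names start with any given prefix.

    Sketch of the proposed behaviour; the hook into the actual finetuning
    entry point is an assumption.
    """
    for name, param in model.named_parameters():
        if any(name.startswith(prefix) for prefix in frozen_prefixes):
            param.requires_grad = False


# Example: freeze the encoder and finetune only the decoder.
# "encoder." mirrors what a finetune_frozen_params entry might look like.
model = nn.ModuleDict({
    "encoder": nn.Linear(8, 8),
    "decoder": nn.Linear(8, 2),
})
freeze_params_by_prefix(model, ["encoder."])

# Only parameters that still require gradients are handed to the optimizer,
# so frozen parameters neither receive gradients nor consume optimizer state.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```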