Feature proposal: freeze specific parameters when finetuning #116

@yqzhishen

Description

This proposed feature allows freezing selected parameters when finetuning from a checkpoint. Frozen parameters do not require gradients and are not updated during finetuning, which makes finetuning faster and more memory-efficient and helps avoid catastrophic forgetting.

This requires one new configuration key (see the sketch below):

  • finetune_frozen_params: a list of parameter name prefixes to freeze during finetuning.
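
Since the issue does not include an implementation, here is a minimal sketch of how prefix-based freezing could work, assuming a PyTorch model and a plain dict-style config. The key name `finetune_frozen_params` comes from this proposal; the helper name `freeze_params` and the example prefixes are hypothetical.

```python
import torch.nn as nn

def freeze_params(model: nn.Module, frozen_prefixes: list[str]) -> None:
    """Disable gradients for parameters whose name starts with any listed prefix."""
    for name, param in model.named_parameters():
        if any(name.startswith(prefix) for prefix in frozen_prefixes):
            param.requires_grad = False  # excluded from gradient computation and updates

# Hypothetical usage:
# config = {"finetune_frozen_params": ["encoder.", "embed."]}
# freeze_params(model, config["finetune_frozen_params"])
```

With `requires_grad` disabled, the optimizer can additionally be built over `filter(lambda p: p.requires_grad, model.parameters())`, so frozen parameters never enter the optimizer state at all.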
