Add GPT-SW3 models to huggingface #20176

@ekgren

Description

Model description

At AI Sweden we are developing GPT models for the Nordic region. Languages include English, Swedish, Danish, Norwegian, and Icelandic.
The models belong to the GPT family and range in size from 126M to 20B parameters.
They are trained from scratch on a large corpus of 320B tokens using the NeMo Megatron framework, with a SentencePiece tokenizer.
The weights have not been shared yet; we intend to release them through Hugging Face and to publish our training process and results.

Open source status

  • The model implementation is available
  • The model weights are available

Provide useful links for the implementation

Training framework: https://developer.nvidia.com/nemo/megatron
