
[Mixtral] Fixes attention masking in the loss #29363

Merged

ArthurZucker merged 1 commit into huggingface:main from DesmonDay:fix_mixtral_loss on Mar 4, 2024

Conversation

@DesmonDay (Contributor) commented on Feb 29, 2024

What does this PR do?

I think there is something not quite correct in load_balancing_loss: the attention mask does not seem to be applied properly when computing the auxiliary loss.
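
For context, here is a minimal sketch of the kind of masking the title refers to (my illustration, not the exact diff merged here): in a Mixtral-style auxiliary load-balancing loss, the per-expert token fractions and mean router probabilities should be weighted by the attention mask so that padding tokens do not contribute to the statistics. The function name and the flattened `(num_tokens, num_experts)` logits layout are assumptions for clarity.

```python
import torch
import torch.nn.functional as F


def load_balancing_loss_sketch(router_logits, num_experts, top_k, attention_mask=None):
    # router_logits: (num_tokens, num_experts) gate logits, layers/batch flattened
    # attention_mask: (num_tokens,) with 1 for real tokens and 0 for padding
    routing_weights = torch.softmax(router_logits, dim=-1)
    _, selected_experts = torch.topk(routing_weights, top_k, dim=-1)
    # (num_tokens, top_k, num_experts) one-hot of the experts each token routed to
    expert_mask = F.one_hot(selected_experts, num_experts).float()

    if attention_mask is None:
        tokens_per_expert = expert_mask.mean(dim=0)           # (top_k, num_experts)
        router_prob_per_expert = routing_weights.mean(dim=0)  # (num_experts,)
    else:
        mask = attention_mask.float()
        denom = mask.sum()
        # Weight both statistics by the mask so padding tokens are ignored
        tokens_per_expert = (expert_mask * mask[:, None, None]).sum(dim=0) / denom
        router_prob_per_expert = (routing_weights * mask[:, None]).sum(dim=0) / denom

    # Switch-Transformer-style auxiliary loss: encourages uniform expert usage
    return num_experts * (tokens_per_expert.sum(dim=0) * router_prob_per_expert).sum()
```

Without the masked branch, padded positions are counted as real routed tokens, which biases both statistics toward whatever the router happens to do on padding.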

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@ArthurZucker (Collaborator) left a comment

Let's update the title to "Fixes attention masking in the loss".
LGTM otherwise

@DesmonDay changed the title from "Fix Mixtral load balancing loss" to "[Mixtral] Fixes attention masking in the loss" on Mar 1, 2024
@DesmonDay (Contributor, Author) commented on Mar 1, 2024

Hi, I have updated the title. Please help merge the pull request, thanks!

@ArthurZucker merged commit 39ef3fb into huggingface:main on Mar 4, 2024
