
Support for LLaMA models #160

Merged
pacman100 merged 1 commit into huggingface:main from zphang:llama on Mar 14, 2023

Conversation

@zphang (Contributor) commented Mar 8, 2023

Can be merged after huggingface/transformers#21955

@aeryncaen

Have you tested this? In my local testing previously, implementing this exact change, I ran into an issue where the trainers didn't like not having a pad token in the tokenizer.

@younesbelkada (Contributor)

@zoidbb does adding:

tokenizer.pad_token = tokenizer.eos_token

help solve your issue?

@zphang (Contributor, Author) commented Mar 8, 2023

Have you tested this? In my local testing previously, implementing this exact change, I ran into an issue where the trainers didn't like not having a pad token in the tokenizer.

Yes, but I tested with my own data pipeline/data collator.

@aeryncaen

@zoidbb does adding:

tokenizer.pad_token = tokenizer.eos_token

help solve your issue?

Oddly not, I tried this and it still complained. I'll provide more details this evening when I'm done with work.

@tloen commented Mar 12, 2023

@zoidbb you may also need:

tokenizer.pad_token_id = tokenizer.eos_token_id
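The workaround discussed in this thread can be sketched as follows. The LLaMA tokenizer ships without a pad token, so trainers that pad batches fail; the suggested fix is to reuse the EOS token for padding. `FakeLlamaTokenizer` below is a hypothetical stand-in used only so the sketch is self-contained; with transformers you would apply the same two assignments to the object returned by `LlamaTokenizer.from_pretrained(...)`.

```python
class FakeLlamaTokenizer:
    """Minimal stand-in mimicking a tokenizer that has no pad token.

    Hypothetical class for illustration only; the real tokenizer comes
    from transformers, e.g. LlamaTokenizer.from_pretrained(...).
    """
    def __init__(self):
        self.eos_token = "</s>"
        self.eos_token_id = 2
        self.pad_token = None      # this is what trainers complain about
        self.pad_token_id = None


tokenizer = FakeLlamaTokenizer()

# Reuse the EOS token for padding, as suggested in the thread above.
tokenizer.pad_token = tokenizer.eos_token
tokenizer.pad_token_id = tokenizer.eos_token_id

print(tokenizer.pad_token, tokenizer.pad_token_id)  # → </s> 2
```

Setting both the string attribute and the id attribute matters: some collators look up `pad_token`, others read `pad_token_id` directly, which is why both assignments come up in the comments above.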

@pacman100 (Contributor) left a comment

Thank you @zphang for adding this! 🤗

@pacman100 pacman100 merged commit 1c11bc0 into huggingface:main Mar 14, 2023
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025


5 participants