
Conversation

@stewartoallen

Adds feature parity with llama.cpp's main:

--promptFile — load the initial prompt from a file, useful for testing larger initial prompts that include embedded text
--batchSize — override the default batchSize

@stewartoallen stewartoallen changed the title add --batchSize and --promptFile options to the chat CLI command feat: add --batchSize and --promptFile options to the chat CLI command Jan 21, 2024
@stewartoallen stewartoallen changed the title feat: add --batchSize and --promptFile options to the chat CLI command feat(minor): add --batchSize and --promptFile options to the chat CLI command Jan 21, 2024
@giladgd
Member

giladgd commented Jan 22, 2024

@stewartoallen Thanks for the PR!
I've opened another PR based on your changes for the beta branch, since I think it'd be more beneficial to have these changes in the version 3 beta.
