@mattt mattt commented Dec 3, 2023

$ replicate stream --help
Alias for "prediction create --stream"

Usage:
  replicate stream <owner/model[:version]> [input=value] ... [flags]

Flags:
  -h, --help                      help for stream
      --json                      Emit JSON
      --no-wait                   Don't wait for prediction to complete
      --output-directory string   Output directory, defaults to ./{prediction-id}
      --save                      Save prediction outputs to directory
      --separator string          Separator between input key and value (default "=")
      --stream                    Stream prediction output
  -w, --wait                      Wait for prediction to complete (default true)
      --web                       View on web
$ replicate stream meta/llama-2-70b-chat prompt="Tell me a joke about llamas"
Sure, here's a joke about llamas for you:

Why did the llama refuse to play poker?

Because he always got fleeced!

I hope that made you smile! Is there anything else I can assist you with?
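The `--separator` flag exists because an input value may itself contain `=` (the default separator). A minimal sketch of the key/value splitting this flag controls, written in plain POSIX shell for illustration; this is not the CLI's actual implementation, and the variable names are hypothetical:

```shell
#!/bin/sh
# Split an "input=value" argument on the FIRST occurrence of the separator,
# so the value is free to contain the separator character itself.
sep="="
arg="prompt=Tell me a joke about llamas"
key="${arg%%"$sep"*}"     # everything before the first separator
value="${arg#*"$sep"}"    # everything after the first separator
printf '%s\n%s\n' "$key" "$value"
# prints:
# prompt
# Tell me a joke about llamas

# With a custom separator (replicate stream --separator ":"), a value
# containing "=" passes through intact:
sep=":"
arg2="prompt:He said x=1"
key2="${arg2%%"$sep"*}"
value2="${arg2#*"$sep"}"
printf '%s\n%s\n' "$key2" "$value2"
# prints:
# prompt
# He said x=1
```

Splitting on the first occurrence only is the conventional choice here, since the key never contains the separator but the value may.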

@mattt mattt marked this pull request as ready for review December 4, 2023 13:41
@mattt mattt merged commit e14440a into main Dec 4, 2023
@mattt mattt deleted the mattt/stream branch December 4, 2023 14:18
