Description
I was following the docs for implementing useChat, but the responses from the stream would not show up in the messages array returned by useChat — I would only see the user's prompts there, not the AI responses.
When I inspected the response from the chat endpoint, everything seemed fine: it was returning the correctly streamed AI response.
By inserting a log statement in the onError callback of useChat, I was able to get the following message:
error Error: Failed to parse stream string. No separator found.
at parseStreamPart (index.mjs:173:15)
at Array.map (<anonymous>)
at readDataStream (index.mjs:217:52)
at async parseComplexResponse (index.mjs:247:36)
at async callChatApi (index.mjs:374:12)
at async getStreamedResponse (index.mjs:541:12)
at async processChatStream (index.mjs:392:46)
at async eval (index.mjs:642:13)

Finally, I tried reverting to an older version, and there it worked just fine. By bumping versions one at a time, I found that the error first appears in version 3.0.20; reverting to 3.0.19 fixed the issue.
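The "No separator found" error comes from the SDK's stream-part parser: it expects each chunk in the form `TYPE:JSON` (for example `0:"Hello"`), and a plain-text chunk has no `:` separator. A simplified reconstruction of that check (my sketch, not the actual library source):

```typescript
// Simplified sketch of the stream-part framing the stack trace refers to
// (assumption: mirrors the AI SDK's parseStreamPart, not the real code).
// Each part looks like `0:"Hello"` — type code, ':' separator, JSON payload.
function parseStreamPart(line: string): { type: string; value: unknown } {
  const separatorIndex = line.indexOf(':');
  if (separatorIndex === -1) {
    // The failure seen above: a raw text chunk has no separator.
    throw new Error('Failed to parse stream string. No separator found.');
  }
  return {
    type: line.slice(0, separatorIndex),
    value: JSON.parse(line.slice(separatorIndex + 1)),
  };
}
```

Under this framing, a raw text chunk such as `Hello there` fails the check, which would explain why the plain text stream returned by StreamingTextResponse stopped being accepted in 3.0.20.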
Code example
API route
// Imports inferred from the snippet; package paths are assumed for the
// LangChain 0.1.x / ai 3.0.x era and may differ in your project.
import { NextRequest, NextResponse } from 'next/server';
import { StreamingTextResponse } from 'ai';
import { ChatOpenAI } from '@langchain/openai';
import { CohereEmbeddings } from '@langchain/cohere';
import { Pinecone } from '@pinecone-database/pinecone';
import { PineconeStore } from '@langchain/pinecone';
import { RunnableSequence } from '@langchain/core/runnables';
import { StringOutputParser, BytesOutputParser } from '@langchain/core/output_parsers';

// condenseQuestionPrompt, answerPrompt, combineDocumentsFn,
// formatVercelMessages, and the API keys/index name are defined elsewhere.

export async function POST(req: NextRequest) {
  try {
    const body = await req.json();
    const messages = body.messages ?? [];
    const previousMessages = messages.slice(0, -1);
    const currentMessageContent = messages[messages.length - 1].content;

    const model = new ChatOpenAI({
      modelName: 'gpt-3.5-turbo-1106',
      temperature: 0.2,
      apiKey: OPENAI_API_KEY,
      streaming: true
    });

    // Rephrase the latest question into a standalone question.
    const standaloneQuestionChain = RunnableSequence.from([
      condenseQuestionPrompt,
      model,
      new StringOutputParser()
    ]);

    const pinecone = new Pinecone({ apiKey: PINECONE_API_KEY });
    const pineconeIndex = pinecone.Index(PINECONE_INDEX_NAME);
    const vectorstore = await PineconeStore.fromExistingIndex(
      new CohereEmbeddings({ model: 'multilingual-22-12' }),
      { pineconeIndex }
    );
    const retriever = vectorstore.asRetriever();
    const retrievalChain = retriever.pipe(combineDocumentsFn);

    // Answer the standalone question using retrieved context.
    const answerChain = RunnableSequence.from([
      {
        context: RunnableSequence.from([
          input => input.question,
          retrievalChain
        ]),
        chat_history: input => input.chat_history,
        question: input => input.question
      },
      answerPrompt,
      model
    ]);

    const conversationalRetrievalQAChain = RunnableSequence.from([
      {
        question: standaloneQuestionChain,
        chat_history: input => input.chat_history
      },
      answerChain,
      new BytesOutputParser()
    ]);

    const stream = await conversationalRetrievalQAChain.stream({
      question: currentMessageContent,
      chat_history: formatVercelMessages(previousMessages)
    });
    return new StreamingTextResponse(stream);
  } catch (e: any) {
    return NextResponse.json({ error: e.message }, { status: e.status ?? 500 });
  }
}

Client component
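The API route above calls a formatVercelMessages helper whose implementation isn't shown in the issue. A minimal hypothetical sketch of what such a helper might look like (the shape is assumed, not taken from the reporter's code):

```typescript
// Hypothetical sketch of the formatVercelMessages helper referenced in the
// API route above; the issue does not include its real implementation.
// It flattens Vercel AI SDK message objects into a chat-history string.
type VercelChatMessage = { role: string; content: string };

function formatVercelMessages(messages: VercelChatMessage[]): string {
  return messages
    .map(m => (m.role === 'user' ? `Human: ${m.content}` : `Assistant: ${m.content}`))
    .join('\n');
}
```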
'use client';

import { useChat } from 'ai/react';
import type { FormEvent } from 'react';
import { Box, Button, Container, TextField, Typography } from '@mui/material';

export default function Chat() {
  const {
    messages,
    input,
    handleInputChange,
    handleSubmit,
    isLoading: chatEndpointIsLoading
  } = useChat();

  async function sendMessage(e: FormEvent<HTMLFormElement>) {
    e.preventDefault();
    if (!messages.length) {
      await new Promise(resolve => setTimeout(resolve, 300));
    }
    if (chatEndpointIsLoading) {
      return;
    }
    handleSubmit(e);
  }

  return (
    <Container>
      {messages.length === 0 ? 'No messages' : ''}
      {messages.length > 0
        ? [...messages].reverse().map(m => (
            <Box key={m.id}>
              <Typography>{m.content}</Typography>
            </Box>
          ))
        : ''}
      <Container>
        <form onSubmit={sendMessage}>
          <TextField value={input} onChange={handleInputChange} />
          <Button type="submit">Send</Button>
        </form>
      </Container>
    </Container>
  );
}

Additional context
No response