Closed
Labels
- area:execution-model (Related to Streamlit's execution model)
- feature:st.chat_message (Related to the `st.chat_message` element)
- feature:st.spinner (Related to the `st.spinner` element)
- priority:P3 (Medium priority)
- status:confirmed (Bug has been confirmed by the Streamlit team)
- type:bug (Something isn't working as expected)
Description
Checklist
- I have searched the existing issues for similar issues.
- I added a very descriptive title to this issue.
- I have provided sufficient information below to help reproduce this issue.
Summary
I have a LangChain RAG chatbot app where the user can ask the AI assistant questions. A bug I consistently run into is that the previous AI response is duplicated when the user submits a new question. The duplicate appears faded for some reason, and it disappears once the new AI message is displayed in the app. Please see the screenshot below:
Reproducible Code Example
```python
import streamlit as st
import uuid
from langchain_community.chat_message_histories import StreamlitChatMessageHistory
from langchain_core.runnables.history import RunnableWithMessageHistory

# Generate random uuid for session_id
if 'session_id' not in st.session_state:
    st.session_state.session_id = str(uuid.uuid4())
session_id = st.session_state.session_id

# Create Streamlit configuration with session_id
config = {"configurable": {"session_id": session_id}}

# Create Streamlit chat message history
msgs = StreamlitChatMessageHistory(key="chat_messages")

# Create LLM chains, where the final chain has the Streamlit chat message history
[code cut for brevity...]
conversational_rag_chain = RunnableWithMessageHistory(
    rag_chain,
    lambda session_id: msgs,
    input_messages_key="input",
    history_messages_key="chat_history",
    output_messages_key="answer",
)

# Show chat history in the Streamlit UI
for msg in msgs.messages:
    if msg.type == "ai":
        st.chat_message(msg.type, avatar="images/assistant.png").write(msg.content.replace("$", "\\$"))
    else:
        st.chat_message(msg.type).write(msg.content.replace("$", "\\$"))

# Get user input and show it (executes for each user input)
if user_input := st.chat_input("Ask a question:"):
    st.chat_message("human").write(user_input.replace("$", "\\$"))
    # Begin AI response code block
    with st.chat_message("ai", avatar="images/assistant.png"):
        with st.spinner("Thinking..."):
            response_container = st.empty()
            # Get AI response (config is needed to pass the Streamlit session_id)
            response = conversational_rag_chain.invoke(
                {"input": user_input},
                config=config,
            )
            # Fill the empty container with the AI response; replace "$" with "\\$" to prevent LaTeX formatting
            response_container.write(response["answer"].replace("$", "\\$"))
```

Steps To Reproduce
No response
Expected Behavior
No response
Current Behavior
No response
Is this a regression?
- Yes, this used to work in a previous version.
Debug info
- Streamlit version:
- Python version:
- Operating System:
- Browser:
Additional Information
This bug was reported on the Streamlit forum, but a fix was never provided: https://discuss.streamlit.io/t/old-response-still-displaying-faded/68553