This repository was archived by the owner on Mar 7, 2026. It is now read-only.

Added Send Button and Dropdown in Chatbot Homelayout #382

Merged
joseplayero merged 9 commits into reorproject:main from dheerajsingh89:Fixes/Adding_Send_And_DropdownButton
Aug 29, 2024

Conversation

@dheerajsingh89
Contributor

@dheerajsingh89 dheerajsingh89 commented Aug 27, 2024

/claim #381
Fixes : #381

After these changes, the home layout looks like the screenshot below.

reor3


@greptile-apps greptile-apps bot left a comment


PR Summary

This pull request adds a send button and LLM model selection dropdown to the chat interface, addressing issue #381. Key changes include:

  • Modified src/components/Chat/ChatMessages.tsx to implement new UI elements and functionality
  • Added LLM model fetching and selection state management in ChatMessages.tsx
  • Updated electron/main/llm/llmConfig.ts to include a fallback for empty LLM configurations
  • Exposed ipcRenderer.invoke in electron/preload/index.ts for main process communication

Key points to consider:

  • Ensure proper error handling for LLM model fetching and selection
  • Verify the implementation of the send button logic in ChatMessages.tsx
  • Review the security implications of exposing ipcRenderer.invoke in index.ts
  • Check if the UI layout matches the requested design in the issue description
  • Confirm that the context selector button has been repositioned as specified

3 file(s) reviewed, 5 comment(s)

Collaborator

@joseplayero joseplayero left a comment


Thanks for making this @itsdheerajdp.

You do not need to expose the invoke function to the renderer process. You only need to call await window.llm.getLLMConfigs() to get the LLM configs, along with the equivalent get/set default LLM functions.

Also, please keep the code clean and consistent: state variables go at the top of a component.
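The pattern the reviewer is asking for can be sketched as a narrow renderer-side API wrapped around ipcRenderer.invoke, so components never touch invoke directly. The makeLLMApi helper and the channel names below are hypothetical, not reor's actual code; the invoke function is injected here so the sketch stays self-contained, whereas the real preload script would wire it up with contextBridge.exposeInMainWorld.

```typescript
// Hedged sketch: wrap ipcRenderer.invoke behind a small typed API
// so components call window.llm.getLLMConfigs() instead of invoke.
// Channel names ('get-llm-configs', etc.) are assumptions.
type Invoke = (channel: string, ...args: unknown[]) => Promise<unknown>

export function makeLLMApi(invoke: Invoke) {
  return {
    getLLMConfigs: () => invoke('get-llm-configs'),
    getDefaultLLMName: () => invoke('get-default-llm-name'),
    setDefaultLLM: (name: string) => invoke('set-default-llm', name),
  }
}

// In the preload script, this would be exposed roughly as:
//   contextBridge.exposeInMainWorld('llm', makeLLMApi(ipcRenderer.invoke))
// so the renderer only sees the three named functions, never raw invoke.
```

Keeping the exposed surface this narrow is the usual Electron security practice: the renderer can only reach the specific IPC channels the preload script chose to forward.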

@dheerajsingh89
Contributor Author

@joseplayero I made the requested changes, kindly check it out.

@joseplayero joseplayero merged commit 90d8435 into reorproject:main Aug 29, 2024


Development

Successfully merging this pull request may close these issues.

Add send button to chat window & dropdown to choose LLM
