
Fix token budget + Clarify code#839

Merged
glahaye merged 3 commits into microsoft:main from glahaye:fix_budget
Mar 9, 2024
Conversation

@glahaye
Copy link
Contributor

@glahaye glahaye commented Mar 6, 2024

Motivation and Context

The user message is not added to the GPT completion request when the token count of the prompt exceeds 50% of the configured token limit.

This fix is inspired by #836

Description

Fix budget calculation and clarify / simplify code.
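As a minimal illustration of the kind of budget bug being fixed (the function names and numbers below are assumptions for the sketch, not the actual Chat Copilot code), a check that drops the user message based on a fixed fraction of the total limit, instead of the budget actually remaining, can silently omit messages that would easily fit:

```python
def remaining_budget(token_limit: int, prompt_tokens: int, response_reserve: int) -> int:
    """Tokens still available for new messages after accounting for the
    prompt so far and the space reserved for the model's response."""
    return token_limit - prompt_tokens - response_reserve


def fits_in_budget(token_limit: int, prompt_tokens: int,
                   response_reserve: int, message_tokens: int) -> bool:
    # Buggy variant (for contrast): drop the user message whenever the
    # prompt already uses more than half the limit, regardless of how
    # many tokens the message itself needs:
    #     return prompt_tokens <= token_limit / 2
    # Corrected variant: admit the message whenever it fits in what remains.
    return message_tokens <= remaining_budget(token_limit, prompt_tokens, response_reserve)
```

With a 4096-token limit, a 2500-token prompt, and 596 tokens reserved for the response, a 200-token user message fits (1000 tokens remain), yet the buggy half-the-limit check would have rejected it.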

Contribution Checklist

@glahaye glahaye requested review from crickman and teresaqhoang March 6, 2024 01:23
@github-actions github-actions bot added the webapi Pull requests that update .net code label Mar 6, 2024
@glahaye glahaye requested a review from TaoChenOSU March 6, 2024 01:23
@glahaye glahaye self-assigned this Mar 6, 2024
@tdechanterac
Copy link

@glahaye when I try your PR changes with my large prompt, the chat completion endpoint returns the following choices:

    "choices": [
        {
            "finish_reason": "tool_calls",
            "index": 0,
            "message": {
                "role": "assistant",
                "content": null,
                "tool_calls": [
                    {
                        "id": "call_1pn6T4POraqkRVYODVYjYXEK",
                        "type": "function",
                        "function": {
                            "name": "ChatPlugin-Chat",
                            "arguments": "{\"message\": \"The candidate has strong sales experience and has worked internationally, which aligns with the job posting. However, they lack specific SDR experience in a SaaS environment.\", \"userId\": \"RecruitmentAnalysis\", \"userName\": \"HR Specialist\", \"chatId\": \"recruitment_analysis\", \"messageType\": \"text\"}"
                        }
                    },
                    {
                        "id": "call_OrXJOYstW7Iudz7Mon9VgT2S",
                        "type": "function",
                        "function": {
                            "name": "ChatPlugin-ExtractChatHistory",
                            "arguments": "{\"chatId\": \"recruitment_analysis\", \"tokenLimit\": 2000}"
                        }
                    }
                ]
            },
            "content_filter_results": {}
        }
    ]

And in the ChatAsync method, an ArgumentException is thrown in SetSystemDescriptionAsync because the chatId provided does not exist. I didn't have this issue before, so I'm not sure whether this is related or whether I should open a separate issue.
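For readers puzzled by the null `content` in the response above: when `finish_reason` is `"tool_calls"`, the assistant returns function invocations instead of text, and the caller is expected to execute them before continuing the conversation. A minimal sketch of reading such a choice (plain dict parsing of the JSON shown; the helper name is an assumption):

```python
import json


def extract_tool_calls(choice: dict):
    """Return (name, arguments) pairs when the model asked for tool calls,
    otherwise the plain text content of the assistant message."""
    message = choice["message"]
    if choice.get("finish_reason") == "tool_calls":
        return [
            # "arguments" is a JSON-encoded string, so decode it per call.
            (call["function"]["name"], json.loads(call["function"]["arguments"]))
            for call in message.get("tool_calls", [])
        ]
    return message["content"]
```

Applied to the choice above, this yields the two requested calls (`ChatPlugin-Chat` and `ChatPlugin-ExtractChatHistory`) with their decoded argument dicts.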

@glahaye
Copy link
Contributor Author

glahaye commented Mar 6, 2024

> And in the ChatAsync method, an ArgumentException is thrown in SetSystemDescriptionAsync because the chatId provided does not exist. I didn't have this issue before, so I'm not sure whether this is related or whether I should open a separate issue.

I don't believe this is related.

@glahaye glahaye added this pull request to the merge queue Mar 9, 2024
Merged via the queue into microsoft:main with commit 8ca758c Mar 9, 2024
@glahaye glahaye deleted the fix_budget branch March 9, 2024 00:05
teamleader-dev pushed a commit to vlink-group/chat-copilot that referenced this pull request Oct 7, 2024
### Motivation and Context
The user message is not added to the GPT completion request when the
token count of the prompt exceeds 50% of the configuration.

This fix is inspired by microsoft#836

### Description
Fix budget calculation and clarify / simplify code.

### Contribution Checklist
- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the [Contribution
Guidelines](https://github.com/microsoft/chat-copilot/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/chat-copilot/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄
kb0039 pushed a commit to aaronba/chat-copilot that referenced this pull request Jan 8, 2025