Closed
Labels
bug (Something isn't working), python (Pull requests for the Python Semantic Kernel)
Description
Describe the bug
Calling invoke_stream raises the following exception in the context:
Error occurred while invoking stream function: 'PromptTemplate' object has no attribute 'render_messages'
It seems an `else` is missing in `_local_stream_func` (it was there before #4491):
semantic-kernel/python/semantic_kernel/orchestration/kernel_function.py
Lines 253 to 267 in dcd7e14
```python
async def _local_stream_func(client, prompt_execution_settings, context):
    if client is None:
        raise ValueError("AI LLM service cannot be `None`")
    if not function_config.has_chat_prompt:
        try:
            prompt = await function_config.prompt_template.render(context)
            result = client.complete_stream(prompt, prompt_execution_settings)
            async for chunk in result:
                yield chunk
        except Exception as e:
            # TODO: "critical exceptions"
            context.fail(str(e), e)
    try:
```
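The faulty control flow can be reproduced in isolation. The sketch below uses simplified stand-ins (the `PromptTemplate`, `Context`, and stream-function definitions here are illustrative stubs, not the real Semantic Kernel types) to show how, without the `else`, a non-chat prompt falls through into the chat branch and triggers the `render_messages` AttributeError, and how restoring the `else` avoids it:

```python
import asyncio

class PromptTemplate:
    """Stub non-chat template: it has `render` but no `render_messages`."""
    async def render(self, context):
        return "rendered prompt"

class Context:
    """Stub context that records the last failure, like SKContext.fail."""
    def __init__(self):
        self.last_exception = None
    def fail(self, message, exception):
        self.last_exception = exception

async def buggy_stream_func(template, has_chat_prompt, context):
    # As in the current code: no `else`, so a non-chat prompt streams its
    # output and then still falls through into the chat branch below.
    if not has_chat_prompt:
        try:
            yield await template.render(context)
        except Exception as e:
            context.fail(str(e), e)
    try:
        # Chat branch: raises AttributeError for a plain PromptTemplate.
        messages = await template.render_messages(context)
        for m in messages:
            yield m
    except Exception as e:
        context.fail(str(e), e)

async def fixed_stream_func(template, has_chat_prompt, context):
    # With the missing `else` restored, exactly one branch runs.
    if not has_chat_prompt:
        try:
            yield await template.render(context)
        except Exception as e:
            context.fail(str(e), e)
    else:
        try:
            messages = await template.render_messages(context)
            for m in messages:
                yield m
        except Exception as e:
            context.fail(str(e), e)

async def drain(agen):
    return [chunk async for chunk in agen]

buggy_ctx = Context()
buggy_chunks = asyncio.run(drain(buggy_stream_func(PromptTemplate(), False, buggy_ctx)))
# buggy_ctx.last_exception is the AttributeError about `render_messages`

fixed_ctx = Context()
fixed_chunks = asyncio.run(drain(fixed_stream_func(PromptTemplate(), False, fixed_ctx)))
# fixed_ctx.last_exception is None; only the non-chat branch ran
```

Both variants yield the same streamed chunks for a non-chat prompt; the only difference is that the buggy one additionally records the `render_messages` failure on the context, which is exactly the symptom reported above.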
To Reproduce
- Execute `invoke_stream` on any function of your choice
- Check `context.last_exception` for the message
Expected behavior
No exception should be raised
Platform
- OS: WSL
- IDE: VS Code
- Language: Python
- Source: 0.5.0
Metadata
Status
Sprint: Done