Python: Include a `function_invoke_attempt` index with Streaming CMC (#10009)

### Motivation and Context

During auto function calling, we're yielding all messages back without any indication as to which invocation index they are related to. This information could be helpful to the caller for understanding the order in which message chunks were received during the auto function invocation loop. Depending upon the behavior of auto function calling, the `request_index` iterates up to `maximum_auto_invoke_attempts`. Today the caller doesn't know which auto function invoke attempt they're currently on, so simply handing back all yielded messages can be confusing. In a follow-up PR, we will add the `request_index` (perhaps with a different name) to make it easier to know which streaming message chunks to concatenate, which should help reduce the confusion down the line.

### Description

This PR adds:

- The `function_invoke_attempt` attribute to the `StreamingChatMessageContent` class. This can help callers/users track which streaming chat message chunks belong to which auto function invocation attempt.
- A new keyword argument on `_inner_get_streaming_chat_message_contents` that allows the `function_invoke_attempt` int to be passed through to the `StreamingChatMessageContent` creation in each AI service. This **additive** keyword argument should not break existing implementations.
- Updated unit tests.
- Four new samples showcasing auto function calling: streaming auto invoke / manual invoke (print tool calls), and non-streaming auto invoke / manual invoke (print tool calls). These samples allow one to specify any AI service that supports function calling, as listed in the samples.
- Closes #10006

### Contribution Checklist

- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone 😄
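To illustrate how a caller might use the new attribute, here is a minimal sketch of grouping streamed chunks by their `function_invoke_attempt` index. The `Chunk` dataclass is a hypothetical stand-in for `StreamingChatMessageContent`, reduced to the two fields the example needs; it is not the actual semantic-kernel class.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Chunk:
    """Hypothetical stand-in for StreamingChatMessageContent."""
    content: str
    function_invoke_attempt: int


def group_by_attempt(chunks: list[Chunk]) -> dict[int, str]:
    """Concatenate streamed chunk text per auto function invocation attempt."""
    grouped: dict[int, list[str]] = defaultdict(list)
    for chunk in chunks:
        grouped[chunk.function_invoke_attempt].append(chunk.content)
    return {attempt: "".join(parts) for attempt, parts in grouped.items()}


# Simulated stream spanning two auto invoke attempts.
stream = [
    Chunk("Let me call", 0),
    Chunk(" the weather tool.", 0),
    Chunk("It is ", 1),
    Chunk("sunny today.", 1),
]
print(group_by_attempt(stream))
# {0: 'Let me call the weather tool.', 1: 'It is sunny today.'}
```

Because the index arrives on every chunk, callers no longer need to infer attempt boundaries from tool-call markers when reassembling streamed messages.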