
Python: Upgrade Minimum Onnx Version to enable MacOS Unit Tests #9981

Open

wants to merge 3 commits into base: main

Conversation

nmoeller
Contributor

@nmoeller nmoeller commented Dec 16, 2024

Closes: #9979

Enabled test cases for macOS
Upgraded the uv.lock file

Motivation and Context

Version 0.4.0 did not provide a pip package for macOS, which forced us to disable unit tests on macOS.
With version 0.5.0 available, we can enable the unit tests for macOS.

Using version 0.5.0 will enable the following features for users:

  1. Phi-3.5 and Phi-3.5 MoE
  2. macOS support without building the code from source
  3. LoRA adapter swapping

Contribution Checklist

enabled test cases for mac os
upgraded uv.lock file
@markwallace-microsoft markwallace-microsoft added the python Pull requests for the Python Semantic Kernel label Dec 16, 2024
@github-actions github-actions bot changed the title Upgrade Minimum Onnx Version to enable MacOS Unit Tests Python: Upgrade Minimum Onnx Version to enable MacOS Unit Tests Dec 16, 2024
Member

@eavanvalkenburg eavanvalkenburg left a comment


Very nice, don't we also have integration tests for onnx? Any updates needed there?

@moonbox3
Contributor

Very nice, don't we also have integration tests for onnx? Any updates needed there?

@nmoeller thanks for handling this. To Eduard's question, I do see the following in our integration tests:

skip_on_mac_available = platform.system() == "Darwin"
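
In the integration tests, a flag like this is typically paired with a `pytest.mark.skipif` marker. A minimal sketch of that pattern (the test name and reason string are illustrative, not taken from the repository):

```python
import platform

import pytest

# True on macOS, where the onnxruntime-genai 0.4.0 wheel was unavailable
skip_on_mac_available = platform.system() == "Darwin"


@pytest.mark.skipif(skip_on_mac_available, reason="ONNX wheel unavailable on macOS before 0.5.0")
def test_onnx_chat_completion() -> None:
    # Placeholder body; the real test would exercise OnnxGenAIChatCompletion
    assert True
```

With the 0.5.0 upgrade, the flag (and the markers that depend on it) can be removed so these tests run on macOS as well.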

@nmoeller
Contributor Author

@eavanvalkenburg @moonbox3 I am currently working on this. I also want to fully integrate my tests into your CI/CD and am currently exploring options.

I think I would have to adjust the GitHub Actions workflows to download the ONNX model.
So I am thinking about adding the following steps to your CI/CD integration test pipeline:

  1. Install hugging-face[cli]
  2. Download a Phi-3.5 model onto the agent
  3. Set the ONNX environment variable to the downloaded folder

But I am currently trying to set up the integration pipeline on my fork to test my CI/CD changes.
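
The three steps above could also be sketched in Python with the huggingface_hub package (which ships with the CLI). This is a hypothetical sketch only: the repo id and the environment-variable name below are assumptions, not confirmed names from the pipeline.

```python
import os


def download_phi35_onnx(target_dir: str) -> str:
    """Download a Phi-3.5 ONNX model and point the connector at it."""
    # Step 1 equivalent: requires `pip install "huggingface_hub[cli]"`
    from huggingface_hub import snapshot_download

    # Step 2: download the model onto the build agent (repo id is an assumption)
    model_dir = snapshot_download(
        repo_id="microsoft/Phi-3.5-mini-instruct-onnx",
        local_dir=target_dir,
    )
    # Step 3: point the connector at the downloaded folder (env var name is an assumption)
    os.environ["ONNX_GEN_AI_CHAT_MODEL_FOLDER"] = model_dir
    return model_dir
```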

@eavanvalkenburg
Member

@nmoeller thanks for this work. Adding integration tests is a bit complex because they first have to be merged in before they run, since they have access to all the keys, etc. So I would focus on the new code in this PR, and then we can collaborate on adding an integration test for this separately. If you want to have a look: we just split the Ollama tests into separate files, both to reduce the time taken for the merge and to make debugging easier; I would suggest the same here.

@nmoeller nmoeller marked this pull request as ready for review December 18, 2024 14:24
@nmoeller nmoeller requested review from a team as code owners December 18, 2024 14:24
@markwallace-microsoft
Member

markwallace-microsoft commented Dec 18, 2024

Python Test Coverage

Python Test Coverage Report

| File  | Stmts | Miss | Cover | Missing |
|-------|-------|------|-------|---------|
| TOTAL | 16775 | 1849 | 89%   |         |

report-only-changed-files is enabled. No files were changed during this commit :)

Python Unit Test Overview

| Tests | Skipped | Failures | Errors | Time |
|-------|---------|----------|--------|------|
| 2966  | 4 💤    | 0 ❌     | 0 🔥   | 1m 20s ⏱️ |

Member

@eavanvalkenburg eavanvalkenburg left a comment


A couple of small notes: could you also add the same marks for onnx as are in the test_chat_completions for ollama (and add that tag to pyproject)?
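
For reference, the requested marker pattern might look like this; the `onnx` tag name mirrors the existing `ollama` one and is an assumption, as is the pyproject entry mentioned in the comment:

```python
import pytest

# Module-level marker: lets CI select or deselect these tests with `-m onnx`.
# The tag must also be declared under [tool.pytest.ini_options] markers
# in pyproject.toml, alongside the existing ollama tag.
pytestmark = pytest.mark.onnx
```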

@@ -22,6 +21,8 @@
from semantic_kernel.connectors.ai.google.vertex_ai import VertexAIChatCompletion, VertexAIChatPromptExecutionSettings
from semantic_kernel.connectors.ai.mistral_ai import MistralAIChatCompletion, MistralAIChatPromptExecutionSettings
from semantic_kernel.connectors.ai.ollama import OllamaChatCompletion, OllamaChatPromptExecutionSettings
from semantic_kernel.connectors.ai.onnx import OnnxGenAIChatCompletion, OnnxGenAIPromptExecutionSettings
Member


Ideally, ONNXTemplate is also available from .onnx; could you check and make the change if needed?

@@ -22,6 +21,8 @@
from semantic_kernel.connectors.ai.google.vertex_ai import VertexAIChatCompletion, VertexAIChatPromptExecutionSettings
from semantic_kernel.connectors.ai.mistral_ai import MistralAIChatCompletion, MistralAIChatPromptExecutionSettings
from semantic_kernel.connectors.ai.ollama import OllamaChatCompletion, OllamaChatPromptExecutionSettings
from semantic_kernel.connectors.ai.onnx import OnnxGenAIChatCompletion, OnnxGenAIPromptExecutionSettings
from semantic_kernel.connectors.ai.onnx.utils import ONNXTemplate
Member


Suggested change:
- from semantic_kernel.connectors.ai.onnx.utils import ONNXTemplate
+ from semantic_kernel.connectors.ai.onnx import ONNXTemplate

Labels
python Pull requests for the Python Semantic Kernel
Development

Successfully merging this pull request may close these issues.

Python: Upgrade Onnx Connector to Version 0.5.0
4 participants