github-actions bot changed the title "New Feature: Support ResponseFormat in OllamaPromptExecutionSettings" → "Python: New Feature: Support ResponseFormat in OllamaPromptExecutionSettings" on Dec 10, 2024
github-actions bot changed the title "Python: New Feature: Support ResponseFormat in OllamaPromptExecutionSettings" → ".Net: New Feature: Support ResponseFormat in OllamaPromptExecutionSettings" on Dec 10, 2024
For anyone trying to stay on the cutting edge with structured outputs: just use the OpenAI connector, point the client URL at "localhost:xyzport/v1", and set the API key to "ollama". The OpenAI execution settings then still work, even though the requests go to the local Ollama instance.
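A minimal sketch of the workaround above, using only the standard library to show the request shape. The port (11434, Ollama's default) and model name ("llama3.1") are assumptions; substitute your own. Ollama exposes an OpenAI-compatible endpoint under /v1 and accepts any non-empty API key:

```python
import json

# Point an OpenAI-style client at the local Ollama instance.
# Port 11434 is Ollama's default; the key "ollama" is a placeholder
# that the server ignores but OpenAI clients require.
base_url = "http://localhost:11434/v1"
api_key = "ollama"

# The same request body the OpenAI connector would send, including the
# structured-output setting carried by the OpenAI execution settings.
request_body = {
    "model": "llama3.1",  # assumed model name
    "messages": [{"role": "user", "content": "List three colors as JSON."}],
    "response_format": {"type": "json_object"},
}

print(json.dumps(request_body, indent=2))
```

With the OpenAI connector configured this way, `response_format` flows through unchanged because the local endpoint speaks the OpenAI wire format.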
Now that Ollama supports structured outputs, OllamaPromptExecutionSettings could support ResponseFormat, similar to OpenAIPromptExecutionSettings.