Thanks for creating llamabot. I'm enjoying experimenting with LLMs programmatically.
Question for you -- As I understand it, Ollama runs the LLM locally (on localhost), and llamabot talks to Ollama (presumably through something like a REST API).
Would it be possible to extend llamabot to use LMStudio's server mode? LMStudio's UI is very clean and easy to use, arguably easier than Ollama's minimal menubar interface.
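For what it's worth, LM Studio's server mode exposes an OpenAI-compatible chat-completions endpoint on localhost, so a client-side sketch needs nothing llamabot-specific. The URL, port, and model name below are assumptions (LM Studio shows the actual values in its server tab); this is just a stdlib illustration of what the integration would talk to, not llamabot's actual implementation:

```python
import json
import urllib.request

# Assumed default for LM Studio's local server -- check the server tab
# in the LM Studio UI for the real host/port on your machine.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"


def build_payload(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,  # LM Studio serves whatever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt: str) -> str:
    """POST the prompt to the local LM Studio server and return the reply text."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible servers return choices[0].message.content
    return body["choices"][0]["message"]["content"]
```

Since the wire format is the same OpenAI shape, supporting LM Studio might be as simple as letting the user override the API base URL rather than adding a whole new backend.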
Just a thought on how to make the "onboarding process" for llamabot even easier.