Replies: 8 comments 12 replies
-
$ git clone https://github.com/hqnicolas/devika
-
To make it work, you'll need to launch Ollama as described. I'll explain each step, assuming you haven't installed Ollama yet. If that's the case:
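The steps above usually boil down to something like the following, sketched here for Linux with Ollama's official install script (adjust for your platform; `llama3` is just an example model):

```shell
# Install Ollama via the official install script (Linux/macOS).
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server; by default it listens on http://localhost:11434
ollama serve &

# Pull a local model for devika to use, e.g. llama3
ollama pull llama3
```

Once the server is up, devika should be pointed at `http://localhost:11434` (or the container hostname if you run it dockerized, as mentioned further down).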
-
Being able to access the API from Ollama does NOT mean it will work. You'll encounter all kinds of issues with the local model itself, since it doesn't produce the same output as GPT-4.
-
Would like to see an easy setup feature for Ollama.
-
Running the dockerized version, I had to set the Ollama endpoint to "http://ollama-service:11434", since the container running Ollama on the devika-subnetwork is reachable through the hostname "ollama-service". After that, all good. Time to play with local llama3 :)
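For anyone hitting the same issue: a minimal docker-compose sketch of that setup might look like the fragment below. The service and network names (`ollama-service`, `devika-subnetwork`) follow the comment above; the `OLLAMA_ENDPOINT` environment variable is hypothetical, since devika actually reads its endpoint from its own config, but it illustrates the point: inside the compose network, containers resolve each other by service name, not `localhost`.

```yaml
# Sketch only; adapt names and build context to your checkout.
services:
  ollama-service:
    image: ollama/ollama
    ports:
      - "11434:11434"
    networks:
      - devika-subnetwork

  devika:
    build: .
    environment:
      # Hypothetical variable; the key point is the hostname:
      # use the compose service name, not localhost.
      - OLLAMA_ENDPOINT=http://ollama-service:11434
    networks:
      - devika-subnetwork

networks:
  devika-subnetwork:
```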
-
still can't run devika with local ollama
Beta Was this translation helpful? Give feedback.
-
Have you tried OpenDevin?
…On Wed, 14 Aug 2024, 01:16 Alberto Torres wrote:
> still can't run devika with local ollama
-
I need help with this configuration.