-
The code example comes from https://abrahimzaman360.medium.com/introduction-to-autollm-c8cd31be2a5f. After running it, it throws the following error:
Any ideas? Thanks.
-
Maybe add more documentation on query_engine?
-
Hello @yangboz. Please refer to the readme for the most up-to-date usage: https://github.com/safevideo/autollm?tab=readme-ov-file#create-a-query-engine-in-seconds
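For reference, a minimal sketch following the pattern in the linked readme section. The function names (`read_files_as_documents`, `AutoQueryEngine.from_defaults`), the input directory, and the sample question are taken from or modeled on the readme and should be verified against the current version:

```python
from autollm import AutoQueryEngine, read_files_as_documents

# Load local files as documents (the path is a placeholder).
documents = read_files_as_documents(input_dir="path/to/documents")

# Build a query engine with default settings; the defaults use OpenAI,
# so an OpenAI API key must be available in the environment.
query_engine = AutoQueryEngine.from_defaults(documents=documents)

response = query_engine.query("What does autollm do?")
print(response.response)
```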
-
Setting the previous error aside: after exporting OPEN_AI_KEY and calling queryEngine.query, I get the following error:
Any idea? Thanks.
You just have to provide an Ollama model name as
llm_model="ollama/llama2"
and the Ollama endpoint URL as llm_api_base="http://localhost:11434"
in query_engine. Feel free to open a PR to add this information to the readme :) @yangboz
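Putting the two parameters from this reply together with the readme pattern above, a minimal sketch might look like the following. Only llm_model and llm_api_base come from the reply; the surrounding from_defaults call, document loading, and sample query are assumptions and may need adjusting:

```python
from autollm import AutoQueryEngine, read_files_as_documents

documents = read_files_as_documents(input_dir="path/to/documents")

# Point autollm at a locally running Ollama server instead of OpenAI.
# llm_model and llm_api_base are the parameters mentioned above;
# the rest of the call follows the readme's default usage.
query_engine = AutoQueryEngine.from_defaults(
    documents=documents,
    llm_model="ollama/llama2",
    llm_api_base="http://localhost:11434",
)

response = query_engine.query("Summarize these documents.")
print(response.response)
```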