Hello @wukl001! I'm here to help you solve bugs, answer questions, and assist you in becoming a contributor. I'm a bot designed to support you while you wait for a human maintainer. Let’s tackle your issue together!
I found some similar discussions that might be relevant to your issue.
Regarding your specific issue, the "IndexError: list index out of range" means that the available_llms list is empty when get_default_llm tries to read its first element. That usually points to an incorrect or incomplete model_settings.yaml: make sure at least one language model is configured under LLM_MODEL_CONFIG and MODEL_PLATFORMS, and that those models are actually set up and reachable [3].
Additionally, the connection error with the Xinference server points to a malformed server URL: the logged address http://:192.168,1.5:9997 has a stray colon after "//" and a comma instead of a dot inside the IP, so it should presumably read http://192.168.1.5:9997. Fix the URL format and confirm the server is running and reachable at that address; a hedged example configuration is sketched below.
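For orientation, here is a rough sketch of what a pared-down model_settings.yaml might look like when Xinference is the only platform. Treat it as an illustration only: the key names follow the Langchain-Chatchat 0.3.x template and may differ in your release, the model names are placeholders, and the server address is simply the one from your log with the punctuation corrected.

```yaml
# Hypothetical minimal fragment of model_settings.yaml. Key names are based on the
# Langchain-Chatchat 0.3.x template and can differ between releases -- compare
# against the template shipped with your installed version before copying anything.
# The address 192.168.1.5:9997 comes from the log above; the model names are
# placeholders for whatever you have actually launched in Xinference.

DEFAULT_LLM_MODEL: glm4-chat               # must match a model running in Xinference
DEFAULT_EMBEDDING_MODEL: bge-large-zh-v1.5 # likewise for the embedding model

MODEL_PLATFORMS:
  - platform_name: xinference
    platform_type: xinference
    # No colon after "//", and dots (not commas) inside the IP address.
    # Keep whatever path suffix (e.g. /v1) your version's template uses.
    api_base_url: http://192.168.1.5:9997/v1
    api_key: EMPTY
    auto_detect_model: true   # let chatchat list the models served by Xinference
    llm_models: []
    embed_models: []
```

Once at least one LLM is reachable through MODEL_PLATFORMS, available_llms should no longer be empty and get_default_llm should stop raising the IndexError.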
Running chatchat kb -r to initialize the knowledge base failed; the error said Ollama's port 11434 was not open. I went into model_settings.yaml, deleted every configuration except the xinference one, and initialized again, and the error became even more bizarre:
2025-04-22 02:45:04.157 | WARNING | chatchat.server.utils:detect_xf_models:109 - error when connect to xinference server(http://:192.168,1.5:9997): Failed to parse: http://:192.168,1.5:9997/v1/cluster/auth
from chatchat.server.knowledge_base.utils import KnowledgeFile
File "/root/anaconda3/envs/langchain/lib/python3.11/site-packages/chatchat/server/knowledge_base/utils.py", line 19, in
from chatchat.server.utils import run_in_process_pool, run_in_thread_pool
File "/root/anaconda3/envs/langchain/lib/python3.11/site-packages/chatchat/server/utils.py", line 220, in
model_name: str = get_default_llm(),
^^^^^^^^^^^^^^^^^
File "/root/anaconda3/envs/langchain/lib/python3.11/site-packages/chatchat/server/utils.py", line 206, in get_default_llm
f"using {available_llms【0】} instead")IndexError: list index out of range