Issues: alexrozanski/LlamaChat

Support downloadable models
Labels: enhancement (New feature or request), ux-improvements (Improvements to the app's UI or UX)
#6 opened Apr 12, 2023 by umaar (milestone: v2.0)

Failed to load model for eachadea/ggml-vicuna-7b-1.1
Labels: bug (Something isn't working)
#15 opened Apr 16, 2023 by fakechris (milestone: v2.0)

Feature Request: detect models in folder (and subfolders)
Labels: enhancement (New feature or request), future (Potential future support)
#27 opened Apr 25, 2023 by mkellerman (milestone: Future)

Support for Falcon
Labels: future (Potential future support), models (Additional support for other models)
#31 opened Jun 11, 2023 by alelordelo

Support for Open LLaMa?
Labels: models (Additional support for other models)
#32 opened Jun 22, 2023 by KnowledgeGarden (milestone: v2.0)

Add iOS support?
#33 opened Jul 13, 2023 by realcarlos

Support ggmlv3
#34 opened Jul 19, 2023 by jingsam

Error using pth format model
#36 opened Jul 30, 2023 by realalexsun

Installation environment problem
#37 opened Aug 14, 2023 by p5ydn0

[Feature Request] Support InternLM
#40 opened Aug 29, 2023 by vansinhu

Support for AMD GPU (macOS)
#42 opened Sep 4, 2023 by sukualam

Ollama support
#43 opened Oct 8, 2023 by aptonline

Broken link in README
#44 opened Nov 15, 2023 by mdznr

LlamaChat is spouting gibberish
#45 opened Nov 16, 2023 by jdblack

Llama 3 support
#46 opened Apr 23, 2024 by pentateu