Issues: ollama/ollama
Could you please support deepseek v2?
model request · #4703 · opened May 29, 2024 by netspym
Introduce regular support releases
feature request · #4702 · opened May 29, 2024 by dcasota
Quick model updates with ollama pull
feature request · #4701 · opened May 29, 2024 by LaurentBonnaud
Computing context embeddings, instead of averaging token embeddings
feature request · #4699 · opened May 29, 2024 by Demirrr
ValueError: Error raised by inference API HTTP code: 500, {"error":"failed to generate embedding"}
#4698 · opened May 29, 2024 by uzumakinaruto19
Problems with using multiple GPUs
bug · #4696 · opened May 29, 2024 by yuwencool
0.1.39 doesn't work with Nvidia Xavier NX
bug · #4693 · opened May 29, 2024 by ZanMax
Suppress spinner option or add file output option
feature request · #4686 · opened May 28, 2024 by tyson-nw
Model download finally fails behind company firewall
bug · #4684 · opened May 28, 2024 by berndgoetz
Please support Baichuan series models
feature request · #4678 · opened May 28, 2024 by Han-Huaqiao
ollama create test -f ./Modelfile error ==> Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
bug · #4676 · opened May 28, 2024 by farmountain
Error: llama runner process has terminated: exit status 0xc0000409
bug · #4675 · opened May 28, 2024 by FreemanFeng
Any command but serve gets errors when using a proxy
bug · #4674 · opened May 28, 2024 by lingfengchencn
llama3 8b BF16 error
bug · importing · #4670 · opened May 27, 2024 by ccbadd
The ability to log the generated text response
feature request · #4669 · opened May 27, 2024 by MarkWard0110
Add the contents of "/api/ps" endpoint to "docs/api.md"
feature request · #4667 · opened May 27, 2024 by mili-tan
ollama doesn't create a model from a Modelfile and gives an error
bug · #4666 · opened May 27, 2024 by tMrMorgan