Hi, we currently have no plans or bandwidth to implement Ollama support, but we may consider it in the (near?) future. Nevertheless, feel free to start on this if you are interested in contributing!
Regarding quantized models, all of our models are listed here and available here; they all come with GGUF and AWQ variants.
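In the meantime, a downloaded GGUF variant can typically be imported into Ollama by hand via a Modelfile. This is only a sketch: `model-q4.gguf` and `my-model` below are placeholder names, not actual release artifacts from this project.

```shell
# Point a Modelfile at a locally downloaded GGUF file.
# "model-q4.gguf" is a placeholder for whichever GGUF variant you downloaded.
cat > Modelfile <<'EOF'
FROM ./model-q4.gguf
EOF

# Register it with Ollama under a local name, then run it.
ollama create my-model -f Modelfile
ollama run my-model "Hello"
```

This only wraps an existing GGUF file; it is not the same as first-class Ollama support (official library entries, templates, and parameters), which is what this issue asks for.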
It would be great to have Ollama support (it now supports JSON mode) and quantized 4-bit models.
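For reference, Ollama's JSON mode is enabled by passing `"format": "json"` in a generate request. A minimal sketch against a locally running Ollama server; `my-model` is a placeholder model name:

```shell
# Ask a locally served model ("my-model" is a placeholder) for JSON-only output.
# "format": "json" constrains the response to valid JSON.
curl http://localhost:11434/api/generate -d '{
  "model": "my-model",
  "prompt": "List two primary colors as a JSON array under the key \"colors\".",
  "format": "json",
  "stream": false
}'
```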