Since you guys were so awesome to allow local models via Ollama (and OpenAI-compatible endpoints too, though I couldn't get those to work yet; the docs don't really help :(

I tried changing the autocomplete model, but it doesn't allow Anthropic, and using Ollama also didn't yield any results.

Is there a way to change it in settings.json, or do we need to hack Cody?

I'd like to swap the autocomplete model for my fine-tuned and merged DeepSeek Coder V2, which has a higher context window and is more project-specific.
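For reference, this is roughly what I've been trying in settings.json. This is only a sketch based on what I could piece together: the property names `cody.autocomplete.advanced.provider` and `cody.autocomplete.experimental.ollamaOptions` are assumptions from Cody's experimental Ollama support and may have changed between versions, and the model name is just my local Ollama tag:

```json
{
  // Assumption: Cody's experimental Ollama autocomplete provider.
  // These keys may differ across Cody versions.
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    // Default Ollama endpoint; adjust if your server runs elsewhere.
    "url": "http://localhost:11434",
    // Hypothetical tag for my merged DeepSeek Coder V2 model.
    "model": "deepseek-coder-v2"
  }
}
```

With something like this, Cody would presumably send completion requests to the local Ollama server instead of the hosted provider, but I haven't been able to confirm it takes effect.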