bug: Autocomplete Issues with Experimental Ollama Provider #5581

Open · sascharo opened this issue on Sep 15, 2024 · 0 comments
Labels: bug (Something isn't working), clients/vscode

sascharo commented on Sep 15, 2024

Version

sourcegraph.cody-ai v1.35.1726240556 (pre-release)
VS Code 1.94.0-insider (user setup)
Ollama v0.3.10

Describe the bug

When using the following settings:

"cody.autocomplete.advanced.provider": "experimental-ollama",
"cody.autocomplete.experimental.ollamaOptions": {
    "model": "granite-code:8b-base-q8_0",
    "url": "http://localhost:11434"
}

Autocomplete with Cody's experimental Ollama provider does not work as expected at all indentation levels. For example:

  • Using granite-code:8b-base-q8_0 with Cody, suggestions only appear at the root indentation level. If the cursor is at an inner indentation level (e.g., within a function body, as in the sketch below), no suggestions are provided unless I hit backspace to move the cursor back to the root level.
  • Using the yi-coder:9b-base-q8_0 and deepseek-coder-v2:16b-lite-base-q6_K models with Cody's autocomplete gives inconsistent results. There's a ~50% chance of getting a suggestion at the correct indentation level, but only the first token is suggested rather than a complete line of code. Going back to the root indentation level results in no suggestions at all, unlike when using the Granite models.

In comparison, using these models with the Continue.continue extension provides correct autocomplete functionality, including suggestions at all indentation levels.

I made sure that no other extensions providing inline suggestions were active.
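
A minimal sketch of the kind of buffer where this reproduces (a hypothetical Python file, chosen arbitrarily; the cursor comments are annotations, not part of the code):

def moving_average(values, window):
    result = []
    for i in range(len(values) - window + 1):
        # cursor here (inner indentation level): granite-code:8b-base-q8_0
        # yields no suggestion until I backspace to column 0
        pass

# cursor here (root indentation level): suggestions appear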

Expected behavior

Autocomplete should function consistently at all indentation levels, providing suggestions regardless of where the cursor is placed in the code. Additionally, models such as yi-coder:9b-base-q8_0 and deepseek-coder-v2:16b-lite-base-q6_K should provide complete code suggestions rather than partial tokens, and granite-code:8b-base-q8_0 should behave as it does in the Continue.continue extension, offering suggestions at any indentation level without manual cursor adjustment.

Additional context

The issues were experienced while running Cody's experimental Ollama provider on the following configurations:

A

"cody.autocomplete.advanced.provider": "experimental-ollama",
"cody.autocomplete.experimental.ollamaOptions": {
    "model": "granite-code:8b-base-q8_0",
    "url": "http://localhost:11434"
}

B

"cody.autocomplete.advanced.provider": "experimental-ollama",
"cody.autocomplete.experimental.ollamaOptions": {
    "model": "yi-coder:9b-base-q8_0",
    "url": "http://localhost:11434"
}

C

"cody.autocomplete.advanced.provider": "experimental-ollama",
"cody.autocomplete.experimental.ollamaOptions": {
    "model": "deepseek-coder-v2:16b-lite-base-q6_K",
    "url": "http://localhost:11434"
}

Both the Continue extension and Cody's experimental autocomplete are using the same model configurations.
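
To help isolate whether the models or the extension are at fault, the models can also be queried directly through Ollama's /api/generate REST endpoint, bypassing both extensions (a minimal sketch; the prompt is an arbitrary example, not taken from my setup):

import json
import urllib.request

# Same endpoint as in the settings above.
URL = "http://localhost:11434/api/generate"

payload = {
    "model": "granite-code:8b-base-q8_0",  # swap in the yi-coder / deepseek-coder-v2 tags to compare
    "prompt": "def moving_average(values, window):\n    ",
    "stream": False,
    "options": {"num_predict": 64},  # request a multi-token completion
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])

If the models return full multi-token completions when queried this way, the single-token suggestions seen with yi-coder and deepseek-coder-v2 would point at Cody's prompting or post-processing rather than the models themselves.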
