bug: "Use Cody offline with Ollama" button not working #5572

Open
jp0707 opened this issue Sep 13, 2024 · 3 comments
Labels: bug (Something isn't working), clients/vscode

Comments


jp0707 commented Sep 13, 2024

Version

1.34.2

Describe the bug

"Use Cody offline with Ollama" button does not do anything from the sign-in screens.

Steps to reproduce:

  1. Install the Cody VS Code extension fresh (clean install)
  2. Set up Ollama (guide); a quick connectivity check is sketched below, after the screenshot
  3. Go offline (disconnect from the internet)
  4. Restart VS Code
  5. Open the Cody panel in the VS Code sidebar. It should show the "Cody could not start due to a connection issue." screen with a "Use Cody offline with Ollama" button
  6. Click on "Use Cody offline with Ollama"
[Screenshot attached: Screenshot 2024-09-13 at 2 42 54 PM]
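Step 2 assumes Ollama is serving on its default port before you go offline. Here is a minimal, hypothetical sketch (not part of Cody; it just hits Ollama's /api/tags endpoint at the default http://localhost:11434) to confirm the local server is up and see which models are pulled:

```ts
// check-ollama.ts — sanity check that a local Ollama instance is reachable.
// Assumes the default Ollama endpoint http://localhost:11434; adjust if yours differs.
// Run with Node 18+ (built-in fetch), e.g. `npx tsx check-ollama.ts`.

async function checkOllama(baseUrl = "http://localhost:11434"): Promise<void> {
  try {
    // GET /api/tags lists the models that have been pulled locally.
    const res = await fetch(`${baseUrl}/api/tags`);
    if (!res.ok) {
      console.error(`Ollama responded with HTTP ${res.status}`);
      process.exit(1);
    }
    const data = (await res.json()) as { models?: Array<{ name: string }> };
    const names = (data.models ?? []).map(m => m.name);
    console.log(`Ollama reachable; local models: ${names.join(", ") || "(none pulled yet)"}`);
  } catch (err) {
    console.error(`Could not reach Ollama at ${baseUrl}. Is \`ollama serve\` running?`, err);
    process.exit(1);
  }
}

checkOllama();
```

If this prints at least one model, the Ollama side of the setup is fine and the problem is isolated to the button.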

Expected behavior

Clicking the button should take the user to the appropriate UI to start using Cody offline with Ollama.

Additional context

Debug Logs

█ AuthProvider:init:lastEndpoint: Token recovered from secretStorage https://sourcegraph.com/
█ CodyLLMConfiguration: {}
█ ModelsService: Setting primary models: []
█ telemetry-v2: recordEvent: cody.auth/failed  {
  "parameters": {
    "version": 0
  },
  "timestamp": "2024-09-13T18:39:45.058Z"
}
█ featureflag: refreshed
█ ContextFiltersProvider: fetchContextFilters  {}
█ ChatsController:constructor: init
█ CodyCompletionProvider:notSignedIn: You are not signed in.
█ CodyCompletionProvider:notSignedIn: You are not signed in.
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ChatController: updateViewConfig  {
  "agentIDE": "VSCode",
  "agentExtensionVersion": "1.34.2",
  "uiKindIsWeb": false,
  "serverEndpoint": "https://sourcegraph.com/",
  "experimentalNoodle": false,
  "smartApply": true,
  "webviewType": "sidebar",
  "multipleWebviewsEnabled": true,
  "internalDebugContext": false
}
█ ContextFiltersProvider: fetchContextFilters  {}
█ UpstreamHealth: Failed to ping upstream host  {
  "error": {
    "message": "request to https://sourcegraph.com/healthz failed, reason: getaddrinfo ENOTFOUND sourcegraph.com",
    "type": "system",
    "errno": "ENOTFOUND",
    "code": "ENOTFOUND"
  }
}
█ ContextFiltersProvider: fetchContextFilters  {}
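For what it's worth, the UpstreamHealth line above is the expected failure when offline: the DNS lookup for sourcegraph.com fails with ENOTFOUND. A hypothetical standalone sketch (not Cody's actual code) that reproduces the same failure mode with Node's built-in fetch:

```ts
// offline-probe.ts — reproduces the ENOTFOUND failure recorded by UpstreamHealth above.
// Hypothetical illustration only; Cody's real health check may differ.
// Run with Node 18+ (built-in fetch).

async function probeUpstream(endpoint: string): Promise<void> {
  try {
    const res = await fetch(new URL("/healthz", endpoint).toString());
    console.log(`upstream reachable: HTTP ${res.status}`);
  } catch (err) {
    // With Node's built-in fetch (undici), a DNS failure surfaces as a TypeError
    // whose `cause` carries the getaddrinfo error, e.g. { code: "ENOTFOUND" }.
    const cause = (err as { cause?: { code?: string } }).cause;
    if (cause?.code === "ENOTFOUND") {
      console.log("offline: DNS lookup failed (ENOTFOUND), matching the log above");
    } else {
      console.log("request failed for another reason:", err);
    }
  }
}

probeUpstream("https://sourcegraph.com");
```

Presumably exactly this case is the signal that should enable the offline/Ollama path instead of the button doing nothing.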




jp0707 commented Sep 13, 2024

This is currently blocking me from using Cody with Ollama. Is there any other way to bypass the log-in screen so I can start using Cody with local Ollama?

@sascharo

> This is currently blocking me from using Cody with Ollama. Is there any other way to bypass the log-in screen so I can start using Cody with local Ollama?

What about switching to an offline model after you logged in?

@jyoti-re-qr

> What about switching to an offline model after you logged in?

But that requires me to log in in the first place. I'd like to avoid having to log in just to use an offline model :)
