
Add support for custom OpenAI base url (LocalAI integration) #268

Open
0x326 opened this issue Feb 8, 2024 · 4 comments
Labels
feature New feature or request pending triage

Comments

@0x326

0x326 commented Feb 8, 2024

Feature request

Adding an OpenAI base URL setting would allow integration with https://localai.io/, a locally run API server that is compatible with the OpenAI API specification.
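A minimal sketch of the idea: because LocalAI mirrors the OpenAI REST paths, a client only needs a configurable base URL. `OPENAI_BASE_URL` here is a hypothetical setting name for illustration, not an existing aicommits option, and port 8080 is LocalAI's documented default.

```shell
# Hypothetical setting: fall back to the official endpoint when unset.
BASE_URL="${OPENAI_BASE_URL:-https://api.openai.com/v1}"
echo "$BASE_URL/chat/completions"
# With OPENAI_BASE_URL=http://localhost:8080/v1, the same client code
# would hit the local LocalAI server instead.
```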

Why?

Running the AI server locally provides enhanced data privacy.

Alternatives

No response

Additional context

No response

@0x326 0x326 added feature New feature or request pending triage labels Feb 8, 2024
@0x326 0x326 changed the title Add support for custom OpenAI url (LocalAI integration) Add support for custom OpenAI base url (LocalAI integration) Feb 8, 2024
@joshuacox

maybe the most important feature request I've seen

@dangnhdev

In the meantime:

  1. Locate the npm package for aicommits. On Windows, with a standard Node.js installation, it's C:\Program Files\nodejs\node_modules\aicommits
  2. Edit /dist/cli.mjs: open this file, find api.openai.com, and replace it with your URL. There is only one occurrence of this string.
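Step 2 can be done as a one-liner. A sketch, demonstrated on a stand-in file (in practice the target is dist/cli.mjs under the path from step 1, and localhost:8080 is just an example local address):

```shell
# Demo on a stand-in file; point this at the real dist/cli.mjs instead.
printf 'const url = "https://api.openai.com/v1";\n' > cli.mjs
sed -i 's|api.openai.com|localhost:8080|' cli.mjs
cat cli.mjs   # → const url = "https://localhost:8080/v1";
```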

@iHunterDev

To find the global node_modules path (step 1 above), run npm root -g

@joshuacox

Well, almost. I also had to set up nginx and tell Node to ignore my self-signed cert, and change out gpt-3.5-turbo for a model I had already pulled with ollama. I did this in Nix with something like this:

{ config, lib, pkgs, ... }:
let
in
{
  services = {
    nginx = {
      enable = true;
      virtualHosts = let
        base = locations: {
          inherit locations;
          forceSSL = true;
          #enableACME = true;
        };
        proxy = port: base {
          "/".proxyPass = "http://127.0.0.1:" + toString(port) + "/";
        };
      in {
        # Reverse-proxy https://localhost to ollama on 127.0.0.1:11434
        "localhost" = proxy 11434 // {
          default = true; 
          sslCertificate = /etc/localhost/localhost.pem;
          sslCertificateKey = /etc/localhost/localhost-key.pem;
        };
      };
    };
    ollama = {
      enable = true;
      acceleration = "cuda";
    };
  };
  systemd.services = {
    ollama.serviceConfig.DynamicUser = lib.mkForce false;
  };
  environment.systemPackages = with pkgs; [ 
    ollama
  ];
}
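The two non-Nix pieces of the workaround above can be sketched in shell. NODE_TLS_REJECT_UNAUTHORIZED=0 is Node's standard switch for skipping TLS certificate verification (which is what lets it accept the self-signed nginx cert); the model swap is shown on a stand-in file, with llama2 standing in for whatever model you pulled with ollama.

```shell
# Node reads this env var and skips TLS certificate verification,
# so requests to nginx with a self-signed cert succeed.
export NODE_TLS_REJECT_UNAUTHORIZED=0

# Swap the hardcoded model name (demo on a stand-in file; the real
# target is aicommits' dist/cli.mjs).
printf 'model: "gpt-3.5-turbo"\n' > cli.mjs
sed -i 's|gpt-3.5-turbo|llama2|' cli.mjs
```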
