
Local LLM multi-GPU support #1391

Merged
merged 10 commits into from
Jan 12, 2024
Conversation

rounak610
Collaborator

@rounak610 rounak610 commented Jan 12, 2024

Description

Added multi-GPU support to the local LLMs feature.
Command to build without local LLMs: docker compose -f docker-compose.yaml up --build
Command to build with local LLMs: docker compose -f docker-compose-gpu.yml up --build
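For context, multi-GPU access in a Compose file is typically granted through a `deploy.resources.reservations.devices` block. The sketch below is an illustration of that standard Compose syntax, not the exact contents of this PR's `docker-compose-gpu.yml`; the service name `local-llm` is a placeholder.

```yaml
# Hypothetical excerpt illustrating how a service can reserve all
# available NVIDIA GPUs via the Compose GPU device reservation syntax.
services:
  local-llm:            # placeholder service name
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # expose every GPU; use an integer to limit
              capabilities: [gpu]
```

With `count: all`, the container sees every GPU on the host, which is what enables multi-GPU inference; a specific device list can be pinned instead with `device_ids`.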

Related Issues

Solution and Design

Test Plan

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Docs update

Checklist

  • My pull request is atomic and focuses on a single change.
  • I have read the contributing guide and my code conforms to the guidelines.
  • I have documented my changes clearly and comprehensively.
  • I have added the required tests.

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ rounak610
❌ Ubuntu


Ubuntu does not appear to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have already signed the CLA but the status is still pending? Let us recheck it.


codecov bot commented Jan 12, 2024

Codecov Report

Attention: 1 line in your changes is missing coverage. Please review.

Comparison is base (ea2a0b6) 58.64% compared to head (c4e856a) 58.67%.
Report is 4 commits behind head on main.

Files Patch % Lines
superagi/llms/openai.py 94.44% 0 Missing and 1 partial ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1391      +/-   ##
==========================================
+ Coverage   58.64%   58.67%   +0.02%     
==========================================
  Files         230      230              
  Lines       11199    11213      +14     
  Branches     1209     1212       +3     
==========================================
+ Hits         6568     6579      +11     
- Misses       4292     4294       +2     
- Partials      339      340       +1     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

@rounak610 rounak610 changed the title Local llm fixes Local llm mutli-gpu support Jan 12, 2024
Dockerfile Outdated Show resolved Hide resolved
@Fluder-Paradyne Fluder-Paradyne merged commit 7411a01 into main Jan 12, 2024
8 of 9 checks passed