forked from ax-llm/ax
bugfix/fix issue unbounded func #1
Open
TYRONEMICHAEL wants to merge 127 commits into main from bugfix/fix-issue-unbounded-func
Conversation
* Implement Lares Smart Home Assistant Using llmclient
* Fix comment in overview
* Hide the location of the dog from the LLM
Added a new request rate controller to prevent hitting rate limits; it can be used with any model or prompt. For Groq it is added by default.
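The rate controller described above could look something like the following sliding-window sketch. This is a hypothetical illustration of the idea, not the actual ax implementation; the class and method names are invented.

```typescript
// Hypothetical sketch (not the ax implementation): a sliding-window
// rate limiter a client could consult before each request.
class RateLimiter {
  private timestamps: number[] = [];

  constructor(
    private maxRequests: number, // requests allowed per window
    private windowMs: number     // window length in milliseconds
  ) {}

  // Returns true and records the request if it fits in the window.
  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have fallen out of the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.maxRequests) return false;
    this.timestamps.push(now);
    return true;
  }
}

// Allow 2 requests per 1000 ms.
const limiter = new RateLimiter(2, 1000);
console.log(limiter.tryAcquire(0));    // true
console.log(limiter.tryAcquire(10));   // true
console.log(limiter.tryAcquire(20));   // false (window full)
console.log(limiter.tryAcquire(1100)); // true (window has slid past 0 and 10)
```

A real controller would typically queue or delay requests instead of rejecting them outright, but the windowing logic above is the core idea.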
This reverts commit e77dce4.
* chore(docs): fix grammar
* chore: fix lint
* chore(grammar): fix readme
* chore: minor readme fix
Implement Lares Smart Home Assistant Using llmclient (ax-llm/ax#23)
feat: added prefix Ax across type namespace
Updated README
Update README.md
feat: multi-modal dsp, use images in input fields
Update README.md
feat: added multi-modal support to anthropic api and other fixes
refactor: change axAI function to AxAI class
[breaking change]: is now , added claude 3.5 sonnet and other fixes
feat: ai balancer to pick the cheapest llm with automatic fallback
chore: start of official releases
chore(release): 9.1.0
Create npm-publish.yml
chore: testing npm publish from github
Update npm-publish.yml
Update package.json
fix: openai and anthropic function calling issues
fix: google gemini function calling fix
fix: cohere and gemini function calling
chore: release stuff
chore: release 9.0.8
fix: build issue
fix: added default ratelimiter to groq
fix: build issue
fix: several issues with agents
Update README.md
Update README.md
fix: build fixes
fix: build issue
fix: issue with function results
fix: Export hugging face data Loader and fix summarize example (ax-llm/ax#32)
docs: ax presentation
fix: input fields of type image rejected with error "Image type is not supported in output fields." (ax-llm/ax#33, ax-llm/ax#34)
chore: version upgrade
fix: issues with prompt tuning
fix: build issue
feat: gemma 2
feat: gemini code execution added
Fix: Remove casting to a string value in the loader (ax-llm/ax#37)
Create static.yml
docs: new website
docs: new website
docs: site update
docs: site update
docs: more updates
docs: more updates
Add monorepo (ax-llm/ax#39)
chore: cleanup
feat: add multi-format build with CJS build for compatibility with Node require() (ax-llm/ax#40)
fix: removed publish from examples
fix: minor fixes
fix: increased tests timeout
Ava test timeout and "prepare" script changes (ax-llm/ax#42)
Test improvements - Fix Ava runs on CI (ax-llm/ax#43)
Release process improvements (ax-llm/ax#44)
chore: release v9.0.20
fix: Accessing Stream Chunks (Streamed generation) (ax-llm/ax#36)
chore: release v9.0.21
fix: release files (ax-llm/ax#45)
fix: minor fix
chore: release v9.0.22
Fix node imports (ax-llm/ax#48)
docs: design update
chore: release v9.0.23
docs: design update
docs: design update
docs: design update
docs: design update
docs: updated
docs: update
docs: update
fix: with and without signature base program classes
chore: release v9.0.24
fix: refactor functions
chore: release v9.0.25
feat: docker sandbox function
chore: release v9.0.26
docs: updated
fix: model map issue
chore: release v9.0.27
Auto-add release changelog after the release (ax-llm/ax#49)
fix: issue with model map feature
chore: release v9.0.28
fix: redesigned model map feature
chore: release v9.0.29
fix: more fixes related to model mapping
chore: release v9.0.30
docs: how to use in production
docs: fix links in apidocs
chore: release v9.0.31
docs: minor link fix
fix: corrected ollama embeddings api endpoint (ax-llm/ax#51)
feat: added new models for mistral and openai
chore: release v9.0.32
Update README.md
Fix: Add terms to spellcheck dictionary (ax-llm/ax#54)
Fix: Convert Ollama API files from JS to TS (Enhance Ollama API integration with improved error handling and streaming ax-llm/ax#52)
Revert "Fix: Convert Ollama API files from JS to TS (Enhance Ollama API integration with improved error handling and streaming ax-llm/ax#52)"
feat: new ai-sdk-provider
chore: release v9.0.33
fix: package.json for publishing
chore: release v9.0.34
chore(docs): fix grammar (ax-llm/ax#55)
fix: build issues
chore: release v9.0.35
fix: spelling
chore: release v9.0.36
fix: streaming fix in ai sdk provider
chore: release v9.0.37
feat: added a ai sdk agent provider
chore: release v9.0.38
fix: ai sdk agent provider update
chore: release v9.0.39
fix: automatic zod schema creation for ai sdk provider tools
chore: release v9.0.40
fix: ax ai provider
chore: release v9.0.41
fix: updates to ai sdk provider
chore: release v9.0.42
fix: updates to the ai sdk provider
chore: release v9.0.43
fix: updates to the ai sdk provider
fix: updates to the ai sdk provider
chore: release v9.0.44
feat: added reka models
chore: release v9.0.45
Fix: Bind agent functions to instances for correct 'this' context
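The 'this'-binding fix in the last commit addresses a classic JavaScript pitfall: passing an object's method around as a bare function loses its instance. A minimal sketch of the bug class (the names here are hypothetical, not the actual ax agent API):

```typescript
// Hypothetical illustration, not the ax agent API: extracting a method
// from an instance detaches it from `this` unless it is bound.
class Agent {
  constructor(private name: string) {}

  greet(): string {
    return `agent:${this.name}`;
  }
}

const agent = new Agent("lares");
const unbound = agent.greet;           // `this` is lost here
const bound = agent.greet.bind(agent); // binding restores the instance

console.log(bound()); // "agent:lares"
// Calling unbound() would throw at runtime: `this` is undefined
// when the function is invoked without its instance.
```

Binding each function to its instance when registering it (or declaring methods as arrow-function properties) prevents this class of bug.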
What kind of change does this PR introduce? (Bug fix, feature, docs update, ...)
What is the current behavior? (You can also link to an open issue here)
What is the new behavior (if this is a feature change)?
Other information: