LLamaIndex.ts support #923
@jaredpalmer @MaxLeiter @lgrammel Any updates? Will you add support for LlamaIndex.ts in the Vercel AI SDK?
FYI, this library implements vercel ai + llamaindex.ts
create-llama uses an older version of the ai package. Vercel now recommends using 'ai/rsc'.
Bump. Any updates on this?
Hi @meera, one interesting question: should we use ai/rsc or the ai UI?
Did anyone manage to figure this out? There seems to be some support for Next.js already: https://github.com/run-llama/LlamaIndexTS/tree/main?tab=readme-ov-file#react-server-component-nextjs-waku-redwoodjs but I haven't tried it out. Is it working with ai/rsc?
Have you tried it, @namanyayg?
The TS version of LlamaIndex has basically no data loaders or similar community modules, so I found it quite useless (I wanted Slack and Jira loaders). I abandoned it and moved to the Python version.
I was reluctant to switch to Python as well.
llamaindex-ts maintainer here. First, we have the create-llama starter template by @marcusschiesser for an easy start, supporting both Python and JS/TS. The TS package does lack some modules compared with the Python one, but we are working on that. We do support React RSC (I'm also a maintainer of waku), but I'm still working on some code refactoring (adding Python-parity modules, reducing unused imports, improving the JS bundler) to make sure it runs well. We don't have a code example for that yet, but I think you can follow the Vercel RSC example and use llamaindex the same way.
bump |
@nico-herrera, you can find the latest code for LlamaIndexTS streaming here: https://github.com/run-llama/create-llama/blob/main/templates/components/llamaindex/typescript/streaming/stream.ts - it supports the newest vercel/ai version 3.3.38. @jaredpalmer @MaxLeiter @lgrammel Some time ago, I sent a PR for adding LlamaIndex streaming: #699 - happy to update it if you're interested. |
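The core of a streaming adapter like the one linked above is converting the async iterable of text chunks that a LlamaIndex chat engine yields into a Web `ReadableStream` the Vercel AI SDK can consume. A minimal, self-contained sketch of that conversion (the function name `toReadableStream` is illustrative, not the actual API of either library):

```typescript
// Hedged sketch: wrap any AsyncIterable<string> (e.g. chunks from a
// LlamaIndex chat engine stream) in a Web ReadableStream. This is the
// general pattern, not the exact adapter code from create-llama.
function toReadableStream(
  chunks: AsyncIterable<string>,
): ReadableStream<string> {
  const iterator = chunks[Symbol.asyncIterator]();
  return new ReadableStream<string>({
    // pull() is called whenever the consumer wants the next chunk.
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
    // Propagate consumer cancellation back to the source iterator.
    async cancel() {
      await iterator.return?.();
    },
  });
}
```

The resulting stream can then be handed to whatever response helper the SDK version you target expects; the pull-based underlying source means chunks are only drawn from the LLM stream as fast as the client reads them.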
@marcusschiesser having a llamaindex integration would be great. We have moved to an LLM interface protocol since then. Current possibilities for integration are:
I don't know enough about llamaindex to understand what would be best here.
Thanks @lgrammel, I'll have a look |
Thanks @marcusschiesser the llamaindex adapter will be part of |
Feature Description
Hey guys, when will the Vercel AI SDK support LlamaIndex.ts?
Use Case
RAG
Additional context
No response