
LLamaIndex.ts support #923

Closed

jsm69 opened this issue Jan 17, 2024 · 16 comments

Labels: ai/provider, enhancement (New feature or request)

Comments

@jsm69 commented Jan 17, 2024

Feature Description

Hey guys, when will the Vercel AI SDK support LlamaIndex.ts?

Use Case

RAG

Additional context

No response

@Iven2132

@jaredpalmer @MaxLeiter @lgrammel Any updates? Will you add support for LlamaIndex.ts in the Vercel AI SDK?

@llermaly

FYI, this library combines the Vercel AI SDK with LlamaIndex.ts:

https://www.npmjs.com/package/create-llama

@meera commented Mar 21, 2024

create-llama uses the older ai package. Vercel now recommends using 'ai/rsc'.

@lgrammel added the enhancement (New feature or request) and ai/provider labels on May 30, 2024
@kuldeepdaftary

Bump. Any updates on this?

@JokeJason

> create-llama uses the older ai package. Vercel now recommends using 'ai/rsc'.

Hi @meera, one interesting question: should we use ai/rsc or the ai UI hooks?

@namanyayg

Did anyone manage to figure this out?

There seems to be some support with Next.js already: https://github.com/run-llama/LlamaIndexTS/tree/main?tab=readme-ov-file#react-server-component-nextjs-waku-redwoodjs

But I haven't tried it out. Is it working with ai/rsc?

@wakeupmh

> There seems to be some support with Next.js already ... But I haven't tried it out. Is it working with ai/rsc?

Have you tried it, @namanyayg?

@namanyayg

> Have you tried it, @namanyayg?

The TS version of LlamaIndex has basically no data loaders or similar community modules, so I found it quite useless (I wanted Slack and Jira loaders).

I abandoned it and moved to the Python version.

@wakeupmh commented Aug 1, 2024

> The TS version of LlamaIndex has basically no data loaders or similar community modules ... I abandoned it and moved to the Python version.

I was reluctant to go to Python as well.

@himself65 (Contributor)

LlamaIndex.TS maintainer here.

First, we have the create-llama starter template for an easy start, supporting both Python and JS/TS, by @marcusschiesser.

For the TS package we do lack some modules compared with the Python one, but we are working on that. We do support React RSC (I'm also a maintainer of waku), but I'm still working on some refactoring (adding Python-equivalent modules, reducing unused imports, improving JS bundling) to make sure it runs well. We don't have a code example for that yet, but I think you can follow the Vercel RSC example and do the same thing with LlamaIndex.

@nico-herrera

bump

@marcusschiesser (Contributor)

@nico-herrera, you can find the latest code for LlamaIndexTS streaming here: https://github.com/run-llama/create-llama/blob/main/templates/components/llamaindex/typescript/streaming/stream.ts - it supports the newest vercel/ai version, 3.3.38.

@jaredpalmer @MaxLeiter @lgrammel Some time ago, I sent a PR adding LlamaIndex streaming: #699 - happy to update it if you're interested.

@lgrammel (Collaborator)

@marcusschiesser having a LlamaIndex integration would be great. We have moved to an LLM interface protocol since then. Current possibilities for integration are:

I don't know enough about LlamaIndex to understand what would be best here.
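For readers landing on this thread later: the integration being discussed is essentially a streaming bridge, turning LlamaIndex.TS's async-iterable chat stream (chunks with a `delta` field) into the web `ReadableStream` that an HTTP response body needs. The sketch below is a self-contained illustration of that bridging idea only, not the AI SDK's or LlamaIndex's actual implementation; all names (`ChatDelta`, `toReadableStream`, `fakeChatStream`) are hypothetical.

```typescript
// Hypothetical sketch: bridge an async iterable of text deltas
// (LlamaIndex-style chat stream chunks) into a web ReadableStream.

type ChatDelta = { delta: string };

function toReadableStream(
  stream: AsyncIterable<ChatDelta>,
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      // Pull each delta from the chat stream and forward it as bytes.
      for await (const chunk of stream) {
        controller.enqueue(encoder.encode(chunk.delta));
      }
      controller.close();
    },
  });
}

// Usage: a fake chat stream standing in for a LlamaIndex chat engine.
async function* fakeChatStream(): AsyncIterable<ChatDelta> {
  for (const delta of ['Hello', ', ', 'world']) yield { delta };
}

async function main() {
  const text = await new Response(toReadableStream(fakeChatStream())).text();
  console.log(text); // prints "Hello, world"
}
main();
```

An adapter living in either package would wrap this kind of conversion and additionally encode the deltas in the AI SDK's wire format, which is what the PRs referenced in this thread work out.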

@marcusschiesser (Contributor)

Thanks @lgrammel, I'll have a look.

@marcusschiesser (Contributor)

@lgrammel I started a PR, please have a look: #3064

@lgrammel (Collaborator)

Thanks @marcusschiesser, the LlamaIndex adapter will be part of [email protected]
