Support loading data into archival with Llama Index connectors #146
Conversation
@sarahwooders are there example commands w/ dummy data I can test this with (e.g., to reproduce the runtime error on my end)?
It's not fully integrated into the CLI yet, but you can run the tests:
Also make sure …, and just let me know if it seems OK stylistically before I refactor the CLI/config.
@sarahwooders tests passed on my end, but I'm getting `ModuleNotFoundError: No module named 'llama_index'` when testing backcompat with …. Should we add llama_index as a default dependency?
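One common way to handle this kind of optional dependency is to check for the package before the feature runs and fail with an install hint. This is only a sketch of that pattern; the helper name and message are assumptions, not code from this PR:

```python
import importlib.util

def check_optional_dependency(name: str, hint: str) -> None:
    """Raise ImportError with an install hint if `name` is not importable."""
    # find_spec returns None (without importing) when the package is absent.
    if importlib.util.find_spec(name) is None:
        raise ImportError(f"{name} is required for this feature; {hint}")

# Hypothetical usage guarding the archival-loading path:
# check_optional_dependency("llama_index", "install it with `pip install llama-index`")
```

This keeps llama_index out of the default dependency set while still giving users a clear error instead of a bare `ModuleNotFoundError`.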
Did you run …?
Alright LGTM, backcompat looks fine. Will merge when you're back. |
LGTM 👩‍🍳
Support loading data into archival with Llama Index connectors
Llama Index has implemented a variety of connectors for reading, parsing, and embedding data. This pull request integrates with Llama Index to allow users to ingest more data sources into archival memory.
Data can be loaded into archival with the following command, where the supported connectors are currently `directory` and `webpage`.

Remaining to-dos before merging:
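As a rough illustration of the connector dispatch described above: the connector names (`directory`, `webpage`) come from this PR, but the `load_documents` helper, the guarded import, and the specific reader classes are assumptions (Llama Index reader import paths also vary by version), not the PR's actual code:

```python
import importlib.util

# Connectors mentioned in this PR; the dispatch below is a hypothetical sketch.
SUPPORTED_CONNECTORS = {"directory", "webpage"}

def load_documents(connector: str, source: str) -> list:
    """Load raw document texts from `source` via a Llama Index reader.

    Requires the optional llama_index package; fails early with a clear
    message if it is missing.
    """
    if connector not in SUPPORTED_CONNECTORS:
        raise ValueError(f"unsupported connector: {connector!r}")
    if importlib.util.find_spec("llama_index") is None:
        raise ImportError("llama_index is required: pip install llama-index")

    if connector == "directory":
        from llama_index import SimpleDirectoryReader  # reads local files
        docs = SimpleDirectoryReader(source).load_data()
    else:  # "webpage"
        from llama_index import SimpleWebPageReader  # fetches a URL
        docs = SimpleWebPageReader().load_data([source])
    return [d.text for d in docs]
```

The document texts returned here would then be chunked, embedded, and inserted into archival memory by the existing pipeline.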