Error: Cannot initialize VectorStoreIndex without nodes or indexStruct #331

Open
nstylianides opened this issue Sep 27, 2024 · 2 comments

Using the latest create-llama project with Next.js, with a PDF in the data folder.
We run npm run generate, but index_store.json is not created.
We then run npm run dev and, once we try a chat, we get this error.
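
For reference, a minimal sketch of the kind of script npm run generate is expected to run in this template, assuming the default file-system storage and the conventional ./data and ./cache directories (the names below are illustrative assumptions, not copied from the generated project):

  // Illustrative generate script (paths and option names are assumptions).
  import {
    SimpleDirectoryReader,
    storageContextFromDefaults,
    VectorStoreIndex,
  } from "llamaindex";

  async function generate() {
    // Read every file found under ./data (the PDF in this case).
    const documents = await new SimpleDirectoryReader().loadData({
      directoryPath: "./data",
    });

    // Persist the docstore, vector store, and index store as JSON under
    // ./cache; this is the step that should write index_store.json.
    const storageContext = await storageContextFromDefaults({
      persistDir: "./cache",
    });

    await VectorStoreIndex.fromDocuments(documents, { storageContext });
  }

  generate().catch(console.error);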

leehuwuj (Collaborator) commented Oct 1, 2024

Hi @nstylianides, what other options did you choose when creating the llama app? I tried the NextJS template but could not reproduce your issue:

❯ npx create-llama@latest
✔ What is your project named? … my-app
✔ Which template would you like to use? › Agentic RAG (e.g. chat with docs)
✔ Which framework would you like to use? › NextJS
✔ Would you like to set up observability? › No
✔ Please provide your OpenAI API key (leave blank to skip): …
✔ Which data source would you like to use? › Use an example PDF
✔ Would you like to add another data source? › No
✔ Would you like to use LlamaParse (improved parser for RAG - requires API key)? … no / yes
✔ Would you like to use a vector database? › No, just store the data in the file system
✔ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter ›
✔ How would you like to proceed? › Generate code, install dependencies, and run the app (~2 min)

If you don't have index_store.json in the cache folder, then I guess you chose the "No data source" option. Please make sure you have a file in the data folder.
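
For context, the error in the issue title is what appears at chat time when the index cannot be restored from storage. Roughly (again only a sketch, assuming the default ./cache persist directory rather than the template's exact code):

  import { storageContextFromDefaults, VectorStoreIndex } from "llamaindex";

  async function getIndex() {
    const storageContext = await storageContextFromDefaults({
      persistDir: "./cache",
    });
    // If ./cache holds no persisted nodes or index struct (i.e. npm run
    // generate never wrote index_store.json), this call throws
    // "Cannot initialize VectorStoreIndex without nodes or indexStruct".
    return await VectorStoreIndex.init({ storageContext });
  }

So an empty or missing cache folder at chat time produces exactly the error reported above.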

nstylianides (Author) commented:

Hi there. I have resolved the issue by using a WSL environment on my Windows machine.

When using LlamaIndex on Windows, I have concluded the following:

  1. The Python example works on Windows.
  2. LlamaIndex.TS on Windows always fails to create index_store.json.
  3. LlamaIndex.TS under WSL on Windows works like a charm.

I am not sure what the issue is, since error reporting is either missing or confusing; e.g. "contextWindow" could easily be confused with browser context windows.

THANK YOU FOR THIS LIBRARY; it can be a Swiss Army knife in every dev toolset. But, IMHO, it needs much better documentation, an API reference, and case-based how-tos.

Is there an official course on how to better utilize the LlamaIndex library?
