An index is a software building block; it becomes a database when wrapped in a data management system. We will see more and more traditional databases adding a vector-search index: for instance, pgvector turns PostgreSQL into a vector database.
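As a small illustration of the "index wrapped in a database" idea, here is a hedged sketch of what pgvector usage looks like from Python with psycopg2. It is not taken from the llm-app repo; the connection string, table name, and toy 3-dimensional embeddings are made up for illustration:

```python
import psycopg2

# Hypothetical connection string; adjust to your setup.
conn = psycopg2.connect("dbname=example user=postgres")
cur = conn.cursor()

# pgvector adds a `vector` column type and distance operators to PostgreSQL,
# so the database itself now hosts the vector-search index.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute(
    "CREATE TABLE IF NOT EXISTS docs ("
    "  id bigserial PRIMARY KEY,"
    "  body text,"
    "  embedding vector(3)"  # toy 3-dimensional embeddings for illustration
    ");"
)
cur.execute(
    "INSERT INTO docs (body, embedding) VALUES (%s, %s::vector);",
    ("hello world", "[0.1, 0.2, 0.3]"),
)

# Nearest-neighbor query: `<->` is pgvector's L2-distance operator.
cur.execute(
    "SELECT body FROM docs ORDER BY embedding <-> %s::vector LIMIT 5;",
    ("[0.1, 0.2, 0.25]",),
)
print(cur.fetchall())
conn.commit()
```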
The LLM App is meant to be self-sufficient and takes a "batteries included" approach to system development - rather than combining several separate applications (databases, orchestrators, ETL pipelines) into a large deployment, it combines software components such as connectors and indexes into a single app that can be deployed directly with no extra dependencies.
Such an approach should make deployments easier (there are fewer moving parts to monitor and service), while also being more hackable - e.g. adding some extra logic on top of nearest-neighbor retrieval takes only a few statements of code (see the sketch after the links below).
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the simplest contextless app
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the default app that builds a reactive index of context documents
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the contextful app reading data from s3
- https://github.com/pathwaycom/llm-app/blob/main/llm_app/path... for the app using locally available models
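To make the "single app, hackable retrieval" point concrete, here is a rough sketch of the shape of such a pipeline. It is not the actual llm-app or Pathway API: the helpers (read_documents, embed, answer_with_llm), the toy corpus, and the recency filter are all hypothetical, made up purely for illustration:

```python
import numpy as np

def read_documents():
    # Toy in-memory corpus standing in for a real connector (filesystem, S3, ...).
    return [
        ("Pathway is a data processing framework.", {"year": 2023}),
        ("PostgreSQL is a relational database.", {"year": 2020}),
    ]

def embed(text: str) -> np.ndarray:
    # Toy deterministic "embedding": a normalized bag-of-characters histogram.
    vec = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1
    return vec / (np.linalg.norm(vec) or 1.0)

def answer_with_llm(prompt: str) -> str:
    # Placeholder for an actual LLM call (hosted API or local model).
    return f"(LLM answer for prompt of length {len(prompt)})"

# Everything lives in one process: the "index" is just data kept in memory
# next to the rest of the app, with no separate database service to deploy.
docs = read_documents()
doc_vectors = np.stack([embed(text) for text, _meta in docs])

def retrieve(query: str, k: int = 3, min_year: int = 2022):
    query_vec = embed(query)
    # Plain nearest-neighbor scoring...
    scores = doc_vectors @ query_vec
    ranked = sorted(zip(scores, docs), key=lambda pair: -pair[0])
    # ...plus the "few extra statements" of custom logic: here, a made-up
    # filter that keeps only sufficiently recent documents.
    return [text for _score, (text, meta) in ranked
            if meta.get("year", 0) >= min_year][:k]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    return answer_with_llm(f"Context:\n{context}\n\nQuestion: {query}")

print(answer("Which framework processes data?"))
```

The point of the sketch is the shape, not the specifics: the connector, index, retrieval tweak, and LLM call are ordinary code in one process, so changing the retrieval behavior means editing a few lines rather than reconfiguring a separate service.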