Embedded vector store for local-first AI applications
A single-file vector database written in Rust. Dense + sparse hybrid search, HNSW indexing, transactions, and crash-safe persistence — all in a portable .vdb file.
- **Hybrid search** — dense vectors with HNSW plus sparse BM25 keyword retrieval, fused with a linear combination or RRF.
- **Single file** — everything lives in one portable `.vdb` file, with a crash-safe WAL, file locking, snapshots, and backups.
- **Bulk ingest** — import large datasets efficiently with deferred index rebuilds via `bulk_ingest()` (Python) and `bulkIngest()` (Node.js).
- **Metadata filters** — MongoDB-style operators (`$eq`, `$gt`, `$in`, `$contains`, `$exists`) with nested dot-path access.
- **Transactions** — atomic batched writes with rollback; context-manager support in Python, try/catch in Node.js.
- **Bindings** — native Rust core with Python (PyO3) and Node.js (napi-rs) bindings.
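The RRF option mentioned above (reciprocal rank fusion) can be sketched in a few lines: each document is scored by the sum of `1 / (k + rank)` over the ranked lists it appears in, so documents ranked well by both the dense and the sparse retriever float to the top. This is a standalone illustration of the technique, not vectlite's internal code; the function name and the sample rankings are invented for the example.

```python
def rrf_fuse(dense_ids, sparse_ids, k=60):
    """Fuse two ranked ID lists with Reciprocal Rank Fusion.

    Each document scores sum(1 / (k + rank)) over the lists it
    appears in; k=60 is the commonly used damping constant.
    """
    scores = {}
    for ranking in (dense_ids, sparse_ids):
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Sort IDs by fused score, best first
    return sorted(scores, key=scores.get, reverse=True)

dense = ["doc2", "doc1", "doc3"]   # ranked by cosine similarity
sparse = ["doc1", "doc4", "doc2"]  # ranked by BM25
print(rrf_fuse(dense, sparse))     # doc1 and doc2 lead: both lists rank them
```

Unlike a linear combination, RRF needs no score normalization across the two retrievers, which is why it is a popular default for hybrid search.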
```python
import vectlite

# embedding, embedding2, and query_embedding are 384-dim float vectors
db = vectlite.open("knowledge.vdb", dimension=384)
db.upsert("doc1", embedding, {"source": "blog", "title": "Auth Guide"})
db.upsert("doc2", embedding2, {"source": "notes", "title": "Billing"})

results = db.search(query_embedding, k=5, filter={"source": "blog"})
for r in results:
    print(r["id"], r["score"])

# For large imports, prefer bulk_ingest() over per-record upserts
records = [
    {"id": f"doc{i}", "vector": embeddings[i], "metadata": {"source": "corpus"}}
    for i in range(len(embeddings))
]
db.bulk_ingest(records, batch_size=5000)
db.compact()
```
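The `filter=` argument in the quickstart uses the MongoDB-style operators listed above. A small standalone matcher shows their semantics, including nested dot-path access; this is a concept sketch for illustration, not vectlite's internal implementation, and the helper names are invented.

```python
def get_path(doc, path):
    """Resolve a dot path like "stats.views" against nested dicts."""
    for key in path.split("."):
        if not isinstance(doc, dict) or key not in doc:
            return None, False
        doc = doc[key]
    return doc, True

def matches(doc, filter):
    """Return True if doc satisfies every {path: condition} clause."""
    for path, cond in filter.items():
        value, found = get_path(doc, path)
        if not isinstance(cond, dict):  # bare value is shorthand for $eq
            cond = {"$eq": cond}
        for op, arg in cond.items():
            if op == "$exists":
                ok = found == arg
            elif not found:
                ok = False
            elif op == "$eq":
                ok = value == arg
            elif op == "$gt":
                ok = value > arg
            elif op == "$in":
                ok = value in arg
            elif op == "$contains":
                ok = arg in value
            else:
                raise ValueError(f"unknown operator: {op}")
            if not ok:
                return False
    return True

doc = {"source": "blog", "stats": {"views": 120}, "tags": ["auth", "security"]}
print(matches(doc, {"source": "blog", "stats.views": {"$gt": 100}}))  # True
print(matches(doc, {"tags": {"$contains": "auth"}}))                  # True
```

Note the shorthand: `{"source": "blog"}` behaves like `{"source": {"$eq": "blog"}}`, matching the quickstart's `filter={"source": "blog"}`.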
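The transactional guarantee — atomic batched writes with rollback — can be illustrated with a snapshot-based context manager over a toy in-memory store. This is a concept sketch only: the `TinyStore` class is invented for the example, and a real engine like the one described here would use a WAL rather than deep-copying state.

```python
import copy
from contextlib import contextmanager

class TinyStore:
    """Minimal key-value store with snapshot-based transactions."""

    def __init__(self):
        self.data = {}

    @contextmanager
    def transaction(self):
        snapshot = copy.deepcopy(self.data)  # cheap here; real engines log to a WAL
        try:
            yield self
        except Exception:
            self.data = snapshot             # roll back the whole batch on any error
            raise

store = TinyStore()
try:
    with store.transaction():
        store.data["doc1"] = {"vector": [0.1, 0.2]}
        raise RuntimeError("simulated failure mid-batch")
except RuntimeError:
    pass
print(store.data)  # {} — the partial write was rolled back
```

The same pattern maps onto the Python binding's context-manager support: exiting the block normally commits, and an exception discards every write in the batch.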