Infinite Memory for AI.
Locally.

A local-first vector memory engine for Python. No API keys. No cloud bills. Just high-performance, offline RAG for your LLM agents.

Read The Docs

Data Flow Architecture

User Query → L1 Cache (O(1) hash)
  ├─ Hot path (cache hit): answer returned directly to the LLM Agent
  └─ Cold path (cache miss): Vector DB (Chroma) search → LLM Agent
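The two paths above reduce to a single dispatch routine. A minimal sketch, assuming hypothetical `cache` and `vector_db` objects with `get`/`put` and `search` methods (these stand-ins are illustrative, not MemLoop's actual internals):

```python
def answer(query, cache, vector_db):
    """Serve from the L1 cache when possible (hot path);
    otherwise run a vector search and cache the result (cold path)."""
    hit = cache.get(query)
    if hit is not None:
        return hit  # hot path: O(1) lookup, no embedding work
    result = vector_db.search(query)  # cold path: semantic search
    cache.put(query, result)  # warm the cache for next time
    return result
```

The point of the split is that the expensive embedding and nearest-neighbor work only happens on a miss; every repeat of the same question short-circuits at the first branch.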

System Capabilities

⚡

O(1) Semantic Cache

Why search vectors twice? MemLoop hashes each query and intercepts repeated questions instantly, cutting latency by up to 99% on recurring topics.
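One way such O(1) interception could work is a normalized-hash cache. A sketch only, assuming exact-match semantics after whitespace/case normalization (the `SemanticCache` class and its normalization rule are illustrative, not MemLoop's actual implementation):

```python
import hashlib

class SemanticCache:
    """Exact-match query cache: O(1) dict lookup keyed by a hash
    of the normalized query string."""

    def __init__(self):
        self._store = {}

    def _key(self, query: str) -> str:
        # Normalize so "What is RAG?" and "  what is RAG? " collide.
        normalized = " ".join(query.lower().split())
        return hashlib.sha256(normalized.encode()).hexdigest()

    def get(self, query):
        return self._store.get(self._key(query))  # None on a miss

    def put(self, query, answer):
        self._store[self._key(query)] = answer
```

A real semantic cache might additionally match paraphrases via embedding similarity; a hash cache only catches literal repeats, which is what keeps it O(1).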

🔒

100% Offline

Your data never leaves localhost. We use ChromaDB and lightweight SentenceTransformers that run on your CPU. Perfect for sensitive contracts, medical data, or PII.

📂

Universal Ingestion

Point to a folder of .pdf, .csv, or .txt files. MemLoop handles the ETL pipeline automatically.
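Extension-based dispatch like this can be sketched with `pathlib`. The loader functions below are hypothetical stand-ins for real parsers (e.g. a PDF or CSV library), not MemLoop's actual pipeline:

```python
from pathlib import Path

# Hypothetical loaders: stand-ins for real PDF/CSV parsers.
def load_pdf(path): return f"[pdf] {path.name}"
def load_csv(path): return f"[csv] {path.name}"
def load_txt(path): return path.read_text()

LOADERS = {".pdf": load_pdf, ".csv": load_csv, ".txt": load_txt}

def ingest_folder(folder):
    """Walk a folder recursively and load every supported file,
    silently skipping unknown extensions."""
    docs = []
    for path in sorted(Path(folder).rglob("*")):
        loader = LOADERS.get(path.suffix.lower())
        if loader and path.is_file():
            docs.append(loader(path))
    return docs
```

Mapping suffix to loader keeps adding a new format to a one-line change in `LOADERS`.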

🔖

Page-Level Citations

Hallucination killer. Every retrieval comes with source metadata: {"source": "manual.pdf", "page": 42}.
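Rendering that metadata into the returned context might look like the following. The `format_with_citations` helper and the result shape are illustrative assumptions, not MemLoop's actual output format:

```python
def format_with_citations(chunks):
    """Append each chunk's source metadata so the LLM (and the user)
    can trace every claim back to a file and page."""
    lines = []
    for chunk in chunks:
        meta = chunk["metadata"]
        lines.append(f"{chunk['text']} (source: {meta['source']}, p. {meta['page']})")
    return "\n".join(lines)

hits = [{"text": "Torque limits are listed in section 4.",
         "metadata": {"source": "manual.pdf", "page": 42}}]
print(format_with_citations(hits))
# → Torque limits are listed in section 4. (source: manual.pdf, p. 42)
```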

Developer Documentation

Install and run your first agent in under 30 seconds.

# 1. Install via pip
pip install memloop

# 2. Create a python script
from memloop import MemLoop

brain = MemLoop()
print("Ingesting Knowledge...")

# Learn from the web
brain.learn_url("https://docs.python.org/3/")

# Ask a question
answer = brain.recall("How do decorators work?")
print(answer)

The main interface for integrating with LLMs.

class MemLoop:
    def __init__(self, db_path="./memloop_data"):
        """Initialize local vector store"""
        pass

    def learn_local(self, folder_path):
        """Ingest PDFs, CSVs, and TXT files recursively"""
        pass

    def recall(self, query, n_results=3):
        """Returns context string with citations"""
        pass
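To see the interface shape in action, here is a toy in-memory stand-in that ranks documents with bag-of-words cosine similarity instead of real embeddings. `MiniMemLoop` is an illustration of the `learn`/`recall` contract only, not the real engine:

```python
import math
from collections import Counter

class MiniMemLoop:
    """Toy MemLoop-shaped store: word-count vectors and cosine
    similarity stand in for embeddings and a vector DB."""

    def __init__(self):
        self._docs = []  # list of (text, metadata) pairs

    def learn(self, text, metadata=None):
        self._docs.append((text, metadata or {}))

    @staticmethod
    def _vec(text):
        return Counter(text.lower().split())

    @staticmethod
    def _cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def recall(self, query, n_results=3):
        """Return the top-matching docs as a context string with metadata."""
        q = self._vec(query)
        ranked = sorted(self._docs,
                        key=lambda d: self._cosine(q, self._vec(d[0])),
                        reverse=True)
        return "\n".join(f"{text} {meta}" for text, meta in ranked[:n_results])
```

Swapping `_vec` for a sentence-embedding model and `_docs` for a persistent vector store recovers the real architecture; the call pattern stays the same.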

Use the terminal for quick testing and data management.

$ memloop
[SYSTEM]: Initializing Neural Link...
> /learn https://en.wikipedia.org/wiki/Artificial_intelligence
[SYSTEM]: Absorbed 45 chunks.
> What is the history of AI?
[MEMLOOP]: "AI history began in antiquity..." (Source: Wikipedia)
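A REPL like the one above reduces to a small line-dispatch routine. A sketch under assumptions: `handle_line` and the `brain` protocol (`learn_url` returning a chunk count, `recall` returning text) are hypothetical, and the real CLI's internals may differ:

```python
def handle_line(line, brain):
    """Route one REPL line: '/learn <url>' ingests a source,
    anything else is treated as a recall query."""
    line = line.strip()
    if line.startswith("/learn "):
        url = line.split(" ", 1)[1]
        n = brain.learn_url(url)
        return f"[SYSTEM]: Absorbed {n} chunks."
    return f"[MEMLOOP]: {brain.recall(line)}"
```

Keeping command parsing separate from the input loop makes the CLI trivially testable: feed it strings, assert on the replies.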