Frequently Asked Questions

What is xMem?

xMem is a memory orchestrator for LLMs, combining long-term and session memory for smarter, more relevant AI responses.

Which LLMs are supported?

xMem works with any open-source LLM, such as Llama and Mistral. You can configure your preferred provider in the orchestrator setup.
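As a rough sketch of what provider configuration could look like (the class, field, and provider names below are illustrative assumptions, not the actual xMem API):

```python
# Hypothetical sketch: field and function names are assumptions
# for illustration, not xMem's documented configuration schema.
provider_config = {
    "provider": "ollama",                   # e.g. a local Llama/Mistral runner
    "model": "llama3",
    "endpoint": "http://localhost:11434",   # assumed local inference endpoint
}

def make_orchestrator(config: dict) -> dict:
    # Stand-in for real orchestrator construction: validate the
    # minimal fields and pair the LLM config with a session store.
    assert "provider" in config and "model" in config
    return {"llm": config, "sessions": {}}

orchestrator = make_orchestrator(provider_config)
```

Swapping providers would then be a matter of changing the config, with the orchestrator logic untouched.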

How do I store custom data?

Use the /api/memory endpoint to add, update, or delete memory items.
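For illustration, a request to that endpoint might be shaped like the sketch below. The field names (`action`, `id`, `text`, `metadata`) are assumptions, not the documented schema; check the API reference for the real payload format.

```python
import json

# Hypothetical payload for POST /api/memory; the field names are
# illustrative assumptions, not xMem's documented schema.
item = {
    "action": "add",
    "id": "note-42",
    "text": "User prefers concise answers.",
    "metadata": {"source": "faq-demo"},
}
body = json.dumps(item)

# The actual HTTP call (left unexecuted here) could then be, e.g.:
# requests.post("http://localhost:8000/api/memory", data=body,
#               headers={"Content-Type": "application/json"})
```

Updates and deletes would follow the same pattern with a different `action` (or HTTP method), depending on how the endpoint is actually specified.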

Is there a dashboard?

Yes! xMem includes a dashboard for monitoring, configuration, and memory management.

Can I use xMem with my own vector database?

Absolutely. xMem supports pluggable vector stores, including ChromaDB and others.
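To show what "pluggable" could mean in practice, here is a toy in-memory store behind a minimal interface; xMem's real abstraction may differ, and all class and method names here are assumptions:

```python
# Toy sketch of a pluggable vector store: any backend (ChromaDB or
# otherwise) exposing upsert/query could be dropped in behind the
# same interface. Names are illustrative, not xMem's actual API.
class InMemoryVectorStore:
    def __init__(self):
        self.items = {}

    def upsert(self, key, vector, payload):
        self.items[key] = (vector, payload)

    def query(self, vector, top_k=1):
        # Toy ranking by squared Euclidean distance; a real store
        # would typically use cosine similarity over an index.
        def dist(v):
            return sum((a - b) ** 2 for a, b in zip(v, vector))
        ranked = sorted(self.items.items(), key=lambda kv: dist(kv[1][0]))
        return [payload for _, (_, payload) in ranked[:top_k]]

store = InMemoryVectorStore()
store.upsert("a", [1.0, 0.0], "doc about Llama")
store.upsert("b", [0.0, 1.0], "doc about ChromaDB")
results = store.query([0.1, 0.9])  # nearest neighbour is "b"
```

A ChromaDB-backed implementation would keep the same interface but delegate `upsert`/`query` to the Chroma client, which is what makes the store swappable.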