MCP Memory Server

A Model Context Protocol (MCP) server that provides AI assistants with persistent memory capabilities using PostgreSQL and OpenAI vector embeddings.

Description

This server implements the Model Context Protocol to give AI assistants the ability to store, search, update, and delete memories using semantic vector search. Each AI context maintains completely isolated memory spaces for security and privacy.

Features

  • Persistent Memory: Store text memories with vector embeddings for semantic search
  • Vector Search: Find similar memories using OpenAI embeddings and PostgreSQL pgvector
  • Context Isolation: Each AI context has separate, secure memory spaces
  • CRUD Operations: Create, read, update, and delete memories
  • MCP Compliance: Standard Model Context Protocol implementation
  • Flexible Configuration: Support for different OpenAI models and endpoints

Prerequisites

  • Rust 1.75+
  • PostgreSQL with pgvector extension
  • OpenAI API key

Build

  1. Clone the repository:
git clone <repository-url>
cd mcp-memory
  2. Build the project:
cargo build --release

The executable will be created at target/release/mcp-memory.

Configuration

Environment Variables

Variable          Description                         Default
DATABASE_URL      PostgreSQL connection string        postgresql://localhost/mcp_memory
OPENAI_API_KEY    OpenAI API key (required)           -
BIND_ADDRESS      Server bind address                 127.0.0.1:3001
EMBEDDING_MODEL   OpenAI embedding model              text-embedding-3-small
EMBEDDING_SIZE    Vector dimension size               1536
OPENAI_BASE_URL   Custom OpenAI endpoint (optional)   -
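
For example, a local setup that overrides the bind address might look like this (the values are illustrative, not required):

export DATABASE_URL=postgresql://localhost/mcp_memory
export OPENAI_API_KEY=your-api-key-here
export BIND_ADDRESS=0.0.0.0:3001
# EMBEDDING_SIZE must match the output dimension of the chosen model;
# text-embedding-3-small produces 1536-dimensional vectors.
export EMBEDDING_MODEL=text-embedding-3-small
export EMBEDDING_SIZE=1536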

Database Setup

  1. Install PostgreSQL with pgvector extension
  2. Create database:
createdb mcp_memory

The server will automatically initialize the database and run migrations on first use.
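
If you prefer to prepare the database yourself (for example, when the server's database role cannot create extensions), you can enable pgvector manually; the extension that pgvector installs is named vector:

psql -d mcp_memory -c "CREATE EXTENSION IF NOT EXISTS vector;"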

Manual Migration

You can run migrations manually before starting the server:

./target/release/mcp-memory migrate

This is useful for:

  • Pre-deployment database setup
  • Controlled migration timing
  • Troubleshooting database issues
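
For example, a deployment script might run the migration as its own step before launching the service, so a failed migration stops the rollout (a sketch, assuming the migrate subcommand exits non-zero on failure):

set -e
./target/release/mcp-memory migrate
./target/release/mcp-memory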

Migration System

The server includes a robust migration system that:

  • Automatically initializes fresh databases
  • Applies schema updates when upgrading versions
  • Tracks migration history in a schema_version table
  • Supports both automatic and manual migration modes

Database schema changes are applied incrementally and safely.
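
To see which migrations have been applied, you can query the tracking table directly; SELECT * is used here to avoid assuming the table's column layout:

psql -d mcp_memory -c "SELECT * FROM schema_version;"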

Run

  1. Set required environment variables:
export DATABASE_URL=postgresql://localhost/mcp_memory
export OPENAI_API_KEY=your-api-key-here
  2. Start the server:
./target/release/mcp-memory

The server will be available at http://localhost:3001.

Health Checks

The server provides health check endpoints for monitoring:

  • GET /health - Overall system health and migration status
  • GET /ready - Readiness check (returns 200 only when fully ready)
  • GET /live - Liveness check (returns 200 if service is running)

These endpoints are useful for container orchestration and load balancers.
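
For example, to probe them from the command line:

curl -i http://localhost:3001/health
curl -i http://localhost:3001/ready
curl -i http://localhost:3001/live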

Quick Start with Docker

export OPENAI_API_KEY=your-api-key-here
docker-compose up -d
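
Once the stack is up, you can confirm the server answers its health check and, if something looks wrong, tail the logs:

curl -i http://localhost:3001/health
docker-compose logs -f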

Tools

The server provides four MCP tools:

  • store_memory - Store text with vector embedding
  • search_memory - Search memories by similarity
  • update_memory - Update existing memory by ID
  • delete_memory - Remove memory by ID
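
As an illustration, a tools/call request to store_memory over HTTP might look like the sketch below. The endpoint path (/) and the "content" argument name are assumptions, not part of this README; consult mcp-manifest.json for the exact tool schemas.

# Hypothetical request; path and argument names are assumptions.
curl -X POST http://localhost:3001/ \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/call", "params": {"name": "store_memory", "arguments": {"content": "User prefers dark mode"}}}'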

Integration

Use the provided mcp-manifest.json for MCP client configuration, or configure your client to connect to the server endpoints directly.
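
As a rough sketch, an HTTP-capable MCP client could be pointed at the server with a configuration file like the one below; the key names (mcpServers, url) vary between clients, so treat them as assumptions and check your client's documentation:

cat > mcp-client-config.json <<'EOF'
{
  "mcpServers": {
    "memory": { "url": "http://localhost:3001" }
  }
}
EOF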

License

ISC License