
CORE: Your Digital Brain - Memory + Actions for AI Tools

Add to Cursor · Deploy on Railway

Website · Docs · Discord


Your critical info is scattered across tools that don't talk to each other. Your AI conversation starts with "let me give you some context." Your experiences and learnings are still in your head and your head doesn't scale.

CORE remembers. Not a database. Not a search box. A digital brain that learns what matters, connects what's related, and surfaces what you need.

For Developers

CORE gives your AI tools persistent memory and the ability to act in the apps you use.

---

What You Can Do

1. Never repeat yourself: context flows automatically

CORE becomes your persistent memory layer for coding agents. Ask any AI tool to pull relevant context.

Search core memory for architecture decisions on the payment service
What are my content guidelines from core for creating the blog?


2. Take actions in your apps from Claude/Cursor

Connect your apps once, take actions from anywhere.



3. Pick up where you left off in Claude Code/Cursor

Switching back to a feature after a week? Get caught up instantly.

What did we discuss about the checkout flow? Summarize from memory.
Refer to past discussions and remind me where we left off on the API refactor


What Makes CORE Different

---

🚀 Quick Start

Choose your path:

| | CORE Cloud | Self-Host |
|---|---|---|
| Setup time | 5 minutes | 15 minutes |
| Best for | Try quickly, no infra | Full control, your servers |
| Requirements | Just an account | Docker, 4GB RAM |

Cloud

Self-Host

Quick Deploy

Deploy on Railway

Or with Docker

git clone https://github.com/RedPlanetHQ/core.git
cd core

# Set your OpenAI API key in the environment (or a .env file)
OPENAI_API_KEY=your_openai_api_key

docker-compose up -d

Once deployed, you can configure your AI providers (OpenAI, Anthropic) and start building your memory graph.
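For example, a minimal `.env` for the Docker setup might look like the sketch below. Everything beyond `OPENAI_API_KEY` is an assumption here; check the self-hosting guide for the authoritative variable names.

```
# Required: used by CORE for fact extraction
OPENAI_API_KEY=your_openai_api_key

# Optional (assumed name): set if you also want Anthropic models
ANTHROPIC_API_KEY=your_anthropic_api_key
```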

👉 View complete self-hosting guide

Note: We tried open-source models (e.g., GPT OSS via Ollama), but fact generation was not good enough. We are still figuring out how to improve this, and will add OSS model support once we do.

🛠️ Installation

Recommended

Install in Claude Code CLI

  • Run this command in your terminal to connect CORE with Claude Code:
claude mcp add --transport http --scope user core-memory https://mcp.getcore.me/api/v1/mcp?source=Claude-Code

  • Type /mcp and open core-memory MCP for authentication

Install in Cursor

Since Cursor 1.0, you can click the install button below for instant one-click installation.

Install MCP Server

OR

  • Go to: Settings -> Tools & Integrations -> Add Custom MCP
  • Enter the following in the mcp.json file:
{
  "mcpServers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=cursor",
      "headers": {}
    }
  }
}

Install in Claude Desktop

  • Copy CORE MCP URL:
https://mcp.getcore.me/api/v1/mcp?source=Claude

  • Navigate to Settings → Connectors → Click Add custom connector
  • Click on "Connect" and grant Claude permission to access CORE MCP

CLIs

Install in Codex CLI

Option 1 (Recommended): Add to your ~/.codex/config.toml file:

[features]
rmcp_client = true

[mcp_servers.memory]
url = "https://mcp.getcore.me/api/v1/mcp?source=codex"

Then run: codex mcp memory login

Option 2 (If Option 1 doesn't work): Add API key configuration:

[features]
rmcp_client = true

[mcp_servers.memory]
url = "https://mcp.getcore.me/api/v1/mcp?source=codex"
http_headers = { "Authorization" = "Bearer CORE_API_KEY" }

Get your API key from app.getcore.me → Settings → API Key, then run: codex mcp memory login

Install in Gemini CLI

See Gemini CLI Configuration for details.

  • Open the Gemini CLI settings file. The location is ~/.gemini/settings.json (where ~ is your home directory).
  • Add the following to the mcpServers object in your settings.json file:
{
  "mcpServers": {
    "corememory": {
      "httpUrl": "https://mcp.getcore.me/api/v1/mcp?source=geminicli",
      "timeout": 5000
    }
  }
}

If the mcpServers object does not exist, create it.

Install in Copilot CLI

Add the following to your ~/.copilot/mcp-config.json file:

{
  "mcpServers": {
    "core": {
      "type": "http",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Copilot-CLI",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

IDEs

Install in VS Code

Enter the following in the mcp.json file:

{
  "servers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Vscode",
      "type": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in VS Code Insiders

Add to your VS Code Insiders MCP config:

{
  "mcp": {
    "servers": {
      "core-memory": {
        "type": "http",
        "url": "https://mcp.getcore.me/api/v1/mcp?source=VSCode-Insiders",
        "headers": {
          "Authorization": "Bearer YOUR_API_KEY"
        }
      }
    }
  }
}

Install in Windsurf

Enter the following in the mcp_config.json file:

{
  "mcpServers": {
    "core-memory": {
      "serverUrl": "https://mcp.getcore.me/api/v1/mcp/source=windsurf",
      "headers": {
        "Authorization": "Bearer "
      }
    }
  }
}

Install in Zed

  • Go to Settings in Agent Panel -> Add Custom Server
  • Enter the code below in the configuration file and click on the Add server button
{
  "core-memory": {
    "command": "npx",
    "args": ["-y", "mcp-remote", "https://mcp.getcore.me/api/v1/mcp?source=Zed"]
  }
}

Coding Agents

Install in Amp

Run this command in your terminal:

amp mcp add core-memory https://mcp.getcore.me/api/v1/mcp?source=amp

Install in Augment Code

Add to your ~/.augment/settings.json file:

{
  "mcpServers": {
    "core-memory": {
      "type": "http",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=augment-code",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in Cline

  • Open Cline and click the hamburger menu icon (☰) to enter the MCP Servers section
  • Choose the Remote Servers tab and click the Edit Configuration button
  • Add the following to your Cline MCP configuration:
{
  "mcpServers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Cline",
      "type": "streamableHttp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in Kilo Code

  • Go to Settings → MCP Servers → Installed tab → click Edit Global MCP to edit your configuration.
  • Add the following to your MCP config file:
{
  "core-memory": {
    "type": "streamable-http",
    "url": "https://mcp.getcore.me/api/v1/mcp?source=Kilo-Code",
    "headers": {
      "Authorization": "Bearer your-token"
    }
  }
}

Install in Kiro

Add in Kiro → MCP Servers:

{
  "mcpServers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Kiro",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in Qwen Coder

See Qwen Coder MCP Configuration for details.

Add to ~/.qwen/settings.json:

{
  "mcpServers": {
    "core-memory": {
      "httpUrl": "https://mcp.getcore.me/api/v1/mcp?source=Qwen",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY",
        "Accept": "application/json, text/event-stream"
      }
    }
  }
}

Install in Roo Code

Add to your Roo Code MCP configuration:

{
  "mcpServers": {
    "core-memory": {
      "type": "streamable-http",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Roo-Code",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in Opencode

Add to your Opencode configuration:

{
  "mcp": {
    "core-memory": {
      "type": "remote",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Opencode",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      },
      "enabled": true
    }
  }
}

Install in Copilot Coding Agent

Add to Repository Settings → Copilot → Coding agent → MCP configuration:

{
  "mcpServers": {
    "core": {
      "type": "http",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Copilot-Agent",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Install in Qodo Gen

  • Open the Qodo Gen chat panel in VSCode or IntelliJ
  • Click Connect more tools, then click + Add new MCP
  • Add the following configuration:
{
  "mcpServers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Qodo-Gen"
    }
  }
}

Terminals

Install in Warp

Add in Settings → AI → Manage MCP servers:

{
  "core": {
    "url": "https://mcp.getcore.me/api/v1/mcp?source=Warp",
    "headers": {
      "Authorization": "Bearer YOUR_API_KEY"
    }
  }
}

Install in Crush

Add to your Crush configuration:

{
  "$schema": "https://charm.land/crush.json",
  "mcp": {
    "core": {
      "type": "http",
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Crush",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

Desktop Apps

Install in ChatGPT

Connect ChatGPT to CORE's memory system via browser extension:

  • Install Core Browser Extension
  • Generate API Key: Go to Settings → API Key → Generate new key → Name it "extension"
  • Add API Key in Core Extension and click Save

Install in Gemini

Connect Gemini to CORE's memory system via browser extension:

  • Install Core Browser Extension
  • Generate API Key: Go to Settings → API Key → Generate new key → Name it "extension"
  • Add API Key in Core Extension and click Save

Install in Perplexity Desktop

  • Add in Perplexity → Settings → Connectors → Add Connector → Advanced:
{
  "core-memory": {
    "command": "npx",
    "args": ["-y", "mcp-remote", "https://mcp.getcore.me/api/v1/mcp?source=perplexity"]
  }
}

  • Click Save to apply the changes
  • Core will be available in your Perplexity sessions

Development Tools

Install in Factory

Run in terminal:

droid mcp add core https://mcp.getcore.me/api/v1/mcp?source=Factory --type http --header "Authorization: Bearer YOUR_API_KEY"

Type /mcp within droid to manage servers and view available tools.

Install in Rovo Dev CLI

  • Edit mcp config:
acli rovodev mcp

  • Add to your Rovo Dev MCP configuration:
{
  "mcpServers": {
    "core-memory": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Rovo-Dev"
    }
  }
}

Install in Trae

Add to your Trae MCP configuration:

{
  "mcpServers": {
    "core": {
      "url": "https://mcp.getcore.me/api/v1/mcp?source=Trae"
    }
  }
}

🔨 Available Tools

CORE Memory MCP exposes tools that let LLMs search your memory and add new context to it.

---

How it Works

Memory Ingestion

(diagram: memory ingestion pipeline)

When you save context to CORE, it goes through the four-phase ingestion pipeline shown in the diagram above.

Example: "We wrote CORE in Next.js" becomes:

(diagram: entities and relations extracted from the example statement)
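As a toy illustration of that extraction result (illustrative only, not CORE's actual schema or pipeline), the statement can be broken into two entity nodes joined by a relationship edge that carries provenance and a timestamp:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Edge:
    subject: str       # entity node, e.g. "CORE"
    predicate: str     # relationship between the entities
    obj: str           # entity node, e.g. "Next.js"
    source: str        # where the fact came from (provenance)
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# "We wrote CORE in Next.js" -> one fact edge between two entities
fact = Edge(subject="CORE", predicate="written_in", obj="Next.js", source="chat")
print(fact.subject, fact.predicate, fact.obj)  # CORE written_in Next.js
```

Keeping provenance and time on every edge is what lets later recall answer not just "what is true" but "when and where we said it".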

Memory Recall

(diagram: memory recall flow)

When you query CORE, it runs the recall flow shown in the diagram above.

CORE doesn't just recall facts — it recalls them in context, with time and story, so agents respond the way you would remember.
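A minimal sketch of what "recall with time" means (a toy in-memory store; CORE's real recall combines graph traversal with semantic search):

```python
from datetime import datetime

# (subject, predicate, object, timestamp) fact tuples in a toy store
facts = [
    ("checkout flow", "uses", "Stripe", datetime(2024, 5, 1)),
    ("checkout flow", "migrated_to", "app router", datetime(2024, 6, 10)),
    ("API refactor", "blocked_by", "auth rework", datetime(2024, 6, 2)),
]

def recall(store, entity):
    """Return facts mentioning an entity, newest first, so the agent
    sees the most recent state of the discussion before older history."""
    hits = [f for f in store if entity in (f[0], f[2])]
    return sorted(hits, key=lambda f: f[3], reverse=True)

for s, p, o, t in recall(facts, "checkout flow"):
    print(t.date(), s, p, o)
```

Ordering by time is what makes "where did we leave off on the checkout flow?" return the latest decision first rather than an arbitrary match.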


🛠️ For Agent Builders

Building AI agents? CORE gives you memory infrastructure + integrations infrastructure so you can focus on your agent's logic.
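One way to keep your agent's logic decoupled from the memory backend is to code against a small interface and swap implementations later. Everything below (names, methods) is a hypothetical sketch, not CORE's SDK:

```python
from typing import Protocol

class MemoryStore(Protocol):
    """Hypothetical interface an agent could target."""
    def ingest(self, text: str, source: str) -> None: ...
    def search(self, query: str, limit: int = 5) -> list[str]: ...

class InMemoryStore:
    """Trivial stand-in backend; replace with a real memory provider."""
    def __init__(self) -> None:
        self._items: list[tuple[str, str]] = []

    def ingest(self, text: str, source: str) -> None:
        self._items.append((text, source))

    def search(self, query: str, limit: int = 5) -> list[str]:
        q = query.lower()
        return [t for t, _ in self._items if q in t.lower()][:limit]

def answer(store: MemoryStore, question: str) -> str:
    # Naive keyword lookup on the last word; real agents would
    # pass the whole question to semantic search.
    context = store.search(question.split()[-1])
    return f"context: {context}"

store = InMemoryStore()
store.ingest("Payment service uses event sourcing", source="adr")
print(answer(store, "architecture decisions on payment"))
```

Because `InMemoryStore` only matches the `MemoryStore` protocol structurally, the agent code never needs to change when the backend does.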

What You Get

Memory Infrastructure

Integrations Infrastructure

Example Projects

core-cli — A task manager agent that connects to CORE for memory and syncs with Linear and GitHub Issues.

holo — Turn your CORE memory into a personal website with chat.

Resources

---

🔥 Research Highlights

CORE memory achieves 88.24% average accuracy on the LoCoMo benchmark across all reasoning tasks, significantly outperforming other memory providers.

(chart: benchmark comparison)

| Task Type | Description |
|-----------|-------------|
| Single-hop | Answers based on a single session |
| Multi-hop | Synthesizing info from multiple sessions |
| Open-domain | Integrating user info with external knowledge |
| Temporal reasoning | Time-related cues and sequence understanding |

View benchmark methodology and results →


🔒 Security

CASA Tier 2 Certified — Third-party audited to meet Google's OAuth requirements.

Your data, your control:

For detailed security information, see our Security Policy.

Vulnerability Reporting: harshith@poozle.dev

Documentation

Explore our documentation to get the most out of CORE

🧑‍💻 Support

Have questions or feedback? We're here to help:

Usage Guidelines

Store:

Don't Store:

👥 Contributors
