Connect AI Context Flow to Any Tool Using MCP Servers

By Hira • Feb 26, 2026

💡 What’s Covered Inside

This article covers how to connect Plurality’s Open Context MCP server to tools like ChatGPT, Claude, Gemini, Lovable, Bolt, Claude Code, GitHub Copilot, and more. The Open Context MCP server lets you create your context once in the memory studio and then keep using and enriching it from any tool or website.

What is an MCP Server?

A Model Context Protocol (MCP) server is a lightweight connector that gives AI agents and assistants access to external tools, data, and memory through a standardized interface. Rather than every AI platform building its own one-off integrations, MCP provides a universal “plug-in” standard, so that once something is exposed as an MCP server, any compatible AI client can connect to it. For users, this means you can wire your own memory, documents, knowledge bases, or tools directly into the AI assistants you already use, letting them work with your personal context instead of starting from scratch every time.

Plurality’s Open Context MCP

AI Context Flow is built on top of the Open Context layer: a user-owned, portable memory layer that stores your context, documents, and notes in a way that travels with you across AI platforms and websites. While the browser-based experience handles context flowing between websites and AI tools automatically, Plurality’s Open Context MCP extends that reach further.

It acts as a connector to the Open Context layer for any environment outside the browser, e.g. desktop agents, CLI tools, coding assistants, or any agent that supports MCP. Plurality’s Open Context MCP is an OAuth-secured MCP server that gives any compatible AI client (Claude Code, Claude Desktop, ChatGPT, Cursor, and more) full read and write access to your Plurality memory, including your documents, notes, and files stored across memory buckets, keeping your context portable, private, and always within reach.

What Operations are Possible?

When you connect Plurality’s Open Context MCP to another tool, you get access to the following operations:

  • get_user_memory_buckets: List all memory buckets (AI profiles) for the user
  • list_items_in_memory_bucket: List stored items in a specific bucket (metadata only)
  • search_memory: Semantic search across buckets with relevance scoring
  • read_context: Read the full content of a stored item, with pagination
  • save_memory: Save text content to a specific memory bucket
  • save_conversation: Save a conversation (chat history) to a memory bucket
  • create_memory_bucket: Create a new memory bucket for organizing saved content
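Under the hood, each of these operations is invoked with a JSON-RPC 2.0 "tools/call" request, as defined by the Model Context Protocol. As a rough sketch (your MCP client builds and sends this for you; the query string here is illustrative), a `search_memory` call looks like:

```python
import json

# Sketch of the JSON-RPC 2.0 "tools/call" request an MCP client sends
# to invoke a server-side tool. The query value is illustrative; the
# exact argument schema comes from the server's tool definitions.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_memory",
        "arguments": {"query": "notes about my current project"},
    },
}

print(json.dumps(request, indent=2))
```

You never write this payload by hand in the tools below; it is shown only to make clear what "connecting an MCP server" actually wires up.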

MCP Integration with Tools

Each tool has its own way of adding MCP servers. Below, we walk through how to connect popular tools to Plurality’s Open Context MCP (or any other MCP server).

ChatGPT

  1. Open Settings → Apps → Create app
  2. Enter a name (e.g. “Plurality Memory”) and paste https://app.plurality.network/mcp as the URL
  3. Save the connector and ChatGPT will discover the OAuth metadata automatically
  4. On first use in a chat, ChatGPT opens a browser window for OAuth login
  5. After authenticating, your Plurality memory is available in your conversations

Note: to add an MCP server, Developer Mode must be enabled by a workspace admin under Settings → Admin → Developer Mode.

ChatGPT MCP Connector

Claude Desktop / Web

Claude supports MCP integration on both free and paid plans, albeit in different ways.

Easy setup (paid plans — Pro, Max, Team, Enterprise):

  1. Open Settings → Connectors
  2. Click Add → paste https://app.plurality.network/mcp
  3. Claude opens a browser window for OAuth login where you must sign in with your Plurality account
  4. Once authenticated, the Plurality tools appear in the chat input

Development mode (free plan — Desktop app only):

Free-plan users can connect the Desktop app via the mcp-remote bridge by editing the config file directly. This does not work with the web app; only the native Desktop app reads this config.

  1. Open the config file:
    • Windows: %APPDATA%\\Claude\\claude_desktop_config.json
    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  2. Add the mcpServers block:
{
  "mcpServers": {
    "plurality-memory": {
      "command": "npx",
      "args": ["mcp-remote", "https://app.plurality.network/mcp"]
    }
  }
}

Windows note: If you get “Connection closed” errors, wrap with cmd /c:

{ "command": "cmd", "args": ["/c", "npx", "mcp-remote", "https://app.plurality.network/mcp"] }
  3. Fully restart Claude Desktop (quit and reopen, not just close the window).
  4. On first use, mcp-remote opens your browser for OAuth login. After authenticating, tokens are cached locally.
  5. Look for the connectors icon in Claude Desktop’s chat input; you should see the Plurality memory connector.

Claude MCP Connector
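If your claude_desktop_config.json already lists other MCP servers, merge the new entry rather than pasting over the whole file. A minimal sketch in Python (the helper name is illustrative; pass it the config path for your OS from step 1):

```python
import json
from pathlib import Path

def add_plurality_server(config_path: Path) -> None:
    """Merge the plurality-memory entry into claude_desktop_config.json
    without clobbering any MCP servers already configured."""
    config = {}
    if config_path.exists():
        config = json.loads(config_path.read_text() or "{}")
    config.setdefault("mcpServers", {})["plurality-memory"] = {
        "command": "npx",
        "args": ["mcp-remote", "https://app.plurality.network/mcp"],
    }
    config_path.write_text(json.dumps(config, indent=2))
```

After running it, fully restart Claude Desktop so the new server is picked up.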

Claude Code

Run the following command in your terminal:

claude mcp add --transport http plurality-memory https://app.plurality.network/mcp

Then authenticate inside Claude Code:

> /mcp

GitHub Copilot (VS Code)

GitHub Copilot supports MCP servers via VS Code’s native MCP configuration, available in VS Code 1.99 and later.

  1. Open VS Code and press Cmd+Shift+P (macOS) or Ctrl+Shift+P (Windows/Linux)
  2. Run MCP: Add Server and select HTTP (HTTP or Server-Sent Events)
  3. Enter https://app.plurality.network/mcp as the server URL
  4. Give the server a name, e.g. plurality-memory
  5. Choose whether to save this to your User settings (all projects) or Workspace settings (project-only)

Alternatively, add directly to your settings.json:

{
  "mcp": {
    "servers": {
      "plurality-memory": {
        "type": "http",
        "url": "https://app.plurality.network/mcp"
      }
    }
  }
}

  6. Open GitHub Copilot Chat and switch to Agent mode; the Plurality Open Context MCP tools will appear in the available tools list.
  7. On first use, VS Code will prompt you to authenticate via OAuth in your browser.

Requires GitHub Copilot subscription and VS Code 1.99+.

Cursor

Cursor supports MCP through a simple JSON config file. You can configure it globally (available in all projects) or per-project.

Global setup: edit ~/.cursor/mcp.json (create it if it doesn’t exist):

{
  "mcpServers": {
    "plurality-memory": {
      "url": "https://app.plurality.network/mcp"
    }
  }
}

Project-level setup: create .cursor/mcp.json in your project root with the same content above.
After saving the config, restart Cursor. Navigate to Settings → MCP to verify the server shows a green active status. On first use, Cursor will open your browser to complete OAuth authentication with your Plurality account.
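If you prefer to script the global setup, the following sketch writes the entry into ~/.cursor/mcp.json while preserving any servers you already have (the function name is illustrative; call it with your home directory):

```python
import json
from pathlib import Path

def write_cursor_config(home: Path) -> Path:
    """Create or update <home>/.cursor/mcp.json with the plurality-memory
    server entry, preserving any servers already configured."""
    path = home / ".cursor" / "mcp.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["plurality-memory"] = {
        "url": "https://app.plurality.network/mcp"
    }
    path.write_text(json.dumps(config, indent=2))
    return path
```

For the project-level variant, point it at your project root and rename the target to .cursor/mcp.json inside that directory; the JSON content is identical.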

Windsurf

  1. Edit (or create) ~/.codeium/windsurf/mcp_config.json:

    { "mcpServers": { "plurality-memory": { "serverUrl": "https://app.plurality.network/mcp" } } }
  2. Restart Windsurf to pick up the new config.
  3. Open the Cascade panel; the Plurality Open Context MCP tools will be available to the AI agent.
  4. On first use, Windsurf will open your browser for OAuth login.

LM Studio

LM Studio supports MCP from version 0.3.5 onward, allowing locally-running models to call external tools like Plurality’s Open Context memory.

  1. Open LM Studio and navigate to the Developer tab (enable it in Settings → Advanced if not visible)
  2. Click Add MCP Server
  3. Enter the server URL: https://app.plurality.network/mcp
  4. Give it a label, e.g. Plurality Open Context
  5. LM Studio will perform the OAuth handshake — authenticate in the browser window that opens
  6. Switch to the Chat tab, start a session with any loaded model, and enable tools in the chat toolbar

MCP tool calling performance depends on the local model’s instruction-following capability. Models fine-tuned for tool use (e.g. Mistral, Llama 3.1+, Qwen2.5) will get the best results.

Lovable

Lovable supports MCP integration (on its paid plan) directly within its builder environment, letting your AI-generated apps read and write to your Plurality memory as part of the build process.

  1. Open your Lovable project
  2. Navigate to Settings → Connectors → Personal Connectors → New MCP Servers
  3. Click Connect MCP and paste https://app.plurality.network/mcp
  4. Authenticate via the OAuth flow that opens in your browser
  5. Once connected, you can reference your Plurality memory and documents directly in Lovable prompts

This is particularly powerful for building personalized apps, as your context, documents, and notes become live data sources for whatever you’re building.

Lovable MCP configuration

Replit

  1. Open a Replit project and start the Agent
  2. Click the Tools icon in the agent panel
  3. Select Add tool → MCP Server
  4. Enter https://app.plurality.network/mcp and confirm
  5. Complete the OAuth authentication in the browser prompt
  6. Plurality Open Context tools are now available to the agent when generating or editing your code

Other MCP Clients

Any MCP client that supports streamable HTTP transport and OAuth2 with Dynamic Client Registration (DCR) can connect by pointing to: https://app.plurality.network/mcp 
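For reference, a connecting client opens its session with a JSON-RPC "initialize" request before any tool calls. A sketch of that first message (the protocol version and clientInfo values are illustrative; an MCP client library normally sends this for you):

```python
import json

# Sketch of the JSON-RPC "initialize" request that opens an MCP session
# over streamable HTTP. Values under "params" are illustrative.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

print(json.dumps(initialize))
```

If a client fails before this handshake completes, the problem is usually the transport (the server requires streamable HTTP) or the OAuth/DCR step, not the tools themselves.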

Your Memory, Your Rules

Most AI tools treat your context as ephemeral: useful for this session, gone by the next. Plurality flips that. With Plurality’s Open Context MCP, your memory is a first-class, portable asset that moves with you: from your browser to your IDE, from a chat assistant to a coding agent, from one platform to another.

Write once. Remember everywhere. That’s the point.

Frequently Asked Questions

What is Plurality's Open Context MCP?

Plurality’s Open Context MCP is an OAuth-secured MCP (Model Context Protocol) server that connects any compatible AI tool to your Plurality memory layer. It gives tools like ChatGPT, Claude, Cursor, and GitHub Copilot read and write access to your documents, notes, and conversation history stored in Plurality.

Which tools can connect to it?

Any AI tool that supports MCP with streamable HTTP transport and OAuth2 with Dynamic Client Registration (DCR) can connect. Currently supported tools include ChatGPT, Claude Desktop, Claude Code, GitHub Copilot (VS Code), Cursor, Windsurf, LM Studio, Lovable, and Replit.

Do I need to install anything?

For most tools (ChatGPT, Claude paid plans, Cursor, Windsurf, Lovable, Replit), no installation is required. You just provide the server URL and authenticate. For Claude Desktop on a free plan, you’ll need Node.js installed to run mcp-remote.

Does it work on free plans?

It depends on the tool. Claude Desktop (free plan) supports it via the mcp-remote config method. ChatGPT requires a paid plan (Plus, Pro, Team, Enterprise, or Edu). GitHub Copilot requires a paid subscription. For most coding tools like Cursor, Windsurf, and LM Studio, no paid plan is required for MCP support.

What is a memory bucket?

A memory bucket is an organized container within your Plurality memory layer. You can think of it as a folder for a specific AI profile, project, or context. It can store documents, notes, conversations, and files that are relevant to that use case. You can create as many buckets as you need and control which tools have access to them.

What is the Open Context layer?

The Open Context layer is the underlying memory infrastructure that Plurality’s Open Context MCP connects to. It’s a user-owned, portable memory store where your context, documents, and notes live. Unlike platform-specific memory features (e.g. ChatGPT’s memory or Claude’s Projects), Open Context is not tied to any single AI tool. It travels with you across all compatible platforms.

Is my data secure?

Yes. Authentication is handled via OAuth2, meaning no AI tool ever handles your Plurality credentials directly. Each tool is granted access only after you explicitly authorize it through a browser-based login flow. Tokens are cached locally on your device.

Does my memory stay in sync across tools?

Yes. Because all tools connect to the same Open Context layer, they all read from and write to the same memory buckets. This means context you save in Cursor is immediately available in Claude or ChatGPT without any manual syncing.

Can I use it with local models?

Yes, via LM Studio (version 0.3.5 or later). LM Studio lets locally-running models call external MCP tools, including Plurality’s Open Context MCP. Performance depends on the local model’s ability to follow tool-calling instructions. Models like Mistral, Llama 3.1+, and Qwen2.5 work best.

Why isn’t the connector showing up after setup?

The most common cause is not fully restarting the tool after adding the config. For Claude Desktop and Cursor, a full quit-and-reopen (not just closing the window) is required. For VS Code, try reloading the window. Also confirm the server URL is exactly https://app.plurality.network/mcp with no trailing slash or extra characters.

How is this different from ChatGPT’s or Claude’s built-in memory?

Platform-native memory features are siloed. ChatGPT’s memory only works in ChatGPT, and Claude’s Projects only work in Claude. Plurality’s Open Context layer is platform-agnostic: the same memory is accessible from every tool you connect, and you own the data regardless of which platform you’re using.
