Why Your MCP App Feels Slower on ChatGPT Than Claude

Source: DEV Community
TL;DR: ChatGPT creates a new MCP session for every tool call; Claude reuses one. That means extra handshake overhead on every turn, and your in-memory state gets thrown away.

The Discovery

We're building mcpr, an open-source proxy for MCP servers. Our cloud dashboard tracks every MCP request at the protocol level, including session lifecycle. While monitoring a production MCP server that serves both ChatGPT and Claude, we noticed something odd in the session data:

- ChatGPT (openai-mcp): 2 tool calls → 2 separate sessions
- Claude (claude-ai): 2 tool calls → 1 session

Same server. Same tools. Completely different session behavior.

What's Actually Happening

Here's the timeline from our dashboard for a simple interaction where the AI calls two tools:

ChatGPT

```
Session 1:
04:02:40 PM  initialize                           3ms   ok
04:02:40 PM  tools/call create_matching_question  12ms  ok
── session ended ──

Session 2:
04:03:47 PM  initialize                           3ms   ok
04:03:47 PM  tools/call submit_answer             12ms  ok
── session ended ──
```

Two sessions
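The consequence of the two lifecycles can be sketched in a few lines. This is a toy model (the class and function names are invented for illustration, not the mcpr or MCP SDK API): a server that keeps per-session in-memory state and counts initialize handshakes, queried by a client that re-initializes before every call versus one that reuses a single session.

```python
class ToyMCPServer:
    """Toy server: per-session in-memory state, counts initialize handshakes."""

    def __init__(self):
        self.handshakes = 0
        self.sessions = {}   # session_id -> per-session state
        self._next_id = 0

    def initialize(self):
        # Every new MCP session begins with an initialize handshake.
        self.handshakes += 1
        self._next_id += 1
        self.sessions[self._next_id] = {"calls": []}
        return self._next_id

    def tools_call(self, session_id, tool):
        # Record the call in this session's in-memory state and report
        # how much history the session has accumulated.
        state = self.sessions[session_id]
        state["calls"].append(tool)
        return len(state["calls"])


def chatgpt_style(server, tools):
    # New session per tool call: a handshake each time, state starts fresh.
    return [server.tools_call(server.initialize(), t) for t in tools]


def claude_style(server, tools):
    # One session reused across calls: a single handshake, state accumulates.
    sid = server.initialize()
    return [server.tools_call(sid, t) for t in tools]


tools = ["create_matching_question", "submit_answer"]

s1 = ToyMCPServer()
print(chatgpt_style(s1, tools), s1.handshakes)  # [1, 1] 2 -- state reset each call

s2 = ToyMCPServer()
print(claude_style(s2, tools), s2.handshakes)   # [1, 2] 1 -- state carries over
```

The per-call client pays a handshake on every turn and never sees more than one call's worth of state, which is exactly the pattern in the dashboard timeline above: any state your server keeps in memory must be persisted externally (keyed by something other than the session) if it needs to survive between ChatGPT tool calls.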