Beyond Claude: Building MCP Tools That Work Everywhere
Anthropic created MCP, but the protocol is open. Here's how to build tools that outlast any single AI system.
The Multi-Model Future
Today's reality:
- Claude is great at reasoning and analysis
- GPT-4 excels at certain creative tasks
- Open models offer deployment flexibility
- Specialized models handle domain-specific work
Tomorrow's reality:
- Teams will use multiple AI systems
- Different tasks route to different models
- Context needs to be available to all of them
Why Protocol-First Matters
The Vendor Lock-In Trap
// Tightly coupled to Claude
import Anthropic from '@anthropic-ai/sdk'

const anthropic = new Anthropic()
const response = await anthropic.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  system: "You have access to our CRM via custom functions...",
  tools: [{ name: "get_customer", claude_specific_format: true }]
})
// Problem: Switch models = rebuild everything
The Protocol-First Approach
// MCP Server (model-agnostic)
server.exposeResource("customers", customerData)
server.exposeTool("get_customer", getCustomerFn)
// Any MCP client can connect
// Claude Desktop? Works.
// Future GPT integration? Works.
// Open source agent? Works.
// Your custom app? Works.
Building for Portability
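To make the portability idea concrete, here is a minimal, dependency-free sketch of that registration pattern. The `Server` class is illustrative only — the real MCP SDK (`@modelcontextprotocol/sdk`) has a richer API — but it shows the key property: every client goes through the same generic entry points, so nothing about a tool's registration is specific to any one AI.

```typescript
// Illustrative stand-in for an MCP-style server (NOT the official SDK).
type ToolFn = (input: unknown) => Promise<unknown> | unknown;

class Server {
  private tools = new Map<string, ToolFn>();
  private resources = new Map<string, unknown>();

  exposeTool(name: string, fn: ToolFn): void {
    this.tools.set(name, fn);
  }

  exposeResource(name: string, data: unknown): void {
    this.resources.set(name, data);
  }

  // Any client — Claude Desktop, a GPT bridge, a custom agent —
  // discovers and invokes tools through the same two methods.
  listTools(): string[] {
    return [...this.tools.keys()];
  }

  async callTool(name: string, input: unknown): Promise<unknown> {
    const fn = this.tools.get(name);
    if (!fn) throw new Error(`Unknown tool: ${name}`);
    return fn(input);
  }
}

const server = new Server();
server.exposeResource("customers", [{ id: 1, name: "Acme Corp" }]);
server.exposeTool("get_customer", (input) => {
  const { id } = input as { id: number };
  return { id, name: "Acme Corp" }; // stub lookup for illustration
});
```

Because clients only ever see `listTools` and `callTool`, swapping the AI on the other end changes nothing on the server side.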
Principle 1: Clean Separation
// BAD: AI-specific logic in your server
server.exposeTool("analyze", async (input) => {
// Assume Claude's response format
return { anthropic_format: true, result: ... }
})
// GOOD: Standard formats, let the client adapt
server.exposeTool("analyze", async (input) => {
const result = await yourAnalysisLogic(input)
return {
type: "analysis_result",
data: result,
format: "application/json"
}
})
Principle 2: Rich Metadata
// Help ANY AI understand what you provide
server.exposeResource("quarterly_metrics", {
uri: "metrics://q4-2024",
name: "Q4 2024 Metrics",
description: "Revenue, churn, NPS, and growth metrics for Q4 2024. Updated weekly. Contains sensitive financial data.",
mimeType: "application/json",
schema: {
type: "object",
properties: {
revenue: { type: "number", description: "Total revenue in USD" },
churn_rate: { type: "number", description: "Monthly churn percentage" },
nps: { type: "integer", description: "Net Promoter Score (-100 to 100)" }
}
}
})
Principle 3: Standard Schemas
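The payoff of standard schemas is that validation becomes mechanical: any client can check data against them with no AI-specific knowledge. Here is a deliberately tiny, hand-rolled check against the metrics schema from Principle 2 — a sketch only; real clients would use a full JSON Schema validator such as Ajv.

```typescript
// Hand-rolled sketch of schema checking (use Ajv or similar in practice).
interface PropertySchema {
  type: "number" | "integer" | "string";
  description?: string;
}

// The property definitions from the quarterly_metrics resource above.
const metricsSchema: Record<string, PropertySchema> = {
  revenue: { type: "number", description: "Total revenue in USD" },
  churn_rate: { type: "number", description: "Monthly churn percentage" },
  nps: { type: "integer", description: "Net Promoter Score (-100 to 100)" },
};

function validate(data: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const [key, prop] of Object.entries(metricsSchema)) {
    const value = data[key];
    if (typeof value !== "number") {
      errors.push(`${key}: expected ${prop.type}, got ${typeof value}`);
    } else if (prop.type === "integer" && !Number.isInteger(value)) {
      errors.push(`${key}: expected integer, got ${value}`);
    }
  }
  return errors;
}
```

A custom format would force every client to ship bespoke validation logic; a standard schema makes this loop (or a library call) work everywhere.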
// Use widely-understood formats
{
// Schema.org for entities
"@type": "Organization",
"name": "Acme Corp",
"industry": "Technology"
}
{
// JSON Schema for tools
"inputSchema": {
"type": "object",
"properties": {...},
"required": [...]
}
}
// Avoid: Custom formats that only one AI understands
The Ecosystem Is Growing
Current MCP Clients
- Claude Desktop: Native support
- Claude Code: CLI with MCP
- Zed Editor: MCP integration
- Continue.dev: IDE plugin with MCP
Emerging Patterns
- OpenAI's function calling → MCP adapters
- LangChain tool integration → MCP wrappers
- Custom agent frameworks → MCP as standard interface
Future-Proofing Your Investment
// Your MCP server becomes more valuable over time
Year 1: Claude Desktop uses it
Year 2: GPT integration added, works immediately
Year 3: Internal agent system connects, no changes needed
Year 4: Partner's AI system integrates, just shares the connection
// One investment, compounding returns
Building Bridges
MCP to OpenAI Functions
// Adapter that converts MCP tools to OpenAI format
const mcpToOpenAI = (mcpTool) => ({
type: "function",
function: {
name: mcpTool.name,
description: mcpTool.description,
parameters: mcpTool.inputSchema
}
})
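Applied to a sample tool definition, the adapter produces a spec in OpenAI's tools format. This sketch is self-contained (the adapter is repeated, with an assumed shape for the MCP tool definition), so you can run it standalone:

```typescript
// Assumed shape of an MCP tool definition, for illustration.
interface McpToolDef {
  name: string;
  description: string;
  inputSchema: object;
}

// The adapter from above, repeated so this snippet runs on its own.
const mcpToOpenAI = (mcpTool: McpToolDef) => ({
  type: "function" as const,
  function: {
    name: mcpTool.name,
    description: mcpTool.description,
    parameters: mcpTool.inputSchema,
  },
});

// A sample MCP tool definition (hypothetical)
const getCustomer: McpToolDef = {
  name: "get_customer",
  description: "Look up a customer record by ID",
  inputSchema: {
    type: "object",
    properties: { id: { type: "integer" } },
    required: ["id"],
  },
};

const openAiTool = mcpToOpenAI(getCustomer);
// openAiTool.function.parameters is the untouched JSON Schema —
// exactly because Principle 3 kept the schema standard.
```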
// Your MCP server works with GPT too
MCP to LangChain
// LangChain tool wrapper
import { Tool } from "langchain/tools"

class MCPTool extends Tool {
  name
  description

  constructor(mcpClient, toolName, description) {
    super()
    this.mcpClient = mcpClient
    this.name = toolName           // LangChain requires a tool name...
    this.description = description // ...and a description for the LLM
  }

  async _call(input) {
    // LangChain expects a string result
    const result = await this.mcpClient.callTool(this.name, input)
    return typeof result === "string" ? result : JSON.stringify(result)
  }
}
The Interoperability Bet
Two possible futures:
Future A: Fragmentation
- Every AI has its own integration format
- You maintain N different implementations
- Switching costs keep you locked in
Future B: Standardization
- MCP (or similar) becomes the standard
- One implementation works everywhere
- Competition happens on AI quality, not lock-in
Building on MCP is a bet on Future B. But even if Future A happens, your MCP server's logic remains reusable: the transport layer changes, while the business logic doesn't.
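That hedge can be sketched directly: keep the business logic as plain functions, and let each protocol get only a thin adapter. Everything below is hypothetical naming, but it shows which layer would actually change if the protocol did.

```typescript
// --- Core business logic: knows nothing about any protocol ---
function analyzeChurn(rates: number[]): { mean: number; trend: "up" | "down" | "flat" } {
  const mean = rates.reduce((a, b) => a + b, 0) / rates.length;
  const last = rates[rates.length - 1];
  const trend = last > mean ? "up" : last < mean ? "down" : "flat";
  return { mean, trend };
}

// --- Thin MCP-shaped adapter (sketch) ---
const asMcpTool = {
  name: "analyze_churn",
  handler: (input: { rates: number[] }) => analyzeChurn(input.rates),
};

// --- Thin OpenAI-function-shaped adapter (sketch) ---
const asOpenAiFunction = {
  type: "function" as const,
  function: { name: "analyze_churn" },
  invoke: (args: { rates: number[] }) => analyzeChurn(args.rates),
};
```

If MCP were replaced tomorrow, only the dozen adapter lines would be rewritten; `analyzeChurn` and everything like it survives untouched.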
Build for portability. The AI landscape is shifting, but good abstractions outlast any single provider.
Protocol-Native Context
Xtended's MCP server works with any compliant AI system. Your context, available everywhere.
Try Universal Context