Creating MCP Clients in Swift: Integrating Model Context Protocol

5 min read · Mar 24, 2025

In this post, we’ll explore how to integrate the Model Context Protocol (MCP) into Swift applications so that AI models can access local and remote resources. I’ll show you how to create your own MCP clients and add them to your applications.

What is MCP?

The Model Context Protocol (MCP) is an open protocol that allows AI models to interact with local and remote resources through standardized server implementations. Think of it as a bridge that lets AI models discover and use tools in your application.

At its core, MCP enables:

  • Tool discovery — AI models can find out what tools are available
  • Parameter-based function calls — Models can call these tools with specific inputs
  • Standardized response handling — Results are returned in a consistent format

For iOS engineers, this means you can create applications where an AI assistant can access files, make API calls, or run code in a controlled manner.
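To make that concrete, an MCP tool is essentially a name, a human-readable description, and a JSON Schema describing its inputs. The sketch below only illustrates that shape; the type and field names are hypothetical, not the SDK’s, and the schema is simplified to a flat dictionary:

// Illustrative only: a minimal shape mirroring what an MCP tool definition carries.
// The real types come from the MCP SDK; these names are hypothetical.
struct ToolDefinition {
    let name: String                  // e.g. "create_issue"
    let description: String           // what the model reads to decide when to call the tool
    let inputSchema: [String: String] // full JSON Schema in real MCP; simplified here
}

let exampleTool = ToolDefinition(
    name: "create_issue",
    description: "Create a GitHub issue in a given repository",
    inputSchema: ["repo": "string", "title": "string", "body": "string"]
)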

Why Use MCP?

The Model Context Protocol (MCP) streamlines the integration of AI models with external tools and data sources, offering several key advantages:

  • Pre-built integrations: Access a growing ecosystem of ready-to-use server implementations that your AI models can immediately leverage, reducing development time
  • Provider flexibility: Easily switch between different AI vendors and providers while maintaining the same tool integrations and workflows
  • Standardized interfaces: Use a consistent protocol for different data sources and tools, eliminating the need to build custom integrations for each service
  • Separation of concerns: Maintain clean boundaries between your application, AI models, and external services with a well-defined architecture
  • Reusable components: Create tools once and use them across different AI-powered applications, regardless of which model provider you’re using

Without a standardized protocol like MCP, you would need to:

  1. Create custom integration code for each AI service and data source combination
  2. Develop different interfaces for similar functionality across providers
  3. Rebuild these integrations whenever APIs change
  4. Manage complex dependencies between your application and external services

Getting Started with MCPSwiftWrapper

I’ve created a lightweight wrapper called MCPSwiftWrapper around the excellent mcp-swift-sdk package. It integrates with my other libraries, SwiftOpenAI and SwiftAnthropic, to enable seamless tool usage with both providers.

Prerequisites

Before you begin, you’ll need:

  • macOS 14.0+ and Xcode 16.0+
  • npm/npx installed on your system
  • Swift 6.0+

Installation

Add the package to your project using Swift Package Manager:

dependencies: [
    .package(url: "https://github.com/jamesrochabrun/MCPSwiftWrapper", from: "1.0.0")
]

Or add it directly in Xcode through File > Add Package Dependencies.
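If you manage dependencies through Package.swift, a minimal manifest might look like the sketch below. The target name and the assumption that the product name matches the package name are mine, so adjust them to your project:

// swift-tools-version:6.0
// Package.swift sketch — product name assumed to match the package name.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.macOS(.v14)],
    dependencies: [
        .package(url: "https://github.com/jamesrochabrun/MCPSwiftWrapper", from: "1.0.0")
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            dependencies: [
                .product(name: "MCPSwiftWrapper", package: "MCPSwiftWrapper")
            ]
        )
    ]
)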

⚠️ Important: When building a macOS client app, you need to disable App Sandbox (for example, by removing the App Sandbox capability under Signing & Capabilities), since the app must spawn a subprocess for each MCP server.

Creating Your First MCP Client

Let’s create a simple GitHub MCP client that allows AI models to interact with GitHub repositories. Here’s how to do it step by step:

Step 1: Import the necessary libraries

import MCPSwiftWrapper
import Foundation

Step 2: Create your client

let githubClient = try await MCPClient(
  info: .init(name: "GithubMCPClient", version: "1.0.0"),
  transport: .stdioProcess(
    "npx",
    args: ["-y", "@modelcontextprotocol/server-github"],
    verbose: true), // Set verbose to true if you want to see the logs
  capabilities: .init())

This creates a client that launches the @modelcontextprotocol/server-github npm package and talks to it over stdio for GitHub interaction via MCP. This example uses npx, but you can also use uvx if it is installed, as in the sketch below.
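For instance, the reference fetch server is distributed as a Python package that uvx can run directly. The client name here is illustrative, and the sketch assumes uvx is on your PATH:

// Same pattern with uvx instead of npx (assumes uvx is installed and on your PATH).
let fetchClient = try await MCPClient(
  info: .init(name: "FetchMCPClient", version: "1.0.0"),
  transport: .stdioProcess(
    "uvx",
    args: ["mcp-server-fetch"],
    verbose: true),
  capabilities: .init())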

Using the MCP Client with AI Models

After creating your client, you can use it with different AI models. The MCPSwiftWrapper package provides convenient extensions for both Anthropic and OpenAI models.

Using with Anthropic’s Claude

// Get available tools from MCP and convert them to Anthropic format
let tools = try await githubClient.anthropicTools()

// Create Anthropic parameters with the tools
let parameters = AnthropicParameters(
  model: .claude37Sonnet,
  messages: [AnthropicMessage(role: .user, content: .text("Help me check the issues in my GitHub repo"))],
  maxTokens: 1000,
  tools: tools
)

// Make the API call
let response = try await anthropicService.createMessage(parameters)

// Handle tool calls from Claude
for contentItem in response.content {
  switch contentItem {
  case .text(let text, _):
    print("Claude says: \(text)")

  case .toolUse(let tool):
    print("Claude wants to use tool: \(tool.name)")

    // Call the tool via MCP
    let toolResponse = await githubClient.anthropicCallTool(
      name: tool.name,
      input: tool.input,
      debug: true
    )

    // Send the tool result back to Claude
    if let toolResult = toolResponse {
      // Continue conversation with tool result...
    }

  case .thinking:
    break
  }
}

Using with OpenAI

MCPSwiftWrapper also works seamlessly with OpenAI models:

// Get MCP tools in OpenAI format
let tools = try await githubClient.openAITools()

// Create OpenAI parameters with tools
let parameters = OpenAIParameters(
  messages: [OpenAIMessage(role: .user, content: .text("Check my GitHub repo for issues"))],
  model: .gpt4o,
  toolChoice: .auto,
  tools: tools
)

// Make the API call
let response = try await openAIService.startChat(parameters: parameters)

// Handle tool calls if any
if let toolCalls = response.choices?.first?.message.toolCalls, !toolCalls.isEmpty {
  for toolCall in toolCalls {
    let function = toolCall.function
    let toolName = function.name ?? ""

    // Parse arguments as a dictionary
    let arguments = try JSONSerialization.jsonObject(
      with: function.arguments.data(using: .utf8) ?? Data(),
      options: []
    ) as? [String: Any] ?? [:]

    // Call the tool via MCP
    let toolResponse = await githubClient.openAICallTool(
      name: toolName,
      input: arguments,
      debug: true
    )

    // Send result back to OpenAI...
  }
}

Chat Example

For a complete implementation, check out the example application included in the MCPSwiftWrapper package under Example/MCPClientChat. It demonstrates:

  1. Managing chat state with both OpenAI and Anthropic models
  2. Handling tool calls and responses
  3. Building a complete UI for conversational interactions

Available MCP Clients

The Model Context Protocol servers repository provides a variety of MCP implementations you can use, such as GitHub, file system access, and more.
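For example, the filesystem server can be wired up with the same stdio transport used above. The client name and the allowed directory path below are placeholders:

// Filesystem MCP server via npx; the allowed directory argument is a placeholder path.
let filesystemClient = try await MCPClient(
  info: .init(name: "FilesystemMCPClient", version: "1.0.0"),
  transport: .stdioProcess(
    "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"],
    verbose: true),
  capabilities: .init())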

You can even use Claude Code as an MCP server! Here’s how to set it up:

let claudeCodeClient = try await MCPClient(
  info: .init(name: "ClaudeCodeMCPClient", version: "1.0.0"),
  transport: .stdioProcess(
    "claude",
    args: ["mcp", "serve"],
    verbose: true),
  capabilities: .init())

This follows the Claude Code MCP setup instructions and enables Claude to execute code directly from your application.
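Once connected, this client behaves like any other MCP client in the wrapper. For instance, a quick sanity check might list its tools in Anthropic format, just as in the GitHub example above:

// The Claude Code client exposes its tools like any other MCP server.
let claudeCodeTools = try await claudeCodeClient.anthropicTools()
print("Claude Code exposes \(claudeCodeTools.count) tools")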

Conclusion

The Model Context Protocol opens up powerful capabilities for AI integration in your Swift applications. With MCPSwiftWrapper, you can easily bridge your app with AI models while following standardized patterns and avoiding duplicated integration work.

By following the steps in this post, you can quickly add AI capabilities that extend beyond conversation to actually interacting with systems and resources. This enables creating truly useful AI-powered applications that can help users accomplish real tasks.

If you found this helpful, check out my other tutorials on SwiftOpenAI, SwiftAnthropic, and more on my GitHub.
