An Implementation to Build Dynamic AI Systems with the Model Context Protocol (MCP) for Real-Time Resource and Tool Integration
In this tutorial, we explore the Model Context Protocol (MCP) and show how to use it to tackle one of the most pressing challenges in modern AI systems: enabling real-time interaction between AI models and external data or tools. Traditional models operate in isolation, limited to their training data, but through MCP we create a bridge that lets models access live resources, run specialized tools, and adapt dynamically to changing contexts. We walk through building an MCP server and client from scratch, showing how each component contributes to this ecosystem of intelligent collaboration. Check out the FULL CODES here.
import json
import asyncio
from dataclasses import dataclass, asdict
from typing import Dict, List, Any, Optional, Callable
from datetime import datetime
import random

@dataclass
class Resource:
    uri: str
    name: str
    description: str
    mime_type: str
    content: Any = None

@dataclass
class Tool:
    name: str
    description: str
    parameters: Dict[str, Any]
    handler: Optional[Callable] = None

@dataclass
class Message:
    role: str
    content: str
    timestamp: Optional[str] = None

    def __post_init__(self):
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()
We begin by defining the basic building blocks of MCP: resources, tools, and messages. We design these data structures to represent how information flows between AI systems and their external environments in a clean, structured way.
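As a quick standalone check of how these dataclasses behave, the sketch below repeats only the `Message` class from above: `asdict` flattens a message into a plain dictionary for transport, and `__post_init__` stamps a timestamp automatically when none is supplied.

```python
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Optional

@dataclass
class Message:
    role: str
    content: str
    timestamp: Optional[str] = None

    def __post_init__(self):
        # Auto-stamp messages created without an explicit timestamp.
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()

msg = Message(role="user", content="hello")
print(sorted(asdict(msg).keys()))  # ['content', 'role', 'timestamp']
print(msg.timestamp is not None)   # True
```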
class MCPServer:
    def __init__(self, name: str):
        self.name = name
        self.resources: Dict[str, Resource] = {}
        self.tools: Dict[str, Tool] = {}
        self.capabilities = {"resources": True, "tools": True, "prompts": True, "logging": True}
        print(f"✓ MCP Server '{name}' initialized with capabilities: {list(self.capabilities.keys())}")

    def register_resource(self, resource: Resource) -> None:
        self.resources[resource.uri] = resource
        print(f"  → Resource registered: {resource.name} ({resource.uri})")

    def register_tool(self, tool: Tool) -> None:
        self.tools[tool.name] = tool
        print(f"  → Tool registered: {tool.name}")

    async def get_resource(self, uri: str) -> Optional[Resource]:
        await asyncio.sleep(0.1)  # simulate I/O latency
        return self.resources.get(uri)

    async def execute_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Any:
        if tool_name not in self.tools:
            raise ValueError(f"Tool '{tool_name}' not found")
        tool = self.tools[tool_name]
        if tool.handler:
            return await tool.handler(**arguments)
        return {"status": "executed", "tool": tool_name, "args": arguments}

    def list_resources(self) -> List[Dict[str, str]]:
        return [{"uri": r.uri, "name": r.name, "description": r.description} for r in self.resources.values()]

    def list_tools(self) -> List[Dict[str, Any]]:
        return [{"name": t.name, "description": t.description, "parameters": t.parameters} for t in self.tools.values()]
We implement the MCP server that manages resources and tools while handling execution and retrieval operations. We ensure it supports asynchronous interaction, making it efficient and scalable for real-world AI applications.
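The heart of the server is the `execute_tool` dispatch: look up the handler by name, then await it with the supplied arguments unpacked as keyword arguments. A stripped-down sketch (with a hypothetical `echo` tool that is not part of the tutorial's server) shows the pattern in isolation:

```python
import asyncio
from typing import Any, Callable, Dict

async def echo(text: str) -> Dict[str, Any]:
    # Hypothetical tool handler, used only for this illustration.
    return {"echo": text.upper()}

class MiniServer:
    def __init__(self):
        self.tools: Dict[str, Callable] = {"echo": echo}

    async def execute_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Any:
        if tool_name not in self.tools:
            raise ValueError(f"Tool '{tool_name}' not found")
        # Unpack the argument dict as keyword arguments, as the tutorial server does.
        return await self.tools[tool_name](**arguments)

result = asyncio.run(MiniServer().execute_tool("echo", {"text": "hi"}))
print(result)  # {'echo': 'HI'}
```

Keyword unpacking is what lets tool schemas stay declarative: the argument dictionary the client sends maps directly onto the handler's signature.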
class MCPClient:
    def __init__(self, client_id: str):
        self.client_id = client_id
        self.connected_servers: Dict[str, MCPServer] = {}
        self.context: List[Message] = []
        print(f"\n✓ MCP Client '{client_id}' initialized")

    def connect_server(self, server: MCPServer) -> None:
        self.connected_servers[server.name] = server
        print(f"  → Connected to server: {server.name}")

    async def query_resources(self, server_name: str) -> List[Dict[str, str]]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        return self.connected_servers[server_name].list_resources()

    async def fetch_resource(self, server_name: str, uri: str) -> Optional[Resource]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        resource = await server.get_resource(uri)
        if resource:
            self.add_to_context(Message(role="system", content=f"Fetched resource: {resource.name}"))
        return resource

    async def call_tool(self, server_name: str, tool_name: str, **kwargs) -> Any:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        result = await server.execute_tool(tool_name, kwargs)
        self.add_to_context(Message(role="system", content=f"Tool '{tool_name}' executed"))
        return result

    def add_to_context(self, message: Message) -> None:
        self.context.append(message)

    def get_context(self) -> List[Dict[str, Any]]:
        return [asdict(msg) for msg in self.context]
We create the MCP client that connects to the server, queries resources, and executes tools. We maintain a contextual memory of all interactions, enabling continuous, stateful communication with the server.
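The context memory is simply an append-only list of messages that every fetch and tool call writes into. A minimal sketch, assuming the same `Message` shape as above but without timestamps, shows how the window accumulates:

```python
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Message:
    role: str
    content: str

class MiniClient:
    # Minimal sketch of the client's context memory only.
    def __init__(self):
        self.context: List[Message] = []

    def add_to_context(self, message: Message) -> None:
        self.context.append(message)

    def get_context(self):
        return [asdict(m) for m in self.context]

client = MiniClient()
client.add_to_context(Message("system", "Fetched resource: 2024 Sales Data"))
client.add_to_context(Message("system", "Tool 'analyze_sentiment' executed"))
print(len(client.get_context()))        # 2
print(client.get_context()[0]["role"])  # system
```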
async def analyze_sentiment(text: str) -> Dict[str, Any]:
    await asyncio.sleep(0.2)  # simulate model inference latency
    sentiments = ["positive", "negative", "neutral"]
    return {"text": text, "sentiment": random.choice(sentiments), "confidence": round(random.uniform(0.7, 0.99), 2)}

async def summarize_text(text: str, max_length: int = 100) -> Dict[str, Any]:
    await asyncio.sleep(0.15)
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary, "compression_ratio": round(len(summary) / len(text), 2)}

async def search_knowledge(query: str, top_k: int = 3) -> List[Dict[str, Any]]:
    await asyncio.sleep(0.25)
    mock_results = [{"title": f"Result {i+1} for '{query}'", "score": round(random.uniform(0.5, 1.0), 2)} for i in range(top_k)]
    return sorted(mock_results, key=lambda x: x["score"], reverse=True)
We define a set of asynchronous tool handlers, including sentiment analysis, text summarization, and knowledge search. We use them to simulate how the MCP system can execute diverse operations through modular, pluggable tools.
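The summarizer's truncation arithmetic is worth checking in isolation: a 200-character input cut at 47 characters plus the `"..."` suffix yields a 50-character summary and a compression ratio of 0.25. This sketch repeats the same rule as the handler above, minus the simulated latency:

```python
import asyncio
from typing import Any, Dict

async def summarize_text(text: str, max_length: int = 100) -> Dict[str, Any]:
    # Same truncation rule as the tutorial handler, without the sleep.
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary,
            "compression_ratio": round(len(summary) / len(text), 2)}

result = asyncio.run(summarize_text("x" * 200, max_length=47))
print(result["original_length"])    # 200
print(len(result["summary"]))       # 50
print(result["compression_ratio"])  # 0.25
```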
async def run_mcp_demo():
    print("=" * 60)
    print("MODEL CONTEXT PROTOCOL (MCP) - ADVANCED TUTORIAL")
    print("=" * 60)
    print("\n[1] Setting up MCP Server...")
    server = MCPServer("knowledge-server")
    print("\n[2] Registering resources...")
    server.register_resource(Resource(uri="docs://python-guide", name="Python Programming Guide", description="Comprehensive Python documentation", mime_type="text/markdown", content="# Python Guide\nPython is a high-level programming language..."))
    server.register_resource(Resource(uri="data://sales-2024", name="2024 Sales Data", description="Annual sales metrics", mime_type="application/json", content={"q1": 125000, "q2": 142000, "q3": 138000, "q4": 165000}))
    print("\n[3] Registering tools...")
    server.register_tool(Tool(name="analyze_sentiment", description="Analyze sentiment of text", parameters={"text": {"type": "string", "required": True}}, handler=analyze_sentiment))
    server.register_tool(Tool(name="summarize_text", description="Summarize long text", parameters={"text": {"type": "string", "required": True}, "max_length": {"type": "integer", "default": 100}}, handler=summarize_text))
    server.register_tool(Tool(name="search_knowledge", description="Search knowledge base", parameters={"query": {"type": "string", "required": True}, "top_k": {"type": "integer", "default": 3}}, handler=search_knowledge))
    client = MCPClient("demo-client")
    client.connect_server(server)
    print("\n" + "=" * 60)
    print("DEMONSTRATION: MCP IN ACTION")
    print("=" * 60)
    print("\n[Demo 1] Listing available resources...")
    resources = await client.query_resources("knowledge-server")
    for res in resources:
        print(f"  • {res['name']}: {res['description']}")
    print("\n[Demo 2] Fetching sales data resource...")
    sales_resource = await client.fetch_resource("knowledge-server", "data://sales-2024")
    if sales_resource:
        print(f"  Data: {json.dumps(sales_resource.content, indent=2)}")
    print("\n[Demo 3] Analyzing sentiment...")
    sentiment_result = await client.call_tool("knowledge-server", "analyze_sentiment", text="MCP is an amazing protocol for AI integration!")
    print(f"  Result: {json.dumps(sentiment_result, indent=2)}")
    print("\n[Demo 4] Summarizing text...")
    summary_result = await client.call_tool("knowledge-server", "summarize_text", text="The Model Context Protocol enables seamless integration between AI models and external data sources...", max_length=50)
    print(f"  Summary: {summary_result['summary']}")
    print("\n[Demo 5] Searching knowledge base...")
    search_result = await client.call_tool("knowledge-server", "search_knowledge", query="machine learning", top_k=3)
    print("  Top results:")
    for result in search_result:
        print(f"    - {result['title']} (score: {result['score']})")
    print("\n[Demo 6] Current context window...")
    context = client.get_context()
    print(f"  Context size: {len(context)} messages")
    for i, msg in enumerate(context[-3:], 1):
        print(f"    {i}. [{msg['role']}] {msg['content']}")
    print("\n" + "=" * 60)
    print("✓ MCP Tutorial Complete!")
    print("=" * 60)
    print("\nKey Takeaways:")
    print("• MCP enables modular AI-to-resource connections")
    print("• Resources provide context from external sources")
    print("• Tools allow dynamic operations and actions")
    print("• Async design supports efficient I/O operations")

if __name__ == "__main__":
    import sys
    if 'ipykernel' in sys.modules or 'google.colab' in sys.modules:
        # Notebooks already run an event loop, so schedule the coroutine on it
        # instead of calling asyncio.run(), which would raise a RuntimeError.
        asyncio.ensure_future(run_mcp_demo())
    else:
        asyncio.run(run_mcp_demo())
We bring everything together into a complete demonstration in which the client interacts with the server, fetches data, runs tools, and maintains context. We see the full potential of MCP as it seamlessly integrates AI logic with external knowledge and computation.
In conclusion, the uniqueness of the problem we solve here lies in breaking the boundaries of static AI systems. Instead of treating models as closed boxes, we design an architecture that allows them to query, reason, and act on real-world data in structured, context-driven ways. This dynamic interoperability, achieved through the MCP framework, represents a significant shift toward modular, tool-augmented intelligence. By understanding and implementing MCP, we position ourselves to build the next generation of adaptive AI systems that can think, learn, and connect beyond their original confines.
The post An Implementation to Build Dynamic AI Systems with the Model Context Protocol (MCP) for Real-Time Resource and Tool Integration appeared first on MarkTechPost.