Terminal-based Chat Client with MCP Server Integration
This project demonstrates how to build a terminal-based chat client interface that connects to an MCP server and integrates with OpenAI's API. It includes a simple weather service as an example of MCP functionality.
Prerequisites
- Python 3.8 or higher
- UV package manager (a fast, reliable Python package installer and resolver)
Installation
1. Install UV
UV is a modern Python package manager that offers significant performance improvements over traditional tools like pip. It's written in Rust and provides:
- Faster package installation
- Reliable dependency resolution
- Built-in virtual environment management
- Compatibility with existing Python tooling
To install UV, run:
curl -LsSf https://astral.sh/uv/install.sh | sh
2. Project Setup
- Initialize a new project:
uv init
- Create and activate a virtual environment:
uv venv
source .venv/bin/activate # On Unix/macOS
# or
.venv\Scripts\activate # On Windows
- Install required packages:
uv pip install httpx "mcp[cli]" openai python-dotenv
Project Structure and Implementation Guide
The project consists of two main components: a chat client (client.py) and a weather service (weather.py). Let's walk through how each component was built and what each part does.
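For orientation, a minimal layout for the finished project might look like this (assuming both scripts sit in the project root, next to the .env file described under Usage):

```
.
├── .env          # OPENAI_API_KEY=...
├── client.py     # terminal chat client (MCP client + OpenAI)
└── weather.py    # weather service (MCP server)
```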
Building the Chat Client (client.py)
The chat client is built as an asynchronous Python application that connects to both an MCP server and OpenAI's API. Here's how it was constructed:
1. Imports and Setup

```python
import asyncio
import os
import sys
from typing import Optional
from contextlib import AsyncExitStack

from dotenv import load_dotenv
import openai
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
```
- `asyncio`: For asynchronous programming
- `AsyncExitStack`: Manages cleanup of async resources
- `dotenv`: Loads environment variables from the .env file
- `mcp`: Core MCP functionality for server communication
2. MCPClient Class

The main client class handles:
- Connection to the MCP server
- OpenAI API integration
- Message processing
- Tool execution

Key methods:
- `connect_to_server()`: Establishes the connection to the MCP server
- `process_query()`: Handles user queries and tool execution
- `chat_loop()`: Manages the interactive chat session
- `cleanup()`: Ensures proper resource cleanup
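The skeleton below is a hedged sketch of how these methods typically fit together, following the standard MCP Python SDK stdio-client pattern; the actual bodies in client.py may differ, and the model name and tool-schema conversion shown here are assumptions rather than details taken from this project:

```python
import json  # needed to decode OpenAI tool-call arguments (in addition to the imports above)

class MCPClient:
    def __init__(self):
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        # Assumes load_dotenv() has already run, so the key is in the environment
        self.openai = openai.OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

    async def connect_to_server(self, server_script_path: str):
        # Launch the server script as a subprocess and speak MCP over stdio
        params = StdioServerParameters(command="python", args=[server_script_path])
        read, write = await self.exit_stack.enter_async_context(stdio_client(params))
        self.session = await self.exit_stack.enter_async_context(ClientSession(read, write))
        await self.session.initialize()
        tools = (await self.session.list_tools()).tools
        print("Connected. Available tools:", [t.name for t in tools])

    async def process_query(self, query: str) -> str:
        # Advertise the MCP tools to OpenAI as function-calling tools and let
        # the model decide whether to call one.
        tools = (await self.session.list_tools()).tools
        openai_tools = [{
            "type": "function",
            "function": {
                "name": t.name,
                "description": t.description,
                "parameters": t.inputSchema,
            },
        } for t in tools]
        response = self.openai.chat.completions.create(
            model="gpt-4o-mini",  # assumed model name; substitute the model you use
            messages=[{"role": "user", "content": query}],
            tools=openai_tools,
        )
        message = response.choices[0].message
        if message.tool_calls:
            call = message.tool_calls[0]
            result = await self.session.call_tool(
                call.function.name, json.loads(call.function.arguments)
            )
            return result.content[0].text
        return message.content or ""

    async def chat_loop(self):
        while True:
            query = input("\nQuery (or 'quit' to exit): ").strip()
            if query.lower() == "quit":
                break
            print(await self.process_query(query))

    async def cleanup(self):
        await self.exit_stack.aclose()
```

The key idea is that the MCP tool list is translated into OpenAI function-calling tools, so the model can request a tool call that the client then forwards to the MCP server via `session.call_tool()`.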
3. Main Function

```python
async def main():
    client = MCPClient()
    try:
        await client.connect_to_server(sys.argv[1])
        await client.chat_loop()
    finally:
        await client.cleanup()
```
- Entry point that initializes the client
- Connects to the specified server
- Runs the chat loop
- Ensures proper cleanup
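The snippet above stops at the `main()` coroutine; presumably the script also has a standard entry point that runs it with `asyncio.run()`. A minimal version (an assumption, since it is not shown here) would be:

```python
if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python client.py <path_to_server_script>")
        sys.exit(1)
    asyncio.run(main())
```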
Building the Weather Service (weather.py)
The weather service is built as an MCP server that provides weather information through the National Weather Service API:
1. Service Initialization

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")
```
- Creates an MCP server instance named "weather"
- Sets up the server infrastructure
2. API Integration

```python
NWS_API_BASE = "https://api.weather.gov"
USER_AGENT = "weather-app/1.0"
```
- Defines constants for the National Weather Service API
- Sets up proper user agent for API requests
3. Helper Functions

- `make_nws_request()`: Handles API requests with proper error handling
- `format_alert()`: Formats weather alerts into readable text
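As a rough sketch of what these helpers usually look like in an NWS-based MCP weather service (the exact field names and error handling in weather.py may differ):

```python
from typing import Optional

import httpx

async def make_nws_request(url: str) -> Optional[dict]:
    """Fetch JSON from the NWS API, returning None on any failure."""
    headers = {"User-Agent": USER_AGENT, "Accept": "application/geo+json"}
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(url, headers=headers, timeout=30.0)
            response.raise_for_status()
            return response.json()
        except Exception:
            return None

def format_alert(feature: dict) -> str:
    """Turn a single alert 'feature' from the NWS response into readable text."""
    props = feature["properties"]
    return (
        f"Event: {props.get('event', 'Unknown')}\n"
        f"Area: {props.get('areaDesc', 'Unknown')}\n"
        f"Severity: {props.get('severity', 'Unknown')}\n"
        f"Description: {props.get('description', 'No description available')}"
    )
```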
4. MCP Tools

Two main tools are implemented:

a. `get_alerts(state)`:
- Fetches active weather alerts for a US state
- Returns formatted alert information

b. `get_forecast(latitude, longitude)`:
- Retrieves weather forecast for a location
- Returns detailed forecast information
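Each tool is registered with FastMCP's `@mcp.tool()` decorator so the client can discover and call it. Here is a sketch of `get_alerts`, assuming the NWS active-alerts endpoint and the helper functions above (the wording of the fallback messages in weather.py may differ):

```python
@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get active weather alerts for a US state (two-letter code, e.g. CA or NY)."""
    data = await make_nws_request(f"{NWS_API_BASE}/alerts/active/area/{state}")
    if not data or "features" not in data:
        return "Unable to fetch alerts or no alerts found."
    if not data["features"]:
        return "No active alerts for this state."
    return "\n---\n".join(format_alert(feature) for feature in data["features"])
```

`get_forecast(latitude, longitude)` follows the same pattern, except that the NWS API requires two requests: one to `/points/{latitude},{longitude}` to look up the forecast URL for that grid point, and a second to that URL for the forecast periods themselves.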
5. Server Execution

```python
if __name__ == "__main__":
    mcp.run(transport="stdio")
```
- Runs the MCP server using stdio transport
- Enables communication with the chat client
Usage
- Create a `.env` file with your OpenAI API key:
OPENAI_API_KEY=your_api_key_here
- Start the MCP server:
python weather.py
- In a separate terminal, run the chat client:
python client.py weather.py
- Interact with the chat interface:
- Ask general questions to chat with the AI
- Use weather-related queries to get weather information
- Example: "What's the weather in California?" or "Are there any alerts in New York?"
Using with Cursor's Agent Mode
This MCP server can be integrated directly with Cursor's Agent mode (Note: This is different from Cursor's Ask feature and only works in Agent mode). Here's how to set it up:
Adding the MCP Server to Cursor
- Open Cursor Settings
- Navigate to Features > MCP
- Click + Add New MCP Server
- Fill out the form:
  - Type: Select stdio
  - Name: "Weather Service" (or any name you prefer)
  - Command: Enter the full path to run the weather server:
    python /full/path/to/your/weather.py
Alternative: Project-Specific Configuration
You can also configure the MCP server for your project by creating a .cursor/mcp.json file:
- Create the .cursor directory in your project root:
mkdir .cursor
- Create mcp.json with the following content:
```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": [
        "/full/path/to/your/weather.py"
      ]
    }
  }
}
```
Using the Weather Tools
- Open Cursor's Composer (Agent mode)
- The Agent will automatically detect when weather information is needed
- Example queries:
- "What's the current weather in San Francisco?"
- "Are there any weather alerts in California?"
- "Get me the forecast for New York City"
Important Notes
- Tools are only available in Cursor's Agent mode (Composer), not in Ask mode
- By default, Cursor will ask for approval before using MCP tools
- You may need to click the refresh button in the MCP settings to see newly added tools
- The server must be running on your local machine (remote servers require SSE transport)
Features
- Real-time chat interface with OpenAI integration
- MCP server integration for extensible functionality
- Weather service with alerts and forecasts
- Asynchronous operation for better performance
- Proper error handling and resource cleanup
- Environment variable configuration for API keys
Contributing
Feel free to submit issues and enhancement requests!