Remote MCP Server
LLM Integration Guide
The Tomba Remote MCP Server integrates with leading Large Language Models (LLMs) and AI assistants through the standardized Model Context Protocol (MCP). This guide covers setup, configuration, and best practices for each supported platform.
Supported LLM Platforms
AI Assistants
| Platform | Support Level | Transport | Authentication | Real-time |
|---|---|---|---|---|
| Claude Desktop | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| ChatGPT | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| GitHub Copilot | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| Cursor AI | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| Codeium | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
Enterprise LLMs
| Platform | Support Level | Transport | Authentication | Real-time |
|---|---|---|---|---|
| Azure OpenAI | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| Google Gemini | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
| Anthropic API | ✅ Full | HTTP/SSE | API Key | ✅ Yes |
Quick Start Integration
Claude Desktop Integration
1. Configuration File
Create or update your Claude Desktop configuration:
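The entry below is a minimal sketch of a `claude_desktop_config.json` entry, assuming the server is reached through the `mcp-remote` stdio bridge (an npm package that proxies Claude Desktop's stdio transport to a remote HTTP/SSE endpoint) and that credentials are sent as request headers. The server URL, the `--header` flags, and the header names are assumptions, not confirmed values; take the real server address and credential header names from your Tomba dashboard. On macOS the file lives at `~/Library/Application Support/Claude/claude_desktop_config.json`, on Windows at `%APPDATA%\Claude\claude_desktop_config.json`.

```json
{
  "mcpServers": {
    "tomba": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.tomba.io/mcp",
        "--header",
        "X-Tomba-Key: ta_xxxxxxxxxx",
        "--header",
        "X-Tomba-Secret: ts_xxxxxxxxxx"
      ]
    }
  }
}
```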
2. Verify Connection
Start Claude Desktop and verify the integration:
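One way to check, assuming the server was registered under the name `tomba` as in the sketch above: after restarting Claude Desktop, open a new conversation, confirm that the Tomba tools appear in the tools list, and try a prompt that exercises one of them, for example:

```text
Using the Tomba tools, find the email addresses associated with example.com.
```

If the tools do not show up, recheck the configuration file path and its JSON syntax.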
ChatGPT Integration
1. OpenAI Responses API
Example of using our MCP server with OpenAI's Responses API:
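The snippet below is a minimal sketch using the OpenAI Python SDK's Responses API with its remote MCP tool type. The model name, server URL, header names, and example prompt are placeholders rather than values confirmed by this guide; substitute the server address and credential header names from your Tomba dashboard.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.responses.create(
    model="gpt-4.1",
    tools=[
        {
            "type": "mcp",
            "server_label": "tomba",
            "server_url": "https://mcp.tomba.io/mcp",  # assumed URL
            "headers": {
                "X-Tomba-Key": "ta_xxxxxxxxxx",        # assumed header names
                "X-Tomba-Secret": "ts_xxxxxxxxxx",
            },
            "require_approval": "never",
        }
    ],
    input="Find the most likely email address for the domain example.com.",
)

print(response.output_text)
```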
Common Issues
Connection Refused
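A refused connection usually means the client cannot reach the server endpoint at all: double-check the server URL in your configuration, make sure no proxy or firewall blocks outbound HTTPS, and confirm the endpoint answers. The check below is a small sketch that assumes the same placeholder URL used earlier; replace it with your actual server address.

```python
import urllib.error
import urllib.request

url = "https://mcp.tomba.io/mcp"  # assumption: replace with your server URL

try:
    with urllib.request.urlopen(urllib.request.Request(url, method="GET"), timeout=10) as resp:
        print("Server reachable, HTTP status:", resp.status)
except urllib.error.HTTPError as exc:
    # An HTTP error response still means the host accepted the connection.
    print("Server reachable, HTTP status:", exc.code)
except urllib.error.URLError as exc:
    # Connection refused, DNS failure, or timeout.
    print("Connection failed:", exc.reason)
```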
Authentication Errors
- Verify the API key format: ta_xxxxxxxxxx
- Verify the secret key format: ts_xxxxxxxxxx (a quick prefix check is sketched below)
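If requests are rejected with an authentication error, a quick sanity check on the credential prefixes can rule out swapped or truncated keys. This sketch only checks the documented ta_ / ts_ prefixes, not whether the keys are actually valid.

```python
def credentials_look_sane(api_key: str, secret_key: str) -> bool:
    """Check the documented Tomba prefixes: API keys start with ta_, secrets with ts_."""
    return api_key.startswith("ta_") and secret_key.startswith("ts_")

# Example: flag swapped or truncated credentials before making any API calls.
assert credentials_look_sane("ta_xxxxxxxxxx", "ts_xxxxxxxxxx")
```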
Rate Limit Exceeded
- Check current usage in dashboard
- Implement exponential backoff (see the sketch after this list)
- Consider upgrading account plan
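Exponential backoff can be as simple as the following sketch; the RateLimitError class here is a placeholder for whatever 429 / rate-limit error your HTTP client raises.

```python
import random
import time

class RateLimitError(Exception):
    """Placeholder: substitute the 429 / rate-limit error raised by your HTTP client."""

def with_backoff(call, max_retries=5):
    """Retry `call` on rate-limit errors, doubling the wait each attempt and adding jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            time.sleep((2 ** attempt) + random.random())
    return call()  # final attempt: let any remaining error propagate
```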