
LLM Integration Guide

This guide shows how to integrate the Tomba MCP Server with popular Large Language Models and AI applications. The server works with any MCP-compatible client through standardized JSON-RPC communication.
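On the wire, each request is a single JSON-RPC 2.0 object written to the server's stdin, one message per line. A minimal sketch of that envelope (the `buildToolCall` helper is hypothetical; the `domain_search` tool and its arguments are the same ones used throughout this guide):

```javascript
// Sketch of the JSON-RPC 2.0 envelope used over stdio.
// Each message is serialized to one line of JSON.
function buildToolCall(id, name, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

const line = buildToolCall(1, "domain_search", { domain: "github.com", limit: 10 });
```

The server replies with a matching JSON-RPC response object (same `id`) on stdout, which is what every client shown below ultimately parses.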

Supported LLM Applications

Primary Support

  • Claude Desktop - Native MCP support
  • Claude API - Via MCP SDK integration
  • Custom LLM Applications - Using MCP SDK
  • Anthropic Workbench - Direct integration

Community Integrations

  • OpenAI GPT - Via MCP bridge implementations
  • Local LLMs - Through MCP-compatible frameworks
  • AI Agents - LangChain, CrewAI, AutoGPT integrations

Integration Architecture

```mermaid
graph TB
    A[LLM Application] --> B[MCP Client SDK]
    B --> C[JSON-RPC over stdio]
    C --> D[Tomba MCP Server]
    D --> E[Tomba.io API]

    subgraph "Your Application"
        A
        B
    end

    subgraph "MCP Server"
        D
        F[Tool Registry]
        G[Type Validation]
        H[Error Handling]
    end

    subgraph "External API"
        E
    end
```

Client Configuration Files

Claude Desktop Configuration

File Location:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Configuration:

```json
{
  "mcpServers": {
    "tomba": {
      "command": "npx",
      "args": ["-y", "tomba-mcp-server"],
      "env": {
        "TOMBA_API_KEY": "your_api_key_here",
        "TOMBA_SECRET_KEY": "your_secret_key_here"
      }
    }
  }
}
```

Claude API Integration

Installation:

```bash
npm install @modelcontextprotocol/sdk
```

Client Code:

```javascript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client(
  {
    name: "tomba-client",
    version: "1.0.0",
  },
  {
    capabilities: {
      tools: {},
    },
  },
);

const transport = new StdioClientTransport({
  command: "node",
  args: ["./dist/index.js"],
  env: {
    TOMBA_API_KEY: process.env.TOMBA_API_KEY,
    TOMBA_SECRET_KEY: process.env.TOMBA_SECRET_KEY,
  },
});

await client.connect(transport);

// Use the tools
const result = await client.callTool({
  name: "domain_search",
  arguments: {
    domain: "github.com",
    limit: 10,
  },
});
```

Configuration Templates

Basic Configuration

Create mcp-config.json:

```json
{
  "servers": {
    "tomba": {
      "command": "npx",
      "args": ["-y", "tomba-mcp-server"],
      "environment": {
        "TOMBA_API_KEY": "${TOMBA_API_KEY}",
        "TOMBA_SECRET_KEY": "${TOMBA_SECRET_KEY}"
      }
    }
  }
}
```
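
The `${TOMBA_API_KEY}` placeholders are expected to be expanded from the environment by whatever loads this config. If your client does not do that for you, a substitution pass is easy to sketch (the `expandEnv` helper is hypothetical, not part of any SDK):

```javascript
// Hypothetical helper: recursively expand ${VAR} placeholders in a
// parsed config object, using process.env (or any lookup table).
function expandEnv(value, env = process.env) {
  if (typeof value === "string") {
    return value.replace(/\$\{([A-Z0-9_]+)\}/g, (_, name) => env[name] ?? "");
  }
  if (Array.isArray(value)) return value.map((v) => expandEnv(v, env));
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, expandEnv(v, env)]),
    );
  }
  return value;
}

const config = { environment: { TOMBA_API_KEY: "${TOMBA_API_KEY}" } };
const expanded = expandEnv(config, { TOMBA_API_KEY: "ta_123" });
```

Missing variables expand to an empty string here; failing loudly instead is a reasonable alternative for production configs.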

Development Configuration

Create mcp-dev-config.json:

```json
{
  "servers": {
    "tomba-dev": {
      "command": "npm",
      "args": ["run", "dev"],
      "cwd": "./tomba-mcp-server",
      "environment": {
        "TOMBA_API_KEY": "test_key_123",
        "TOMBA_SECRET_KEY": "test_secret_456",
        "NODE_ENV": "development",
        "DEBUG": "tomba:*"
      }
    }
  }
}
```

Production Configuration

Create mcp-prod-config.json:

```json
{
  "servers": {
    "tomba-prod": {
      "command": "node",
      "args": ["./tomba-mcp-server/dist/index.js"],
      "environment": {
        "TOMBA_API_KEY": "${TOMBA_API_KEY}",
        "TOMBA_SECRET_KEY": "${TOMBA_SECRET_KEY}",
        "NODE_ENV": "production"
      },
      "restart": true,
      "timeout": 30000
    }
  }
}
```

Usage Examples by LLM

Claude Desktop Usage

Once configured, you can use natural language:

Email Discovery:

```text
"Find all email addresses for github.com and show me the top 10 results"
```

Contact Verification:

```text
"Verify if the email satya.nadella@microsoft.com is deliverable and show me the confidence score"
```

Lead Generation:

```text
"Search for the email of John Doe at Apple Inc and enrich it with social profiles"
```

Claude API Integration

```python
import json
import subprocess
from typing import Any, Dict

class TombaMCPClient:
    def __init__(self, api_key: str, secret_key: str):
        self.env = {
            'TOMBA_API_KEY': api_key,
            'TOMBA_SECRET_KEY': secret_key
        }

    def call_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Dict:
        request = {
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/call",
            "params": {
                "name": tool_name,
                "arguments": arguments
            }
        }
        process = subprocess.Popen(
            ['node', './dist/index.js'],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            env=self.env,
            text=True
        )
        stdout, _ = process.communicate(json.dumps(request))
        return json.loads(stdout)

    def find_domain_emails(self, domain: str, limit: int = 10):
        return self.call_tool('domain_search', {
            'domain': domain,
            'limit': limit
        })

    def verify_email(self, email: str):
        return self.call_tool('email_verifier', {'email': email})

# Usage
client = TombaMCPClient('your_api_key', 'your_secret_key')
emails = client.find_domain_emails('github.com', 5)
print(json.dumps(emails, indent=2))
```

Custom LLM Integration

```javascript
// For custom LLM applications
class TombaIntegration {
  constructor(mcpServerPath, apiKey, secretKey) {
    this.serverPath = mcpServerPath;
    this.env = {
      TOMBA_API_KEY: apiKey,
      TOMBA_SECRET_KEY: secretKey,
    };
  }

  async initializeConnection() {
    const { spawn } = require("child_process");
    this.serverProcess = spawn("node", [this.serverPath], {
      env: this.env,
      stdio: ["pipe", "pipe", "pipe"],
    });

    // Initialize MCP protocol
    await this.sendRequest({
      jsonrpc: "2.0",
      id: 1,
      method: "initialize",
      params: {
        protocolVersion: "2024-11-05",
        capabilities: {},
        clientInfo: {
          name: "custom-llm-client",
          version: "1.0.0",
        },
      },
    });
  }

  async sendRequest(request) {
    return new Promise((resolve) => {
      this.serverProcess.stdin.write(JSON.stringify(request) + "\n");
      this.serverProcess.stdout.once("data", (data) => {
        resolve(JSON.parse(data.toString()));
      });
    });
  }

  async searchDomain(domain, limit = 10) {
    return this.sendRequest({
      jsonrpc: "2.0",
      id: Date.now(),
      method: "tools/call",
      params: {
        name: "domain_search",
        arguments: { domain, limit },
      },
    });
  }
}

// Usage with your LLM
const tomba = new TombaIntegration("./dist/index.js", "api_key", "secret_key");
await tomba.initializeConnection();

// In your LLM tool handling code
if (toolCall.name === "search_emails") {
  const result = await tomba.searchDomain(toolCall.arguments.domain);
  return result.content[0].text;
}
```

Framework Integrations

LangChain Integration

```python
from langchain.tools import BaseTool
from langchain.pydantic_v1 import BaseModel, Field
import subprocess
import json

class DomainSearchInput(BaseModel):
    domain: str = Field(description="Domain name to search")
    limit: int = Field(default=10, description="Maximum results")

class TombaDomainSearchTool(BaseTool):
    name = "domain_search"
    description = "Search for email addresses in a domain"
    args_schema = DomainSearchInput

    def _run(self, domain: str, limit: int = 10) -> str:
        request = {
            "jsonrpc": "2.0",
            "id": 1,
            "method": "tools/call",
            "params": {
                "name": "domain_search",
                "arguments": {"domain": domain, "limit": limit}
            }
        }
        process = subprocess.Popen(
            ['node', './tomba-mcp-server/dist/index.js'],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            env={'TOMBA_API_KEY': 'your_key', 'TOMBA_SECRET_KEY': 'your_secret'},
            text=True
        )
        stdout, _ = process.communicate(json.dumps(request))
        result = json.loads(stdout)
        return result['content'][0]['text']

# Usage in LangChain
from langchain.agents import initialize_agent, AgentType
from langchain.llms import OpenAI

llm = OpenAI()
tools = [TombaDomainSearchTool()]
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION
)

result = agent.run("Find email addresses for github.com")
```

CrewAI Integration

```python
from crewai import Agent, Task, Crew
from crewai_tools import tool
import subprocess
import json

@tool("Domain Email Search")
def search_domain_emails(domain: str) -> str:
    """Search for email addresses in a domain using Tomba.io"""
    request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "domain_search",
            "arguments": {"domain": domain, "limit": 10}
        }
    }
    process = subprocess.run(
        ['node', './tomba-mcp-server/dist/index.js'],
        input=json.dumps(request),
        capture_output=True,
        text=True,
        env={'TOMBA_API_KEY': 'your_key', 'TOMBA_SECRET_KEY': 'your_secret'}
    )
    result = json.loads(process.stdout)
    return result['content'][0]['text']

# Create agents
researcher = Agent(
    role='Email Researcher',
    goal='Find contact information for target companies',
    tools=[search_domain_emails],
    backstory="Expert at finding business contacts"
)

# Create tasks
research_task = Task(
    description="Find email addresses for github.com and microsoft.com",
    agent=researcher
)

# Create crew
crew = Crew(
    agents=[researcher],
    tasks=[research_task]
)

result = crew.kickoff()
```

Template Files

Complete Claude Desktop Config

Save as claude_desktop_config.json:

```json
{
  "mcpServers": {
    "tomba": {
      "command": "node",
      "args": ["/absolute/path/to/tomba-mcp-server/dist/index.js"],
      "env": {
        "TOMBA_API_KEY": "ta_xxxxxxxxxx",
        "TOMBA_SECRET_KEY": "ts_xxxxxxxxxx"
      }
    }
  },
  "globalShortcuts": {
    "toggleClaudeDesktop": "Cmd+Shift+C"
  },
  "appearance": {
    "theme": "auto"
  }
}
```

Docker Compose Configuration

Save as docker-compose.yml:

```yaml
version: "3.8"

services:
  tomba-mcp-server:
    build: .
    environment:
      - TOMBA_API_KEY=${TOMBA_API_KEY}
      - TOMBA_SECRET_KEY=${TOMBA_SECRET_KEY}
    volumes:
      - ./dist:/app/dist:ro
    stdin_open: true
    tty: true
```

Systemd Service File

Save as /etc/systemd/system/tomba-mcp.service:

```ini
[Unit]
Description=Tomba MCP Server
After=network.target

[Service]
Type=simple
User=mcpuser
WorkingDirectory=/opt/tomba-mcp-server
ExecStart=/usr/bin/node dist/index.js
Environment=TOMBA_API_KEY=your_api_key
Environment=TOMBA_SECRET_KEY=your_secret_key
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Security Configuration

Environment Variables

Secure .env setup:

```bash
# Create secure .env file
umask 077
echo "TOMBA_API_KEY=your_key" > .env
echo "TOMBA_SECRET_KEY=your_secret" >> .env
chmod 600 .env
```
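
If you load that file yourself rather than through a library like dotenv, a minimal parser is only a few lines. A sketch, assuming plain `KEY=value` lines with no quoting or multi-line values (the `parseDotEnv` helper is hypothetical):

```javascript
// Minimal .env parser: one KEY=value pair per line, '#' starts a comment.
// Assumes no quoting, escaping, or multi-line values.
function parseDotEnv(text) {
  const out = {};
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith("#")) continue;
    const eq = trimmed.indexOf("=");
    if (eq === -1) continue;
    out[trimmed.slice(0, eq)] = trimmed.slice(eq + 1);
  }
  return out;
}

// e.g. merge into the environment at startup:
// Object.assign(process.env, parseDotEnv(fs.readFileSync(".env", "utf8")));
```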

API Key Management

Using environment variables:

```bash
export TOMBA_API_KEY="ta_your_api_key"
export TOMBA_SECRET_KEY="ts_your_secret_key"
```

Using keychain (macOS):

```bash
security add-generic-password -a "tomba-mcp" -s "api-key" -w "your_api_key"
security add-generic-password -a "tomba-mcp" -s "secret-key" -w "your_secret"
```

Reading from keychain:

```bash
export TOMBA_API_KEY=$(security find-generic-password -a "tomba-mcp" -s "api-key" -w)
export TOMBA_SECRET_KEY=$(security find-generic-password -a "tomba-mcp" -s "secret-key" -w)
```

Monitoring & Logging

Log Configuration

```json
{
  "logging": {
    "level": "info",
    "format": "json",
    "outputs": [
      {
        "type": "file",
        "path": "/var/log/tomba-mcp.log",
        "rotation": "daily"
      },
      {
        "type": "console",
        "format": "pretty"
      }
    ]
  }
}
```

Health Check Endpoint

```javascript
// Add to your client application
async function healthCheck() {
  try {
    const result = await client.listTools();
    return result.tools.length > 0;
  } catch (error) {
    console.error("Health check failed:", error);
    return false;
  }
}
```

Error Handling

Retry Logic

```javascript
// Note: `arguments` is a reserved identifier in strict-mode JavaScript,
// so the parameter is named `args` here.
async function callToolWithRetry(toolName, args, maxRetries = 3) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await client.callTool({ name: toolName, arguments: args });
    } catch (error) {
      if (attempt === maxRetries) throw error;
      const delay = Math.pow(2, attempt) * 1000; // Exponential backoff
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Rate Limit Handling

```javascript
async function handleRateLimit(error) {
  if (error.code === 429) {
    const retryAfter = error.headers?.["retry-after"] || 60;
    console.log(`Rate limited. Waiting ${retryAfter} seconds...`);
    await new Promise((resolve) => setTimeout(resolve, retryAfter * 1000));
    return true; // Indicate retry should be attempted
  }
  return false;
}
```

Testing LLM Integration

Integration Test Script

```javascript
// test-llm-integration.js
const { spawn } = require("child_process");

async function testLLMIntegration() {
  console.log("🧪 Testing LLM Integration...");

  // Start MCP server
  const serverProcess = spawn("node", ["dist/index.js"], {
    env: {
      TOMBA_API_KEY: process.env.TOMBA_API_KEY,
      TOMBA_SECRET_KEY: process.env.TOMBA_SECRET_KEY,
    },
    stdio: ["pipe", "pipe", "pipe"],
  });

  // Test tool listing
  const listRequest = {
    jsonrpc: "2.0",
    id: 1,
    method: "tools/list",
  };

  serverProcess.stdin.write(JSON.stringify(listRequest) + "\n");

  return new Promise((resolve) => {
    serverProcess.stdout.on("data", (data) => {
      const response = JSON.parse(data.toString());
      console.log("Tools available:", response.result?.tools?.length);
      serverProcess.kill();
      resolve(response);
    });
  });
}

testLLMIntegration();
```

Deployment

Production Deployment

  1. Build the server:

     ```bash
     npm run build
     ```

  2. Configure environment:

     ```bash
     export TOMBA_API_KEY="production_key"
     export TOMBA_SECRET_KEY="production_secret"
     export NODE_ENV="production"
     ```

  3. Start with process manager:

     ```bash
     pm2 start dist/index.js --name tomba-mcp-server
     ```

Scaling Considerations

  • Multiple Instances: Run multiple server instances for load distribution
  • Load Balancing: Use nginx or similar for request distribution
  • Monitoring: Implement health checks and performance monitoring
  • Logging: Centralized logging for debugging and analysis
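
Because each MCP server instance is an independent stdio process, one simple way for a single application to spread load across several of them is round-robin over a small pool of connected clients. A rough sketch (pool size and selection policy are assumptions, not features of the server; strings stand in for connected MCP clients):

```javascript
// Sketch: round-robin over a pool of already-connected MCP clients.
class RoundRobinPool {
  constructor(clients) {
    this.clients = clients;
    this.next = 0;
  }

  // Pick the next client in rotation.
  acquire() {
    const client = this.clients[this.next];
    this.next = (this.next + 1) % this.clients.length;
    return client;
  }
}

const pool = new RoundRobinPool(["instance-a", "instance-b", "instance-c"]);
// e.g. const result = await pool.acquire().callTool({ ... });
```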

Your Tomba MCP Server is now ready for LLM integration! Choose the configuration that matches your setup and start leveraging powerful email intelligence in your AI applications.
