Client Integration

Connect the SkyWatch MCP server to your preferred AI assistant. All clients use the same endpoint — no API key or authentication setup required.

MCP Endpoint: https://api.skywatch.co/mcp


Claude Desktop

Claude Desktop supports MCP servers natively. The configuration below uses the mcp-remote package (run via npx, which requires Node.js) to bridge Claude Desktop to the remote HTTP endpoint.

Configuration file locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "skywatch": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://api.skywatch.co/mcp"]
    }
  }
}

Restart Claude Desktop after saving. Then try:

  • "Find satellite images of Tokyo from last month"
  • "How much would imagery of Manhattan cost?"
  • "What satellites have sub-meter resolution?"

Claude Code (CLI)

# Add SkyWatch MCP server
claude mcp add skywatch --transport http https://api.skywatch.co/mcp

# Verify it's configured
claude mcp list

# Remove if needed
claude mcp remove skywatch

Manual configuration — edit ~/.claude/settings.json:

{
  "mcpServers": {
    "skywatch": {
      "type": "http",
      "url": "https://api.skywatch.co/mcp"
    }
  }
}

Cursor

Via Settings UI:

  1. Open Settings (Cmd/Ctrl + ,)
  2. Navigate to Features > MCP Servers
  3. Click Add Server:
    • Name: skywatch
    • Command: npx
    • Args: -y mcp-remote https://api.skywatch.co/mcp

Manual configuration — edit ~/.cursor/mcp.json:

{
  "mcpServers": {
    "skywatch": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://api.skywatch.co/mcp"]
    }
  }
}

Restart Cursor after configuration.


ChatGPT

Requires a paid ChatGPT account (Plus, Pro, Business, Enterprise, or Education).

  1. Go to Settings > Apps
  2. Click Create App
  3. Under Actions, add a Remote MCP Server
  4. Enter the server URL: https://api.skywatch.co/mcp
  5. Save the app

To use:

  1. Start a new chat
  2. Click the tools icon in the message composer
  3. Enable the SkyWatch app
  4. Ask naturally: "Search for satellite imagery of San Francisco"

Gemini CLI

# Add SkyWatch MCP server
gemini mcp add skywatch --transport http https://api.skywatch.co/mcp

# Verify configuration
gemini mcp list

# Remove if needed
gemini mcp remove skywatch

Local LLMs (Ollama, LM Studio)

For locally hosted LLMs with function-calling support, forward the model's tool calls to the MCP endpoint yourself. The example below uses the Ollama Python client:

import json

import ollama
import requests

MCP_ENDPOINT = "https://api.skywatch.co/mcp"

def call_skywatch(tool_name, arguments):
    """Forward a tool call to the SkyWatch MCP endpoint via JSON-RPC 2.0."""
    response = requests.post(MCP_ENDPOINT, json={
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })
    return response.json()

# Describe the MCP tool to the local model in the function-calling schema.
tools = [{
    "type": "function",
    "function": {
        "name": "search_archive_imagery",
        "description": "Search satellite imagery archive",
        "parameters": {
            "type": "object",
            "properties": {
                "location_query": {"type": "string"},
                "start_date": {"type": "string"},
                "end_date": {"type": "string"},
                "limit": {"type": "integer"}
            },
            "required": ["location_query"]
        }
    }
}]

response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Find satellite images of Paris"}],
    tools=tools,
)

# Execute any tool calls the model requested and print the results.
if response.message.tool_calls:
    for tc in response.message.tool_calls:
        args = tc.function.arguments
        if isinstance(args, str):  # some models return JSON-encoded strings
            args = json.loads(args)
        result = call_skywatch(tc.function.name, args)
        print(result)
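The JSON-RPC envelope returned by call_skywatch wraps the actual tool output. A small helper to unwrap it might look like this — a sketch assuming the standard MCP result shape, where result.content is a list of typed items; verify against actual server responses:

```python
def unwrap_result(rpc_response):
    """Extract text content from a JSON-RPC tool-call response.

    Assumes the standard MCP shape:
    {"result": {"content": [{"type": "text", "text": ...}, ...]}}.
    Raises RuntimeError on a JSON-RPC error response.
    """
    if "error" in rpc_response:
        raise RuntimeError(rpc_response["error"].get("message", "tool call failed"))
    content = rpc_response.get("result", {}).get("content", [])
    # Keep only text items; other types (e.g. images) need their own handling.
    return [item["text"] for item in content if item.get("type") == "text"]
```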

Direct HTTP

For any client or custom integration, send JSON-RPC 2.0 requests to the MCP endpoint:

curl -X POST https://api.skywatch.co/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list"
  }'
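
The same request pattern works for invoking a tool. As a sketch in Python with the requests library, calling the search_archive_imagery tool (shown in the local-LLM section above) could look like this — the helper names here are illustrative, and the argument schema follows that section:

```python
import requests

MCP_ENDPOINT = "https://api.skywatch.co/mcp"

def jsonrpc_request(method, params=None, request_id=1):
    """Build a JSON-RPC 2.0 request body."""
    body = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        body["params"] = params
    return body

def search_imagery(location_query, **kwargs):
    """Invoke the search_archive_imagery tool over plain HTTP."""
    payload = jsonrpc_request("tools/call", {
        "name": "search_archive_imagery",
        "arguments": {"location_query": location_query, **kwargs},
    })
    resp = requests.post(MCP_ENDPOINT, json=payload)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(search_imagery("Paris", limit=5))
```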

See the MCP Server page for complete request examples.