MCP Server Setup¶
Drydock ships as an MCP (Model Context Protocol) server. This lets AI coding assistants call the analyzers directly as tool calls.
Install¶
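The install command is missing from this section of the page. A minimal sketch, assuming the repo has already been cloned and follows a standard Python package layout (the editable-install step is an assumption, not a documented command):

```bash
# From the directory where you cloned the repo
cd /path/to/Drydock

# Assumed: standard Python package install; replace with the
# project's documented install command if it differs
pip install -e .
```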
Configure Your Assistant¶
Add to ~/.claude/settings.json:
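The configuration snippet is missing here. A sketch of the usual `mcpServers` entry shape, assuming the server is launched as a Python script at `mcp_server.py` (the script name and launch command are assumptions):

```json
{
  "mcpServers": {
    "drydock": {
      "command": "python",
      "args": ["/path/to/Drydock/mcp_server.py"]
    }
  }
}
```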
Add to .vscode/mcp.json in your workspace:
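The workspace snippet is also missing. A sketch using VS Code's `mcp.json` shape, with the same assumed launch command as above (`mcp_server.py` is an assumption):

```json
{
  "servers": {
    "drydock": {
      "type": "stdio",
      "command": "python",
      "args": ["/path/to/Drydock/mcp_server.py"]
    }
  }
}
```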
Tip
Replace /path/to/Drydock with the actual path where you cloned the repo.
Available Tools¶
Once configured, your AI assistant can call these tools:
| Tool | Ask your assistant... |
|---|---|
| `drydock_context` | "Give me a full analysis of this project" |
| `drydock_codemap` | "Map the files and functions in this codebase" |
| `drydock_boundaries` | "Can I safely extract this component?" |
| `drydock_dependencies` | "What does this project depend on?" |
| `drydock_interfaces` | "What does each module expose publicly?" |
| `drydock_structure` | "How big is this codebase?" |
| `drydock_platforms` | "What tech stack is this project using?" |
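Under the hood, each of these is an ordinary MCP `tools/call` request over JSON-RPC; a sketch of what the assistant sends (the `path` argument name is an assumption about Drydock's tool schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "drydock_boundaries",
    "arguments": { "path": "/home/me/my-app" }
  }
}
```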
Example¶
After setup, just ask naturally:
"Analyze the boundaries of the project at /home/me/my-app"
Your assistant calls `drydock_boundaries` and gets back:
```json
{
  "summary": {
    "total_source_files": 42,
    "clusters_found": 3,
    "bridge_files": 2,
    "orphan_files": 5
  },
  "extraction_suggestions": [
    {
      "name": "auth",
      "files": 8,
      "extraction_risk": "low",
      "reason": "No external dependencies - clean extraction"
    },
    {
      "name": "api",
      "files": 15,
      "extraction_risk": "medium",
      "reason": "3 external deps, 2 external dependents"
    }
  ]
}
```
The assistant can then reason about extraction decisions using real data.