
AI Tool Integrations

Takumo sits between your code and AI assistants. Same protection, any tool.

How it works

1. Your code: you have code containing secrets (API keys, passwords, connection strings).
2. Tokenize: Takumo detects the secrets and replaces them with tokens.
3. AI processes: the AI tool sees only tokens, never your real credentials.
4. Rehydrate: Takumo restores the tokens to the real secrets in the response.
5. Final code: you get working code with your actual credentials restored.
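The round trip above can be sketched in a few lines. This is a conceptual illustration, not Takumo's implementation: the secret pattern and the `TKM_n` token format are assumptions made for the example.

```python
import re

# Illustrative detector -- a real product would ship many provider-specific
# patterns plus entropy-based detection.
SECRET_RE = re.compile(r"sk-[A-Za-z0-9]{8,}")

def tokenize(text: str):
    """Step 2: swap each detected secret for a placeholder token."""
    vault = {}
    def swap(match):
        token = f"TKM_{len(vault)}"
        vault[token] = match.group(0)   # remember the real value locally
        return token
    return SECRET_RE.sub(swap, text), vault

def rehydrate(text: str, vault: dict) -> str:
    """Step 4: restore the real secrets in the AI tool's response."""
    for token, secret in vault.items():
        text = text.replace(token, secret)
    return text
```

The vault never leaves your machine; only the tokenized text (step 3) is sent to the AI tool, and `rehydrate` runs locally on whatever comes back.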

Supported tools

Tool                       Integration               Status
Claude Code                CLI wrapper, MCP server   Available
Cursor                     Extension, proxy mode     Available
Windsurf                   Extension                 Available
VS Code + Copilot          Extension                 Coming soon
JetBrains + AI Assistant   Plugin                    Coming soon

Integration patterns

CLI wrapper

Wrap your AI tool’s CLI to automatically tokenize input and rehydrate output:
# Instead of:
claude "Add retry logic to config.ts"

# Use:
takumo-aegis shield config.ts --prompt "Add retry logic"
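A wrapper of this kind boils down to: tokenize the file, run the wrapped CLI on the sanitized text, rehydrate its output. The sketch below shows the pattern, not the `takumo-aegis` implementation; the secret pattern, token format, and stand-in command are illustrative assumptions.

```python
import re
import subprocess
import sys

# Illustrative detector and token format (assumptions for this sketch).
SECRET_RE = re.compile(r"sk-[A-Za-z0-9]{8,}")

def shield(source: str, tool_cmd: list) -> str:
    """Run tool_cmd on a tokenized copy of source, then restore the secrets."""
    vault = {}
    def swap(match):
        token = f"TKM_{len(vault)}"
        vault[token] = match.group(0)
        return token
    tokenized = SECRET_RE.sub(swap, source)        # the CLI never sees secrets
    proc = subprocess.run(tool_cmd, input=tokenized,
                          capture_output=True, text=True, check=True)
    output = proc.stdout
    for token, secret in vault.items():            # restore secrets in the result
        output = output.replace(token, secret)
    return output

# Stand-in "AI tool" that uppercases its stdin; a real wrapper would exec the
# assistant's CLI (e.g. claude) instead.
upper = [sys.executable, "-c",
         "import sys; sys.stdout.write(sys.stdin.read().upper())"]
```

Because the tool only ever receives the tokenized text, even a fully logged or cached prompt upstream contains no usable credentials.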

MCP server

For tools that support the Model Context Protocol, Takumo runs as an MCP server that intercepts file reads:
{
  "mcpServers": {
    "takumo": {
      "command": "takumo-aegis",
      "args": ["mcp-server"]
    }
  }
}
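Conceptually, the server sits on the message stream and tokenizes any file content in a result before the model sees it. The sketch below uses a simplified message shape, not the actual MCP schema or Takumo's implementation; the secret pattern and token format are likewise assumptions.

```python
import json
import re

# Illustrative detector and token format (assumptions for this sketch).
SECRET_RE = re.compile(r"sk-[A-Za-z0-9]{8,}")

def intercept(line: str, vault: dict) -> str:
    """Tokenize file content in one JSON-RPC result message (simplified shape)."""
    msg = json.loads(line)
    text = msg.get("result", {}).get("content")
    if text:
        def swap(match):
            token = f"TKM_{len(vault)}"
            vault[token] = match.group(0)   # kept locally for later rehydration
            return token
        msg["result"]["content"] = SECRET_RE.sub(swap, text)
    return json.dumps(msg)
```

Messages without file content pass through unchanged; the vault stays on your machine for rehydrating any tokens that come back in the model's output.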

Proxy mode

Route API calls through Takumo’s local proxy to tokenize requests and rehydrate responses:
takumo-aegis proxy --port 8080
Then point your AI tool at http://localhost:8080 instead of the provider’s API.
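The proxy's job is the same round trip applied to HTTP traffic: tokenize the outbound request body, forward it to the provider, rehydrate the response. A minimal sketch of that pattern, not Takumo's implementation, might look like this; the secret pattern and token format are assumptions, and a real proxy would also handle streaming responses, headers, and TLS.

```python
import re
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative detector and token format (assumptions for this sketch).
SECRET_RE = re.compile(r"sk-[A-Za-z0-9]{8,}")

class TokenizingProxy(BaseHTTPRequestHandler):
    upstream = "http://localhost:9999"   # provider API base URL; set before serving
    vault = {}                           # token -> real secret, kept locally

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0))).decode()

        # Tokenize the outbound request so the provider never sees secrets.
        def swap(match):
            token = f"TKM_{len(TokenizingProxy.vault)}"
            TokenizingProxy.vault[token] = match.group(0)
            return token
        sanitized = SECRET_RE.sub(swap, body)

        # Forward the sanitized request to the real provider.
        req = urllib.request.Request(self.upstream + self.path, sanitized.encode())
        with urllib.request.urlopen(req) as resp:
            text = resp.read().decode()

        # Rehydrate tokens in the response before handing it back to the tool.
        for token, secret in TokenizingProxy.vault.items():
            text = text.replace(token, secret)
        data = text.encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the demo quiet
```

Serving this handler on port 8080 and pointing the AI tool at it gives the tool rehydrated responses while the provider only ever receives tokens.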

Choosing an integration

Use case               Recommended
Quick, one-off tasks   CLI wrapper
Full IDE experience    Extension/plugin
Custom workflows       API + MCP server
CI/CD pipelines        Proxy mode
Each tool page has specific setup instructions and configuration options.