Recommended: Clone the Repo
Cloning the entire Voltaire repository locally is the recommended approach: it gives your AI coding assistant maximum context. With the full source on disk, assistants can read actual implementations, understand the library's patterns, and produce more accurate code.
git clone https://github.com/evmts/voltaire.git
Once cloned, point your AI assistant to the repo when working on Ethereum projects. The codebase includes comprehensive inline documentation, tests, and examples.
llms.txt
Voltaire serves machine-readable documentation at the standard llms.txt endpoints. AI assistants can fetch these files to understand Voltaire's API without needing the full repo.
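For example, an assistant (or a script feeding one) can pull the index with a plain HTTP request. This is a minimal sketch; it assumes the conventional /llms.txt path is served from voltaire.tevm.sh:

// Fetch the machine-readable docs index (llms.txt convention).
const response = await fetch('https://voltaire.tevm.sh/llms.txt');
if (!response.ok) throw new Error(`llms.txt request failed: ${response.status}`);
const docsIndex = await response.text();
console.log(docsIndex.slice(0, 500)); // preview the first part of the index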
MCP Server
For AI assistants that support Model Context Protocol, Voltaire provides an MCP server with searchable documentation.
Claude Code
Run:
claude mcp add --transport http voltaire https://voltaire.tevm.sh/mcp

Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
{
  "mcpServers": {
    "voltaire": {
      "command": "npx",
      "args": ["mcp-remote", "https://voltaire.tevm.sh/mcp"]
    }
  }
}

Cursor
Add to .cursor/mcp.json:
{
  "mcpServers": {
    "voltaire": {
      "command": "npx",
      "args": ["mcp-remote", "https://voltaire.tevm.sh/mcp"]
    }
  }
}

Codex
Add to ~/.codex/config.toml:
[mcp_servers.voltaire]
url = "https://voltaire.tevm.sh/mcp"

Amp
Add to ~/.config/amp/settings.json:
{
  "mcpServers": {
    "voltaire": {
      "url": "https://voltaire.tevm.sh/mcp"
    }
  }
}

OpenCode
Add to ~/.config/opencode/opencode.json:
{
  "mcp": {
    "voltaire": {
      "type": "remote",
      "url": "https://voltaire.tevm.sh/mcp"
    }
  }
}

Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
{
  "mcpServers": {
    "voltaire": {
      "command": "npx",
      "args": ["mcp-remote", "https://voltaire.tevm.sh/mcp"]
    }
  }
}
The MCP server provides a SearchVoltaire tool that searches across all documentation, returning relevant code examples and API references.
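If you want to exercise the server outside an assistant, you can call it with the MCP TypeScript SDK. This is a rough sketch, not a supported client, and the argument shape passed to SearchVoltaire (a query string) is an assumption:

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

// Connect to the Voltaire MCP endpoint over streamable HTTP.
const client = new Client({ name: 'docs-probe', version: '1.0.0' });
await client.connect(new StreamableHTTPClientTransport(new URL('https://voltaire.tevm.sh/mcp')));

// Discover the available tools, then run a documentation search.
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

const result = await client.callTool({
  name: 'SearchVoltaire',
  arguments: { query: 'encode ABI parameters' }, // argument shape is an assumption
});
console.log(result.content);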
Why Local Context Works Best
The MCP server is convenient, but cloning the repo locally provides:
- Full source code - AI can read actual implementations, not just docs
- Test examples - Real-world usage patterns in test files
- Type definitions - Complete TypeScript types for accurate suggestions
- Build context - Understanding of how modules connect
Most production teams using AI-assisted development keep frequently-used libraries cloned locally for this reason.
API Design for AI
Voltaire’s API is designed to work well with AI assistants:
Mirrors Ethereum specs - LLMs trained on Ethereum documentation can leverage that knowledge directly. No bespoke abstractions to learn.
Minimal abstraction - What you pass to a function is what happens. No hidden retry policies, automatic caching, or magic behavior that confuses debugging.
Predictable patterns - Every primitive follows the same conventions: Type() constructor, .toHex(), .equals(), etc.
// Voltaire API maps directly to Ethereum concepts
const address = Address('0x742d35Cc6634C0532925a3b844Bc9e7595f51e3e');
const hash = Keccak256(new TextEncoder().encode('hello'));
const fn = Abi.Function(transferAbi);
const encoded = fn.encodeParams([to, amount]);
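Because the same surface shows up on every primitive, comparison and serialization look identical across types. A small illustration of the convention (return values and checksum handling here are assumptions, not documented behavior):

// Construct twice, compare by value, serialize back to hex.
const a = Address('0x742d35Cc6634C0532925a3b844Bc9e7595f51e3e');
const b = Address('0x742d35Cc6634C0532925a3b844Bc9e7595f51e3e');
a.equals(b); // true: value equality, not reference identity
a.toHex();   // hex string representation of the address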
Smart LLM Detection
When AI assistants query Voltaire documentation, we detect the request and return optimized markdown instead of HTML. This reduces token usage and improves response quality.
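As an illustration of the idea only (the renderer functions and user-agent list below are invented, not Voltaire's implementation), the detection amounts to content negotiation on the incoming request:

// Placeholder renderers standing in for whatever the docs site actually uses.
declare function renderMarkdown(request: Request): Promise<string>;
declare function renderHtml(request: Request): Promise<string>;

const AI_CLIENT_PATTERN = /claude|anthropic|openai|gpt|cursor|copilot/i;

function isLlmRequest(request: Request): boolean {
  const userAgent = request.headers.get('user-agent') ?? '';
  const accept = request.headers.get('accept') ?? '';
  return AI_CLIENT_PATTERN.test(userAgent) || accept.includes('text/markdown');
}

async function handleDocsRequest(request: Request): Promise<Response> {
  if (isLlmRequest(request)) {
    // Markdown costs far fewer tokens than the rendered HTML page.
    return new Response(await renderMarkdown(request), {
      headers: { 'content-type': 'text/markdown; charset=utf-8' },
    });
  }
  return new Response(await renderHtml(request), {
    headers: { 'content-type': 'text/html; charset=utf-8' },
  });
}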