The article introduces mcp2cli, a new tool that dynamically converts any MCP server or OpenAPI spec into a CLI at runtime, with no code-generation step. Compared to exposing native MCP schemas, it reportedly cuts token usage by 96-99%. This matters to developers and sysadmins who interact with APIs through LLM agents: a compact CLI interface reduces both cognitive load and token costs, especially when many endpoints are involved.
For sysadmins running Proxmox, Docker, Linux, Nginx, or homelab environments, mcp2cli could streamline interaction with the APIs these systems expose. By minimizing token usage in API interactions, it can reduce operational costs and make managing system resources more efficient.
- Dynamic CLI generation at runtime: This feature allows immediate access to new endpoints as they are added without recompiling code, making it highly adaptable for evolving APIs.
- Zero codegen requirement: The tool eliminates the need for manual code generation or schema updates, reducing administrative overhead and speeding up development cycles.
- Token efficiency: mcp2cli reportedly reduces token usage by 96-99%, which is crucial in environments where API interaction costs are significant, such as agent workflows driven by large LLMs.
- Provider agnosticism: The tool works across different LLM providers, including Anthropic's Claude, OpenAI's GPT models, and others, providing a consistent CLI interface regardless of the backend model.
- OpenAPI support: Mcp2cli supports both MCP and OpenAPI standards, offering flexibility in API interaction for developers and sysadmins.
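To make the core idea concrete, here is a minimal sketch of what "dynamic CLI generation at runtime" can look like, independent of mcp2cli's actual implementation: a small OpenAPI-like spec (the `spec` dict and its operation names are hypothetical) is turned into argparse subcommands on the fly, so an agent can read short `--help` text instead of the full JSON schema.

```python
# Illustrative sketch only -- NOT mcp2cli's actual code. The spec format,
# operation names, and parameters below are invented for demonstration.
import argparse
import json

spec = {
    "get_user": {
        "description": "Fetch a user by id",
        "params": {"id": {"type": "integer", "required": True}},
    },
    "list_repos": {
        "description": "List repositories for a user",
        "params": {
            "user": {"type": "string", "required": True},
            "limit": {"type": "integer", "required": False},
        },
    },
}

def build_cli(spec: dict) -> argparse.ArgumentParser:
    """Build a CLI from the spec at runtime -- no code generation step.

    New operations added to the spec become subcommands immediately.
    """
    parser = argparse.ArgumentParser(prog="api")
    sub = parser.add_subparsers(dest="command", required=True)
    for name, op in spec.items():
        p = sub.add_parser(name, help=op["description"])
        for pname, meta in op["params"].items():
            ptype = int if meta["type"] == "integer" else str
            p.add_argument(f"--{pname}", type=ptype, required=meta["required"])
    return parser

parser = build_cli(spec)
args = parser.parse_args(["get_user", "--id", "42"])
print(args.command, args.id)

# A rough proxy for the token-efficiency argument: compare the size of the
# raw schema an agent would otherwise ingest vs the generated help text.
schema_chars = len(json.dumps(spec))
help_chars = len(parser.format_help())
print(schema_chars, help_chars)
```

The point of the sketch is the pattern, not the numbers: because the parser is built from the spec object each run, adding an endpoint to the spec exposes a new subcommand with no recompilation, which is the adaptability the first bullet describes.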
- Install mcp2cli via pip, or run it directly with npx, to start interacting with APIs.
- Add mcp2cli as a skill to AI coding agents such as Claude Code, Cursor, or Codex for streamlined API discovery and use.