MCP and Extensions
This document explains the underlying standards that enable UV-MCP to function.
The Model Context Protocol (MCP)
The Model Context Protocol (MCP) is an open standard designed to solve the connectivity problem between AI models and external systems.
The Problem
Traditionally, AI models are isolated (“sandboxed”). They cannot read your files, run code, or check your system status without complex, custom-built integrations.
The MCP Solution
MCP defines a universal language for:
- Exposing Resources: Allowing the AI to read data (logs, files).
- Exposing Tools: Allowing the AI to execute actions (commands, API calls).
- Exposing Prompts: Pre-defined templates for user interaction.
UV-MCP functions as an MCP Server. It translates the AI’s intent into local uv commands.
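As a rough illustration of what such a server looks like, here is a minimal sketch using the FastMCP helper from the official Python MCP SDK. The tool name `install_dependency` and its behavior are assumptions for this example, not the actual UV-MCP implementation.

```python
# Minimal MCP server sketch (an assumed example, not the UV-MCP source).
# Uses the FastMCP helper from the official Python MCP SDK ("mcp" package).
import subprocess

from mcp.server.fastmcp import FastMCP

server = FastMCP("uv-mcp-sketch")


@server.tool()
def install_dependency(package: str) -> str:
    """Add a package to the current project with uv."""
    result = subprocess.run(
        ["uv", "add", package],
        capture_output=True,
        text=True,
    )
    # Return stdout/stderr so the model can summarize the outcome.
    return result.stdout or result.stderr


if __name__ == "__main__":
    # Communicate with the client over stdio, the common MCP transport.
    server.run()
```

The key point is that the server only translates structured tool calls into local `uv` commands; the reasoning about *when* to call the tool stays with the model.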
Gemini Extensions
While MCP is the communication protocol, a Gemini Extension is the packaging format used to distribute these capabilities to Gemini-powered interfaces.
Integration
When you install UV-MCP as an extension, you are registering it as a trusted tool provider. The Gemini CLI uses the extension manifest (sketched below) to:
- Discover the server.
- Launch the server process in the background.
- Route relevant user queries to the server.
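The snippet below writes a hypothetical manifest to show the shape of this integration. The file name, the field names (`name`, `version`, `mcpServers`), and the launch command are assumptions made for illustration; consult the actual UV-MCP extension for the real values.

```python
# Hypothetical Gemini extension manifest, expressed in Python for illustration.
import json

manifest = {
    "name": "uv-mcp",
    "version": "0.1.0",
    # Tells the Gemini CLI how to launch the MCP server process in the
    # background so it can route relevant user queries to it.
    "mcpServers": {
        "uv-mcp": {
            # Placeholder launch command; the real extension defines how the
            # UV-MCP server process is actually started.
            "command": "python",
            "args": ["-m", "uv_mcp"],
        }
    },
}

with open("gemini-extension.json", "w") as f:
    json.dump(manifest, f, indent=2)
```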
The Workflow
- User Query: “Install numpy.”
- Gemini Client: Identifies that `install_dependency` (a tool provided by the UV-MCP extension) is relevant.
- Protocol Handshake: The client sends an MCP JSON-RPC request to the UV-MCP server.
- Execution: UV-MCP executes the `uv` command locally.
- Response: The result is sent back via MCP, and Gemini summarizes it for you.
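To make the handshake concrete, here is an illustrative sketch of the JSON-RPC messages exchanged in the middle steps. The request ID, tool name, argument schema, and result text are assumptions; only the `tools/call` method and the overall envelope follow the MCP specification.

```python
# Illustrative MCP exchange for the workflow above (values are assumptions).
# The client asks the server to invoke its install_dependency tool...
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",                  # standard MCP tool-invocation method
    "params": {
        "name": "install_dependency",        # hypothetical tool name
        "arguments": {"package": "numpy"},   # hypothetical argument schema
    },
}

# ...and the server replies with the tool's output, which Gemini then
# summarizes for the user.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {"type": "text", "text": "Added numpy to the project with uv."}
        ]
    },
}
```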