Google Made Every Cloud Service an MCP Server. Here's What That Breaks Open for AI Agents.
Google just handed every AI agent a master key to its cloud empire. Starting March 17, every BigQuery instance automatically ships with a remote MCP server. Maps, Compute Engine, and Kubernetes Engine followed, with Cloud Storage on the way. And through Apigee, any API you've already built can become an MCP server with zero code changes.
This isn't a feature announcement. It's an infrastructure shift that changes how AI agents interact with enterprise systems.
MCP Was Anthropic's Idea. Google Made It Enterprise-Grade.
When Anthropic introduced the Model Context Protocol in November 2024, the pitch was elegant: a universal connector standard for AI — "USB-C for LLMs." Instead of building custom integrations for every data source, agents could speak one protocol. Claude, ChatGPT, Cursor, Gemini, VS Code — they all adopted it.
But most MCP servers were community-built, running locally, requiring manual configuration. Google looked at this and did what Google does: threw enterprise infrastructure at it.
The result is fully managed, remote MCP servers that run on Google's existing API infrastructure. No local servers to maintain. No Docker containers to babysit. You point your AI agent at a globally consistent endpoint, authenticate through Cloud IAM, and your agent can query BigQuery, manage Compute Engine instances, or pull Google Maps data — all through natural language.
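In practice, "pointing your agent at a globally consistent endpoint" is a client-side configuration entry. A minimal sketch of what a remote-server entry could look like — the endpoint URL here is hypothetical (the real URLs are published per service in Google's documentation), and the bearer token stands in for credentials obtained through your existing gcloud auth:

```json
{
  "bigquery": {
    "url": "https://bigquery.googleapis.com/mcp",
    "headers": {
      "Authorization": "Bearer <access-token-from-gcloud-auth>"
    }
  }
}
```

The key difference from community servers: there is no `command` to launch — nothing runs on your machine.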
What's Live Right Now
Google launched MCP servers for four foundational services:
- BigQuery — Agents can interpret schemas, execute SQL queries, and run forecasting models against your enterprise data without moving it into context windows. After March 17, this is enabled by default on every BigQuery instance.
- Google Maps — Location queries, route calculations, and geospatial data accessible through any MCP client.
- Compute Engine — Infrastructure management through natural language: list instances, check statuses, manage resources.
- Kubernetes Engine (GKE) — Cluster management, pod inspection, and deployment operations via MCP.
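Under the hood, each of these capabilities is exposed as an MCP tool, and every invocation is a standard `tools/call` request (MCP is built on JSON-RPC 2.0). A sketch of what an agent's BigQuery query might look like on the wire — the tool name `execute_sql` and its argument shape are illustrative, not taken from Google's published schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_sql",
    "arguments": {
      "query": "SELECT region, SUM(revenue) FROM sales.orders GROUP BY region"
    }
  }
}
```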
Cloud Run, Cloud Storage, and Cloud Resource Manager are rolling out in the coming months. Google also released specialized MCP servers for Cloud Observability and Cloud Storage as separate npm packages.
The Apigee Play Is the Real Story
Here's the part most coverage missed. Google integrated MCP support into Apigee, their API management platform. This means any existing API — a product catalog, an internal employee directory, a proprietary ML model endpoint — can be translated into an MCP server without writing new code.
The implications are significant. Enterprise companies with hundreds of internal APIs built over decades can suddenly make all of them agent-accessible. Apigee handles the translation layer, and existing security policies, rate limits, and governance controls carry over automatically.
A retail company with a product catalog API built in 2019 doesn't need to rewrite it. Apigee wraps it in MCP, and now any AI agent in their organization can search products, check inventory, and pull pricing — with the same access controls that already exist.
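What the translation layer effectively produces is an MCP tool definition per API operation. A sketch of what that wrapped catalog API could surface to agents — `name`, `description`, and `inputSchema` are the MCP specification's standard tool shape; the specific tool and parameters here are hypothetical:

```json
{
  "name": "search_products",
  "description": "Search the product catalog by keyword and optional category.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "query": { "type": "string" },
      "category": { "type": "string" }
    },
    "required": ["query"]
  }
}
```

The agent discovers this definition via MCP's `tools/list` call; the 2019-era API behind it never knows the difference.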
Security: Google Cloud IAM + Model Armor
Enterprise adoption of AI agents has been stuck on one question: "How do I control what the agent can do?" Google's answer leans on two systems:
Cloud IAM gates every MCP server. You define exactly which service accounts, roles, and users can access which tools through which MCP endpoints. An agent running under a data analyst's credentials can query BigQuery but can't touch Compute Engine instances. Standard IAM — nothing new to learn.
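Scoping an agent this way uses the same IAM policy bindings as any other principal. A sketch, assuming a hypothetical service account for the analyst's agent — `roles/bigquery.jobUser` lets it run queries, and the absence of any Compute Engine role is precisely what keeps instances out of reach:

```json
{
  "bindings": [
    {
      "role": "roles/bigquery.jobUser",
      "members": [
        "serviceAccount:analyst-agent@example-project.iam.gserviceaccount.com"
      ]
    }
  ]
}
```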
Model Armor is the new piece. It's a firewall designed specifically for agentic workloads that defends against advanced threats like indirect prompt injection and data exfiltration. When an agent tries to access an MCP server, Model Armor inspects the request chain for manipulation attempts — a critical layer when agents can chain multiple tool calls together.
What This Means for the MCP Ecosystem
Before Google's announcement, the MCP ecosystem was dominated by community-built servers — over 10,000 of them — most running locally via npx commands. Useful for individual developers, but a non-starter for enterprise deployment.
Google's managed servers flip the deployment model. Remote, authenticated, governed, and scaled by Google's infrastructure. This creates a template that AWS and Azure will inevitably follow. (AWS already announced MCP support for Bedrock; Azure is reportedly building MCP into Copilot Studio.)
The MCP protocol itself continues evolving. As of March 5, 2026, the community roadmap focuses on next-generation transports, scalable session handling, and enterprise governance — areas where Google's implementation is already pushing the boundaries.
How to Set It Up
For the gcloud MCP server, add this to your MCP client configuration:
```json
{
  "gcloud": {
    "command": "npx",
    "args": ["-y", "@google-cloud/gcloud-mcp"]
  }
}
```
This works with any MCP-compatible client — Claude Code, Gemini CLI, Cursor, VS Code, and more. Authentication uses your existing gcloud auth login credentials.
For BigQuery specifically, no setup is needed after March 17. The remote MCP server is automatically enabled when BigQuery is active in your project.
The Bottom Line
Google didn't invent MCP. Anthropic did. But Google just made it enterprise-deployable at scale. Managed infrastructure, IAM-gated access, prompt injection defense, and a bridge (Apigee) that turns legacy APIs into agent-ready tools.
The companies that move fastest here aren't the ones building new AI systems from scratch. They're the ones that already have hundreds of APIs behind Apigee and can now make every single one of them accessible to AI agents — with a configuration change instead of a development project.
MCP just went from a developer protocol to an enterprise integration layer. And Google is betting its cloud revenue that every serious AI deployment will need one.
Key Takeaways
- ✓ BigQuery automatically ships with a remote MCP server after March 17 — no setup required
- ✓ Maps, Compute Engine, GKE, and Cloud Storage are already live or rolling out
- ✓ Apigee can turn any existing API into an MCP server without code changes
- ✓ Cloud IAM + Model Armor provide enterprise-grade security for agentic workloads
- ✓ Google's managed MCP servers run remotely — no local Docker containers or npx commands needed
- ✓ AWS and Azure are expected to follow with similar managed MCP implementations
- ✓ The MCP ecosystem has grown to 10,000+ servers and 97M monthly SDK downloads
Skila AI Editorial Team
The Skila AI editorial team researches and writes original content covering AI tools, model releases, open-source developments, and industry analysis. Our goal is to cut through the noise and give developers, product teams, and AI enthusiasts accurate, timely, and actionable information about the fast-moving AI ecosystem.