Where is the MCP server? Deploy your agent with Cloud API Registry on Vertex AI Agent Engine

TL;DR: We’re streamlining how agents connect to Google Cloud services on Vertex AI Agent Engine. The new Cloud API Registry allows you to easily find and use standardized Model Context Protocol (MCP) servers for services like BigQuery. This guide demonstrates building and deploying a “zero-boilerplate” Data Analyst Agent using these tools.

Building a useful agent requires giving it access to real-world data and services. The Model Context Protocol (MCP) has become the standard way for LLMs and agents to access data and services through APIs. However, enterprise adoption of MCP for agent deployment is constrained by fragmentation and poor governance, which creates several challenges for developers:

  • Discovering and using MCP servers is difficult, which prevents tool reuse across platforms and wastes effort rebuilding the same MCP capabilities.
  • Manually managing secrets and tokens introduces security risks.
  • Setting up third-party tools requires navigating complex, unsupported interfaces solely for connectivity.

This changes with Cloud API Registry. Cloud API Registry acts as the definitive source of truth for tools across your organization, providing a centralized catalog of Model Context Protocol (MCP) servers from Google Cloud and from your own organization. For Vertex AI Agent Engine users, it solves fragmentation by helping you understand which APIs and tools you can use, along with their associated policies and restrictions.

For the developer, this facilitates a “registry-first” workflow that transforms how you build agents:

  • Standardized Discovery: Find available tools instantly via the CLI without hunting through documentation.
  • Zero Boilerplate: Consume standardized capabilities—like execute_sql or list_datasets tools for BigQuery—without writing a single line of wrapper code.
  • Unified Security: Rely on configured credentials and standard IAM policies rather than managing granular permissions and local secrets for every integration.

This guide will show you how to build and deploy a Data Analyst Agent to Vertex AI Agent Engine that connects to BigQuery using nothing but the Cloud API Registry.

Building and Deploying the Data Analyst Agent

Step 1: Find available tools

The first step in our “zero-boilerplate” journey is finding the BigQuery MCP server and its available tools. Instead of reading the BigQuery SDK documentation to implement our own tools, we use the gcloud CLI to inspect the MCP servers available in the registry.

# List all available MCP servers in the registry
gcloud beta api-registry mcp servers list --all --project=[YOUR_PROJECT_ID]

You will see standardized servers like bigquery.googleapis.com or compute.googleapis.com. To use them, we simply enable the server for our project:

gcloud beta api-registry mcp enable bigquery.googleapis.com --project=[YOUR_PROJECT_ID]

Once enabled, your agent instantly gains standardized tools such as list_dataset_ids, get_table_info, and execute_sql, without you writing a single line of code for them. You can also find the enabled server and its associated tools in the Tools view of the Vertex AI Agent Builder UI.

[Screenshot: the Tools view in the Vertex AI Agent Builder UI, listing Google Cloud MCP servers such as BigQuery, Compute, Container, and Maps tools with their enablement status.]

Step 2: Defining the Agent Module

To deploy to Vertex AI Agent Engine using the Vertex AI SDK, we need to encapsulate our agent in a Python module to avoid serialization issues. The example below builds a Data Analyst agent with the Google Agent Development Kit (ADK), deployed via ModuleAgent, using the following package structure:

my_agent/
    root_agent.py
    startup_scripts/
        check_api_registry.sh
    requirements.txt

First, you define your root_agent.py module. Notice we do not import the BigQuery SDK or write any tool definitions. We simply point the agent at the BigQuery MCP server in the registry using the ADK Python SDK's ApiRegistry class.

import os
from google.adk.agents import LlmAgent
from google.adk.tools.api_registry import ApiRegistry
from vertexai.preview.reasoning_engines import AdkApp

PROJECT_ID = os.environ.get("PROJECT_ID")
LOCATION = os.environ.get("LOCATION")

def session_service_builder():
    """Create a Vertex AI session service for cloud deployment."""
    from google.adk.sessions import VertexAiSessionService
    return VertexAiSessionService(project=PROJECT_ID, location=LOCATION)

# 1. Initialize the Registry
header_provider = lambda context: {"x-goog-user-project": PROJECT_ID}
tool_registry = ApiRegistry(PROJECT_ID, header_provider=header_provider)

# 2. Fetch the BigQuery Toolset
# We request the specific MCP server by name
registry_tools = tool_registry.get_toolset(
    mcp_server_name=f"projects/{PROJECT_ID}/locations/global/mcpServers/google-bigquery.googleapis.com-mcp"
)

# 3. Define the Agent
agent_app = AdkApp(
    agent=LlmAgent(
        model="gemini-2.5-flash",
        name="bigquery_data_analyst",
        instruction=f"""
        You are a helpful data analyst assistant with access to BigQuery.
        The project ID is: {PROJECT_ID}
        Use the provided tools to explore datasets and execute SQL queries.
        """,
        tools=[registry_tools], # The agent auto-discovers tools
    ),
    session_service_builder=session_service_builder
)

Cloud API Registry integration via ApiRegistry requires a few key parameters:

  • mcp_server_name: The unique resource name of the server you want to connect to (e.g., .../google-bigquery.googleapis.com-mcp).
  • tool_registry.get_toolset: The ADK method that dynamically fetches tool definitions for the agent at runtime, replacing manually maintained tool lists.

Also note that header_provider is required: it attaches the x-goog-user-project header to every MCP request so calls are attributed to your project for quota and authorization, which prevents permission errors.
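If you prefer a named function over the lambda shown in root_agent.py, the provider is simply a callable that receives a context object and returns a dict of extra HTTP headers. A minimal sketch, equivalent to the lambda above:

import os

def quota_project_headers(context):
    """Return extra headers attached to every MCP request.

    x-goog-user-project tells Google APIs which project to use for
    quota and billing, avoiding permission errors at call time.
    """
    return {"x-goog-user-project": os.environ["PROJECT_ID"]}

# Drop-in replacement for the lambda used earlier:
# tool_registry = ApiRegistry(PROJECT_ID, header_provider=quota_project_headers)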

Step 3: Grant IAM Permissions to Agent Engine

With your agent module ready, deploying it on Vertex AI Agent Engine requires the deployment service account to have the necessary permissions: access to both the API Registry and the underlying BigQuery data. This is essential for running your agent securely under a managed cloud identity. You can find the list of required IAM roles here.

# Get project number for Agent Engine service account
project_number = !gcloud projects describe {PROJECT_ID} --format="value(projectNumber)"
agent_engine_sa = f"service-{project_number[0]}@gcp-sa-aiplatform-re.iam.gserviceaccount.com"

# Grant to Agent Engine service account
# 1. Allow access to API Registry
!gcloud projects add-iam-policy-binding {PROJECT_ID} --member=serviceAccount:{agent_engine_sa} --role="roles/cloudapiregistry.viewer" --quiet --condition=None
!gcloud projects add-iam-policy-binding {PROJECT_ID} --member=serviceAccount:{agent_engine_sa} --role="roles/mcp.toolUser" --quiet --condition=None

# 2. Allow access to BigQuery Data
!gcloud projects add-iam-policy-binding {PROJECT_ID} --member=serviceAccount:{agent_engine_sa} --role="roles/bigquery.user" --quiet --condition=None
!gcloud projects add-iam-policy-binding {PROJECT_ID} --member=serviceAccount:{agent_engine_sa} --role="roles/bigquery.dataViewer" --quiet --condition=None

print("\n✅ MCP Tool User role granted to Agent Engine service account!")
print("\n💡 The deployed agent can now access MCP tools from API Registry")

Step 4: Configure the Infrastructure hook

Next, you can prepare your startup_scripts/check_api_registry.sh script. This script runs during the container build process to ensure the BigQuery MCP server is enabled and reachable from within the managed environment.

#!/bin/bash
set -e
echo "🔧 Verifying server status..."
gcloud beta api-registry mcp servers list --project="$PROJECT_ID"

You also define the agent's dependencies in requirements.txt:

google-cloud-aiplatform[agent_engines,adk]>=1.101.0

Step 5: Deploy with One Command

Finally, you create the deployment script using agent_engines.create. Critically, we pass our root_agent.py and the startup script in extra_packages. This tells Agent Engine to package these files, install dependencies, and build a dedicated service.

from vertexai import agent_engines

# Deploy the agent
remote_app = agent_engines.create(
    display_name="bigquery-data-analyst",
    agent_engine=agent_engines.ModuleAgent(
        module_name="root_agent",
        agent_name="agent_app",
    ),
    requirements=[
        "google-cloud-aiplatform[agent_engines,adk]>=1.101.0",
    ],
    extra_packages=[
        "root_agent.py",
        "startup_scripts/check_api_registry.sh",
    ],
    env_vars={
        "PROJECT_ID": PROJECT_ID,
        "LOCATION": LOCATION,
    },
)

The deployment process typically takes 10-15 minutes. During this time, when deploying an agent that uses Cloud API Registry, Agent Engine runs a delivery pipeline that:

  1. Installs the ADK.
  2. Runs your startup script to verify registry access.
  3. Deploys your agent, which then connects to the registry to fetch its tools.

You’ll see output like:

Deploying agent...
Deployment complete!
Resource: projects/123.../locations/us-central1/reasoningEngines/456...
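Before querying, you can sanity-check the handle returned by agent_engines.create. A short sketch using the SDK's resource_name attribute and operation_schemas() helper:

# Confirm the deployed resource and the operations it exposes
# (e.g., the streaming and async query entry points).
print(remote_app.resource_name)
print(remote_app.operation_schemas())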

Once deployed, you can test the agent by querying it as shown in the example below.

import asyncio

import vertexai
from vertexai import agent_engines

# Connect to your deployed agent
vertexai.init(...)  # fill in your project and location
remote_app = agent_engines.get("projects/...")

# Query the agent. async_stream_query needs an event loop;
# in a notebook you can simply `await` the async for loop instead.
async def main():
    async for event in remote_app.async_stream_query(
        message="What datasets are available in my project?",
        user_id="your-user",
    ):
        print(event)

asyncio.run(main())
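If you are not in an async context, the ADK template also exposes synchronous counterparts. A sketch using create_session and stream_query against the same deployed app:

# Synchronous alternative: create a session, then stream events as they arrive.
# Remote sessions are returned as dicts, hence session["id"].
session = remote_app.create_session(user_id="your-user")
for event in remote_app.stream_query(
    user_id="your-user",
    session_id=session["id"],
    message="What datasets are available in my project?",
):
    print(event)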

Conclusion

Adopting Cloud API Registry with Vertex AI Agent Engine solves the fragmentation and poor governance that limit enterprise adoption of MCP for agent deployment. This tutorial demonstrated how to deploy a secure, scalable Data Analyst Agent using a “zero-boilerplate” approach, with no API wrapper code, JSON schemas, or credential management required.

This shifts the paradigm from manual, decentralized integration to managed, centralized discovery. Developers can now easily consume a secure, pre-validated capability from a shared repository instead of rewriting BigQuery wrapper functions.

In this way, your agent becomes a managed resource, automatically benefiting from the security, scalability, and service updates of the Google Cloud services and organizational MCP servers it uses.

What’s Next?

Now that you have the foundation, you can:

  1. Try it yourself: Run the full tutorial notebook to build and deploy this Data Analyst agent in your own environment.
  2. Explore other Google Cloud servers: Use gcloud beta api-registry mcp servers list --all to discover other available servers. You can easily extend this agent to manage infrastructure (Compute Engine) or navigate the real world (Google Maps) just by swapping the MCP server name, as shown in the sketch after this list.
  3. Learn more: Check out the Vertex AI Agent Engine to understand how to scale your agent to production.
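For instance, turning the Data Analyst into an infrastructure agent is a one-line change in root_agent.py. A sketch, assuming the Compute Engine MCP server follows the same resource-naming pattern as the BigQuery one above (google-compute.googleapis.com-mcp is an extrapolation, so confirm the exact name with the list command first):

# Hypothetical server name, inferred by analogy with the BigQuery server;
# verify with: gcloud beta api-registry mcp servers list --all
registry_tools = tool_registry.get_toolset(
    mcp_server_name=(
        f"projects/{PROJECT_ID}/locations/global/mcpServers/"
        "google-compute.googleapis.com-mcp"
    )
)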

Questions or feedback? Connect with me on LinkedIn or X/Twitter.

Thanks for this walkthrough! The ApiRegistry abstraction is exactly what was needed.

I’ve been manually wrapping the BigQuery SDK for my agents, and managing the credential/token passing was always the messiest part. Seeing it reduced to just tool_registry.get_toolset with the IAM role handling is a massive time saver.

Definitely going to test this “zero-boilerplate” flow on my next deployment.


Happy it helps @virginia_love !


Thanks for this update, it’s good to have a one-stop shop for MCP servers in GCP. Will this support custom MCP servers and OAuth discovery for MCP servers (per the MCP spec)? This is the main bottleneck we’re having now.

Can I use it too? Is it universal code that anyone can copy and paste?

Hi @blcvt and @SHAKIB_SADMAN_EMON

Thank you for sharing your feedback.

I will double-check with the Agent Engine team and get back to you with more info.

Best