Exploring GitMCP with Umbraco Repositories

Today I came across an interesting article on Medium.com, a platform I’ve been following for years because it often points me toward new techniques and ideas. The article was called “10-minute GitMCP setup that saves hours, the developer’s secret weapon” and it introduced a simple but clever concept: using GitMCP to turn any public GitHub repository into AI-readable context.

I was already familiar with Umbraco MCP and some other MCP servers, but this was a different approach. Instead of running your own MCP server locally, GitMCP gives you a remote MCP source based on any GitHub repo.

And it works in the simplest way possible.

What is GitMCP?

GitMCP is a service that instantly turns any public GitHub repository into a Model Context Protocol (MCP) server.
That means AI tools like GitHub Copilot Chat, Claude Desktop, Cursor, Windsurf, or VS Code MCP clients can query actual source code from GitHub in real time without cloning or downloading the repository locally.

How does it work?

Just change this:

https://github.com/owner/repo

to this:

https://gitmcp.io/owner/repo

That’s it. No cloning. No indexing. No setup scripts. Just replace the domain and connect it to your AI tool.

You add that URL to your AI tool as a custom MCP server, and suddenly it understands the structure and content of that repository. GitMCP reads files like README.md, llms.txt, and even full source paths, so your AI answers come from actual source code, not hallucinated nonsense.
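To make the domain swap concrete, here is a tiny helper, a sketch in Python (the function name is my own), that rewrites a public GitHub repository URL into its GitMCP equivalent:

```python
def to_gitmcp(repo_url: str) -> str:
    """Rewrite a public GitHub repo URL into a GitMCP endpoint.

    Only the domain changes: github.com -> gitmcp.io.
    """
    prefix = "https://github.com/"
    if not repo_url.startswith(prefix):
        raise ValueError("expected a https://github.com/owner/repo URL")
    return "https://gitmcp.io/" + repo_url[len(prefix):]


print(to_gitmcp("https://github.com/umbraco/Umbraco-CMS"))
# https://gitmcp.io/umbraco/Umbraco-CMS
```

That output URL is exactly what you paste into your AI tool as a custom MCP server.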

Using GitMCP with Umbraco

As an Umbraco developer, I work mostly with private repositories, so at first I thought this wouldn’t be useful. But then it clicked: Umbraco has a lot of public repositories, including the CMS core, the UI library, the documentation, search, and community resources. Perfect material for GitMCP.

So I decided to connect the most important Umbraco repos:

GitMCP Configuration for Visual Studio 2026 (GitHub Copilot)

Below is my current setup, using GitHub Copilot Chat with MCP support in Visual Studio 2026 Insiders.


{
  "inputs": [],
  "servers": {
    "Umbraco-CMS": {
      "type": "stdio",
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/umbraco/Umbraco-CMS"],
      "env": {}
    },
    "Umbraco-UI": {
      "type": "stdio",
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/umbraco/Umbraco.UI"],
      "env": {}
    },
    "UmbracoDocs": {
      "type": "stdio",
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/umbraco/UmbracoDocs"],
      "env": {}
    },
    "Umbraco-Cms-Search": {
      "type": "stdio",
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/umbraco/Umbraco.Cms.Search"],
      "env": {}
    },
    "Umbraco-Community-Resources": {
      "type": "stdio",
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/umbraco/Umbraco.Community.Site.Resources"],
      "env": {}
    }
  }
}


You can repeat that block for each repository you want to connect.
Tip: You don’t have to clone the repos locally; GitMCP streams file content on demand.
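Rather than copy-pasting that server block by hand for each repository, you can generate the whole config. This is a small Python sketch that assumes the mcp.json shape shown above (the repo list and server names are just my labels):

```python
import json

# Repos I want exposed through GitMCP (label -> owner/repo path).
REPOS = {
    "Umbraco-CMS": "umbraco/Umbraco-CMS",
    "UmbracoDocs": "umbraco/UmbracoDocs",
}


def build_config(repos: dict[str, str]) -> dict:
    """Build an MCP config in the shape shown above: one stdio server
    per repo, each launching mcp-remote against the GitMCP URL."""
    return {
        "inputs": [],
        "servers": {
            name: {
                "type": "stdio",
                "command": "npx",
                "args": ["mcp-remote", f"https://gitmcp.io/{path}"],
                "env": {},
            }
            for name, path in repos.items()
        },
    }


print(json.dumps(build_config(REPOS), indent=2))
```

Add more entries to REPOS and paste the printed JSON straight into your MCP configuration.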

For now, I’m pointing at the Umbraco 16 sources to make sure I get relevant context, but you can target other versions if you prefer, for example the latest LTS.

Why This Is Useful

Once I configured this, I could finally do something I’ve wanted for a long time: ask AI questions about Umbraco and get answers based on the real source code, not random guesses.

Example question to Copilot Chat in Visual Studio:

Can you help me understand, using UmbracoDocs and Umbraco.Cms.Search, how I can implement a custom search extension in Umbraco 16+?

The result? It showed me an actual, simple blueprint for an implementation. It even mentioned important interfaces like `IIndexPopulator` and `ISearchProvider`.

What’s Next?

This is still a work-in-progress experiment. Over the next few weeks, I’ll:

  • Use GitMCP for real Umbraco migration work
  • Try combining multiple MCP servers in a single AI conversation
  • Share useful MCP prompts for Umbraco developers

I’ll post updates here and on my blog as I go.