Build an MCP server with Laravel Loop

By Fernando Sánchez

Our Laravel apps have useful features like model management, data processing, business rules, and report generation. But when working with Claude or Cursor, none of those features are available: AI agents can't see your models' data, can't call your APIs, and can't leverage any of your application's capabilities.

Laravel Loop changes that by turning your Laravel functionality into tools that AI assistants can discover, understand, and use.

Laravel Loop and MCP

Laravel Loop is a powerful Model Context Protocol (MCP) server. MCP is a standard that lets AI agents discover and invoke tools from external systems, and with Laravel Loop you can serve Laravel functionality through MCP by creating tools.

Here's what an interaction with an MCP server looks like:

Claude discovered the MCP tools defined on the server, understood what each one does from its definition, and called the right one.

The application

To show how this works, let’s use a real Laravel application containing a prompt library that manages 200+ curated AI prompts for content analysis, writing, and data extraction.

The idea is to retrieve and use these prompts while chatting with an agent.

We’ll cover how to create tools for discovering and composing prompts with our content. The same patterns apply whether you’re managing prompts, products, users, or any other resource.

The full application code is available here. To install Laravel Loop, check the setup instructions here.

Building your first tool

The simplest tool transforms any logic into something AI can use:

namespace App\Mcp;

use App\Models\Prompt;
use Kirschbaum\Loop\Tools\CustomTool;

CustomTool::make('list_prompts', description: 'Get a complete list of all available prompts with name and category.')
    ->using(function () {
        $prompts = Prompt::active()
            ->orderBy('title')
            ->get(['name', 'title', 'description', 'category']);

        $result = "## Available Prompts ({$prompts->count()} total)\n\n";

        return $result . $prompts->map(function ($prompt) {
            return <<<MD
            ### {$prompt->name}
            Title: {$prompt->title}
            Description: {$prompt->description}
            Category: {$prompt->category}
            MD;
        })->join("\n");
    });

That's it! We can now ask an LLM, “What prompts do you have?" and get a list directly from the Laravel app.
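When a client connects, it first issues a `tools/list` request, and the server advertises each tool's name, description, and parameter schema. Per the MCP specification, the response for our tool looks roughly like this (the exact schema Laravel Loop emits may differ):

```json
{
  "tools": [
    {
      "name": "list_prompts",
      "description": "Get a complete list of all available prompts with name and category.",
      "inputSchema": {
        "type": "object",
        "properties": {}
      }
    }
  ]
}
```

This advertised description is all the LLM has to go on, which is why writing clear, specific tool descriptions matters so much.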

Adding parameters

Tools will often need parameters. Here’s how to make AI assistants pass the right data:

CustomTool::make(
    name: 'search_prompts',
    description: 'Search for prompts by keyword in title, description, or name. Returns prompts from all sources ready to use.',
)
    ->withStringParameter(
        name: 'query',
        description: 'Search query to find prompts',
        required: true
    )
    ->withStringParameter(
        name: 'limit',
        description: 'Maximum number of results to return (default: 10)',
        required: false
    )
    ->using(function (string $query, string $limit = '10') {
        $prompts = Prompt::active()
            ->public()
            ->where(function ($q) use ($query) {
                $q->where('title', 'like', "%{$query}%")
                    ->orWhere('description', 'like', "%{$query}%")
                    ->orWhere('name', 'like', "%{$query}%");
            })
            ->orderBy('title')
            // Cap results at the requested limit
            ->limit((int) $limit)
            ->get();

        if ($prompts->isEmpty()) {
            return "No prompts found matching '{$query}'.";
        }

        $result = "## Search results for '{$query}' ({$prompts->count()} found):\n\n";

        return $result . $prompts->map(function ($prompt) {
            return <<<MARKDOWN
            ### {$prompt->name}
            **Title**: {$prompt->title}
            **Description**: {$prompt->description}
            **Usage**: `compose_prompt` with prompt_name `{$prompt->name}`
            MARKDOWN;
        })->join("\n");
    });

Now LLMs can understand requests like:

  • “Find prompts for analyzing content”

  • “Show me writing prompts”

  • “Search for summarization tools”

The AI assistant figures out the parameters automatically based on the tool definition and the parameter descriptions.
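Under the hood, the client translates those natural-language requests into a standard MCP `tools/call` JSON-RPC message. A sketch of what it might send for `search_prompts` (the exact framing varies by transport):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_prompts",
    "arguments": {
      "query": "summarization",
      "limit": "5"
    }
  }
}
```

The `arguments` object maps directly onto the parameters declared with `withStringParameter`, which is why good parameter descriptions pay off.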

Using existing logic

You can use existing logic in your tools, or if you need more complex logic, extract it into separate classes:

CustomTool::make(
    name: 'compose_prompt',
    description: 'Apply a prompt template to your content for analysis or processing'
)
    ->withStringParameter('prompt_name', 'Name of prompt to use')
    ->withStringParameter('content', 'Your content to process with this prompt')
    ->withStringParameter('additional_context', 'Extra context or instructions', required: false)
    ->using(function (string $prompt_name, string $content, string $additional_context = '') {
        $prompt = Prompt::active()->where('name', $prompt_name)->first();

        if ($prompt === null) {
            return "No prompt named '{$prompt_name}' was found.";
        }

        // Resolve any existing class from the container
        $composedPrompt = app(PromptService::class)->compose(
            prompt: $prompt,
            inputContent: $content,
            additionalContext: $additional_context
        );

        return "EXECUTE THIS PROMPT: " . $composedPrompt;
    });
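The `PromptService` above is application code, not part of Laravel Loop. A minimal sketch of what its `compose` method might do (the class name comes from the call site; the column names and structure here are assumptions):

```php
namespace App\Services;

use App\Models\Prompt;

class PromptService
{
    /**
     * Merge a stored prompt template with user-supplied content.
     * A hypothetical sketch: the real service may do far more
     * (variable interpolation, validation, token budgeting, ...).
     */
    public function compose(Prompt $prompt, string $inputContent, string $additionalContext = ''): string
    {
        $sections = [
            $prompt->content, // assumed column holding the template text
            "## Content\n" . $inputContent,
        ];

        if ($additionalContext !== '') {
            $sections[] = "## Additional context\n" . $additionalContext;
        }

        return implode("\n\n", $sections);
    }
}
```

Keeping the composition logic in a service class means the same code serves both your MCP tool and any web or API endpoints that already exist.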

Now you can ask the assistant: "Use the analyze_claims prompt on this @article.md" and it will automatically:

  • Find the right prompt

  • Compose it with the passed-in content

  • Return the ready-to-execute prompt

  • Execute the analysis

Registering the toolkit

To expose the tools to the server, we can group them into toolkits:

namespace App\Mcp;

use Kirschbaum\Loop\Collections\ToolCollection;
use Kirschbaum\Loop\Contracts\Toolkit;

class PromptLibraryToolkit implements Toolkit
{
    public function getTools(): ToolCollection
    {
        return new ToolCollection([
            $this->createListPromptsTool(),
            $this->createSearchPromptsTool(),
            $this->createGetPromptDetailsTool(),
            $this->createComposePromptTool(),
            $this->createListCategoriesTool(),
        ]);
    }

    // Tool definitions...
}
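Each private factory method returns one of the `CustomTool` definitions shown earlier. For example, `createListPromptsTool()` inside the toolkit might look like this (a shortened sketch reusing the first tool):

```php
private function createListPromptsTool(): CustomTool
{
    // A sketch: the real method can return the exact `list_prompts`
    // tool built earlier; the output here is abbreviated.
    return CustomTool::make(
        'list_prompts',
        description: 'Get a complete list of all available prompts with name and category.'
    )->using(function () {
        return Prompt::active()
            ->orderBy('title')
            ->get(['name', 'title', 'category'])
            ->map(fn ($prompt) => "### {$prompt->name} ({$prompt->category})")
            ->join("\n");
    });
}
```

Grouping tools this way keeps each definition small and makes it obvious at a glance which capabilities the toolkit exposes.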

And then make the Toolkit available to the MCP server in AppServiceProvider:

use App\Mcp\PromptLibraryToolkit;
use Kirschbaum\Loop\Facades\Loop;

public function boot(): void
{
    Loop::toolkit(new PromptLibraryToolkit());
}

Seeing it in action

With the toolkit implementation registered, we can now connect it to an MCP client and start using the tools.

Transports explained

Laravel Loop supports these MCP transport methods:

STDIO transport

STDIO transport executes your Laravel application as a subprocess, ideal for local development.
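For STDIO, the generated client configuration typically points the client at an artisan command run inside your project. A sketch of what such an entry can look like (the exact command name, path, and shape come from `loop:mcp:generate-config` for your client; the values below are placeholders):

```json
{
  "mcpServers": {
    "prompt-library": {
      "command": "php",
      "args": ["/path/to/your/app/artisan", "loop:mcp:start"]
    }
  }
}
```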

HTTP+SSE transport

Server-Sent Events transport operates over HTTP with persistent connections. You can use it for local development, and it is ideal for remote servers in production environments.

Configure the MCP client

Generate the MCP configuration for your client by running this command and following the instructions:

php artisan loop:mcp:generate-config

This command creates configuration you paste into your AI client settings. Once connected, your AI client lists all available tools automatically:

Claude Code:

Cursor:

With the MCP server connected and running, you can start having conversations with the assistant, and it will have access to all the tools.

Conversation example

Here’s what a real conversation with the prompts library looks like:

The AI discovers tools, calls them with the right parameters, and uses the results - all while having a natural conversation.

Last thoughts

We covered how to define tools and serve them through MCP with Laravel Loop. By applying the patterns demonstrated here, you can create intuitive tools for whichever AI agent you prefer.

I hope this post gives you an idea of the power of MCP. Connecting Laravel applications to AI models opens endless interaction possibilities for AI-assisted applications and development tooling.

To name a few ideas, imagine:

  • Sharing code components between projects

  • Advanced code refactoring using team-defined patterns

  • Well-formatted and structured documentation generation and maintenance

  • E2E testing generation

The possibilities are vast, and I believe we don't yet fully grasp MCP's potential.

Thanks for reading! Give Laravel Loop a try, and discover what’s possible.

Additional Resources

  • Laravel Loop Documentation

  • Model Context Protocol Specification

  • Prompt Library Application

Fernando Sánchez
Software Developer
