
Integrate with MCP

The Model Context Protocol (MCP) is an open protocol developed by Anthropic to standardize how language models interact with external systems, such as tools, memory backends, and context managers. MCP enables structured, dynamic communication between an LLM and its environment, empowering models to access external tools, retrieve real-time information, and perform complex, multi-step reasoning.
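
To make the protocol concrete, here is a minimal sketch of a standalone MCP server, written with the FastMCP helper from the official `mcp` Python SDK. The server name and the `add` tool are illustrative examples, not part of Ailoy:

from mcp.server.fastmcp import FastMCP

# Illustrative server; the name "demo" and the add tool are examples only.
mcp = FastMCP("demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default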

Using MCP with Ailoy

Ailoy Agents can seamlessly integrate with existing MCP-compliant servers. For example, the following code connects to the official GitHub MCP server:

from mcp import StdioServerParameters

agent.add_tools_from_mcp_server(
    "github",
    StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"]
    )
)

This launches the GitHub MCP server as a subprocess that communicates over standard I/O. The agent automatically discovers the tools exposed by the server and registers them in its internal toolset.
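
The official GitHub server authenticates with a personal access token read from the `GITHUB_PERSONAL_ACCESS_TOKEN` environment variable. As a sketch, assuming the token is already set in your shell, you can forward it to the subprocess through the `env` field of `StdioServerParameters`:

import os

agent.add_tools_from_mcp_server(
    "github",
    StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-github"],
        # Forward the token so the server subprocess can authenticate with GitHub
        env={"GITHUB_PERSONAL_ACCESS_TOKEN": os.environ["GITHUB_PERSONAL_ACCESS_TOKEN"]},
    )
)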

Querying via MCP Tools

Once the tools are registered, the agent can invoke them as needed when you call the query() method:

question = "Search the repository named brekkylab/ailoy and describe what it does based on its README.md."
for resp in agent.query(question):
    agent.print(resp)

As the output below shows, the agent invokes the GitHub MCP tools to look up the repository and summarize its README:


╭─ Tool Call: github-get_file_contents ────────────────────────────────╮
│ {                                                                    │
│   "repo": "ailoy",                                                   │
│   "path": "README.md",                                               │
│   "owner": "brekkylab",                                              │
│   "branch": "feature/add-qwen3-big-models"                           │
│ }                                                                    │
╰──────────────────────────────────────────────────────────────────────╯
╭─ Tool Result ────────────────────────────────────────────────────────╮
│ [                                                                    │
│   "{\"name\": \"README.md\", \"path\": \"README.md\", \"sha\":       │
│ \"563dde166b65319e7614b81a9d8330eee06537d3\", \"size\": 4443,        │
│ \"url\":                                                             │
│ \"https://api.github.com/repos/brekkylab/ailoy/contents/README.md?re │
│ f=feature/add-qwen3-big-models\", \"html_url\":                      │
│ \"https://github.com/brekkylab/ailoy/blob/feature/add-qwen3-big-mode │
│ ls/README.md\", \"git_url\":                                         │
│ \"https://api.github.com/repos/brekkylab/ailoy/git/blobs/563dde166b6 │
│ 5319e7614b81a9d8330eee06537d3\", \"download_url\":                   │
│ \"https://raw.githubusercontent....(truncated)                       │
╰──────────────────────────────────────────────────────────────────────╯
The repository **brekkylab/ailoy** is a **lightweight library** for building AI applications, such as **agent systems** or **RAG (Retrieval-Augmented Generation) pipelines**. It is designed to simplify the integration and usage of AI models, allowing developers to import and use AI capabilities with minimal effort.

### Key Features:
- **Support for Local and Cloud AI Models**: It enables the use of local AI models (e.g., Qwen3 variants) and cloud APIs (e.g., OpenAI, Gemini, Claude).
- **Multi-turn Conversations**: Supports conversational agents with customizable system messages.
- **Reasoning and Tool Calling**: Facilitates reasoning-based workflows and integration with tools (including `MCP`).
- **Vector Store Support**: Built-in integration with vector databases like `Faiss` and `ChromaDB`.
- **Cross-Platform Compatibility**: Works on Windows, macOS (Apple Silicon), and Linux, with specific hardware requirements for local model execution.

### Supported Models:
- **Local Language Models**: Qwen3 variants (0.6B, 1.7B, 4B, 8B, 14B, 32B, 30B-A3B).
- **Cloud Models**: OpenAI, Gemini, and Claude.
- **Embedding Models**: BAAI/bge-m3.

### Use Cases:
- **Chatbots**: Build simple or advanced chatbots with local or cloud models.
- **RAG Pipelines**: Combine retrieval and generation for enhanced AI applications.
- **Custom AI Agents**: Create agents with reasoning capabilities and tool integration.

### Requirements:
- **Hardware**: At least 8GB of GPU memory is recommended for most models, with higher requirements for larger models like Qwen3-8B (12GB).
- **OS**: Windows, macOS (Apple Silicon), or Linux with specific versions and drivers.

### Getting Started:
- **Node.js**: Install via `npm install ailoy-node` and use TypeScript examples.
- **Python**: Install via `pip install ailoy-py` and use Python examples.

This repository is in an **early development stage**, and APIs may change. For more details, refer to the [official documentation](https://brekkylab.github.io/ailoy/).

Complete Example

Here is the full source code to set up an agent, connect it to the GitHub MCP server, and issue a query:

from ailoy import Runtime, Agent, LocalModel
from mcp import StdioServerParameters

rt = Runtime()
with Agent(rt, LocalModel("Qwen/Qwen3-8B")) as agent:
    # Add tools from the GitHub MCP server
    agent.add_tools_from_mcp_server(
        "github",
        StdioServerParameters(
            command="npx",
            args=["-y", "@modelcontextprotocol/server-github"]
        )
    )

    question = "Search the repository named brekkylab/ailoy and describe what it does based on its README.md."
    for resp in agent.query(question):
        agent.print(resp)
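
Running this example requires a working Node.js installation, since the GitHub MCP server is launched via `npx`. Note also that the first use of a local model such as `Qwen/Qwen3-8B` typically downloads the model weights, so the initial run may take a while.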