
OctoPerf MCP Server

With the rapid rise of AI, the emergence of the MCP protocol reshaping human-machine collaboration, and testing tools like OctoPerf making their mark in the DevOps landscape, we’re clearly riding a new tech wave… and it’s got style.

I wanted to dive into this project because it felt both fun and challenging. It was the perfect opportunity to explore what AI, the MCP protocol, and OctoPerf could really offer… and to see how far we could push the possibilities. For this project/demo, I used VS Code with GitHub Copilot, but of course other IDEs and AI models could work just as well.

You will find the source code here; if you want to go further, you can test it with an OctoPerf free licence.

Throughout this article, I will explain the main concepts and technologies used, and there is a demo section too, to give you a better idea of this MCP OctoPerf server!


Table of Contents

  1. Project Overview
  2. MVP Approach and Technical Foundations
  3. Model Context Protocol (MCP) Explanation
  4. Dev Container Configuration
  5. Project Architecture
  6. Demo

Project Overview

What is MCP OctoPerf?

MCP OctoPerf is an intermediary server that allows AI assistants (like Claude, ChatGPT, etc.) to interact directly with the OctoPerf performance testing platform. These AI assistants can now:

  • Discover current user's workspaces and projects
  • Launch performance tests
  • Monitor test execution
  • Analyze results and metrics
  • Generate performance reports
  • Compare results
  • Make recommendations

Why was it created?

Traditional performance testing workflows require:

  1. Manual navigation through OctoPerf's web interface
  2. Manual correlation of metrics across multiple test runs

Beyond removing these manual steps, the project was also created to demonstrate the possibilities offered by this protocol and what AI is capable of, and to bring a real productivity gain.

With this MCP server, AI assistants can automate these tasks, enabling:

  • Performance test commands in natural language
  • Automated test execution and monitoring
  • Intelligent results analysis and comparison
  • Streamlined reports and insights


MVP Approach and Technical Foundations

MVP Development Strategy

This project was developed as a Minimum Viable Product (MVP) to demonstrate the feasibility and potential of integrating AI assistants with performance testing platforms via the Model Context Protocol. The MVP approach allows us to:

  • Validate the concept with essential features
  • Demonstrate practical value with core OctoPerf operations
  • Provide a solid foundation for future improvements
  • Enable rapid iteration based on user feedback

MCP Framework Foundations

This implementation builds on the excellent mcp-go framework developed by Mark3 Labs:

πŸ”— Framework Repository: https://github.com/mark3labs/mcp-go

The mcp-go framework provides:

  • Robust MCP protocol implementation following the official specification
  • Type-safe Go bindings for MCP messages and tool definitions
  • Built-in error handling and validation
  • Extensible architecture for custom tool development
  • Active maintenance and community support

This choice demonstrates how developers can quickly implement MCP servers using existing, well-maintained frameworks rather than building protocol handlers from scratch.

Extensibility and Future Improvements

While this MVP focuses on a core subset of the operations offered by OctoPerf, the architecture is designed to make the server easy to evolve.

Potential Improvements

  • Additional OctoPerf Features: Support for more API endpoints
  • Advanced Analysis: Custom metric calculations and comparisons
  • Review and Optimization: Scenario writing review
  • Accelerated Report Analysis: Analysis of Trend Reports and quick result compilation
  • Intelligent Recommendations: Use different models for different types of recommendations

Improvements can be multiple and interesting depending on use cases. Future versions of this MCP server could incorporate trend analysis tools, allowing AI assistants to perform sophisticated performance monitoring and provide proactive recommendations based on historical data patterns.

Why It Matters

This MVP demonstrates that implementing MCP servers is accessible to any developer familiar with modern programming languages. Using established frameworks like mcp-go, teams can quickly bridge their specialized tools with AI assistants, opening new possibilities for:

  • Natural language interfaces for complex technical systems
  • Automated workflows driven by AI understanding
  • Enhanced productivity through intelligent tool integration
  • Democratized access to specialized technical capabilities

Model Context Protocol (MCP) Explanation

What is MCP?

The Model Context Protocol (MCP) is an open standard that allows AI applications to securely connect to external data sources and tools. Think of it as a standardized way for AI assistants to "call functions" in external systems.

MCP Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   Messages MCP    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   Tool Calls        β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  AI Assistant   │◄─────────────────►│   MCP Server     │◄───────────────────►│  External API   β”‚
β”‚    (Claude)     β”‚    JSON-RPC 2.0   β”‚  (this project)  β”‚     HTTP/REST       β”‚    (OctoPerf)   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                   β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                     β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Key MCP Concepts

1. Tools

Tools are functions that AI assistants can call. Each tool has:

  • Name: Unique identifier (e.g., octoperf_run_test)
  • Description: What the tool does
  • Parameters: Required and optional inputs
  • Handler: Go function that executes the tool
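
For illustration, here is roughly how such a tool could be declared with mcp-go; the scenario_id parameter is hypothetical, not the project's exact schema:

package main

import "github.com/mark3labs/mcp-go/mcp"

// Sketch of a tool definition with mcp-go. The scenario_id parameter is
// illustrative; the real octoperf_run_test tool may expose different inputs.
var runTestTool = mcp.NewTool("octoperf_run_test",
    mcp.WithDescription("Launch a performance test for a given scenario"),
    mcp.WithString("scenario_id",
        mcp.Required(),
        mcp.Description("Identifier of the OctoPerf scenario to run"),
    ),
)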

2. Servers

MCP servers expose tools to AI assistants. For demonstration purposes, this server exposes 7 tools covering core OctoPerf operations.

3. Transport

MCP uses JSON-RPC 2.0 over stdio (standard input/output) for communication between AI assistants and servers.
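
To make this concrete, a tool call travelling over stdio is just a JSON-RPC 2.0 message; the values below are made up for illustration:

{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "octoperf_run_test",
    "arguments": { "scenario_id": "abc123" }
  }
}

The server answers with a matching result message, which the AI assistant then turns back into natural language.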

MCP vs Traditional APIs

Aspect         | Traditional API         | MCP
---------------|-------------------------|-----------------------------
Discovery      | Static documentation    | Dynamic tool discovery
Integration    | Custom code for each AI | Standard protocol
Security       | Custom auth per AI      | Standardized security model
Error Handling | API-specific            | Standardized error format

Dev Container Configuration

What is a Dev Container?

A Development Container (Dev Container) is a fully configured development environment running in a Docker container. It ensures that every developer has exactly the same environment, regardless of their local machine configuration.

Benefits of our Dev Container

  1. Consistency: Same Go version, dependencies, and tools for everyone
  2. Isolation: No conflicts with local installations
  3. Portability: Works on Windows, macOS, and Linux
  4. Speed: Pre-configured with all necessary tools

Container Configuration

Our Dev Container includes:

  • Alpine Linux v3.22 (lightweight base)
  • Go 1.23 (latest stable version)
  • Git for version control
  • Common CLI tools: wget, curl, ps, netstat, etc.
  • Pre-installed VS Code Go extension

How it Works

# Simplified view of our container
FROM golang:1.23-alpine
RUN apk add --no-cache git wget curl
WORKDIR /workspaces/mvp-mcp-octoperf

When you open the project in VS Code:

  1. VS Code detects .devcontainer/devcontainer.json
  2. Suggests "Reopen in Container"
  3. Builds the container (first time only)
  4. Mounts your code into the container
  5. You develop inside the container environment
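
The devcontainer.json itself can stay very small; a minimal version along these lines would be enough (the exact file in the repository may differ):

{
  "name": "mvp-mcp-octoperf",
  "build": { "dockerfile": "Dockerfile" },
  "customizations": {
    "vscode": {
      "extensions": ["golang.go"]
    }
  }
}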


Project Architecture

High-Level Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                      AI Assistant                           β”‚
β”‚                     (Claude/GPT)                            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚ MCP Protocol (JSON-RPC 2.0)
                      β”‚
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                 MCP Server (this project)                   β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚    main.go  β”‚  β”‚ handler.go  β”‚  β”‚   octoperf/         β”‚  β”‚
β”‚  β”‚   (entry)   β”‚  β”‚ (MCP layer) β”‚  β”‚ (business logic)    β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                      β”‚ REST HTTP API
                      β”‚
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                    OctoPerf API                             β”‚
β”‚            (https://api.octoperf.com)                       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Layer Responsibilities

1. MCP Protocol Layer (main.go)

  • Defines available tools and their schemas
  • Manages MCP server lifecycle
  • Routes tool calls to appropriate handlers
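
A stripped-down sketch of this layer, based on mcp-go; the server name, version, and tool wiring are assumptions:

package main

import (
    "log"

    "github.com/mark3labs/mcp-go/server"
)

func main() {
    // Create the MCP server; the name and version strings are illustrative.
    s := server.NewMCPServer("mcp-octoperf", "0.1.0")

    // Each tool is registered with its handler here, e.g.:
    // s.AddTool(runTestTool, handler.HandleRunTest)

    // Serve over stdio, the transport expected by MCP clients.
    if err := server.ServeStdio(s); err != nil {
        log.Fatalf("MCP server error: %v", err)
    }
}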

2. Handler Layer (handler.go)

  • Extracts parameters from MCP requests
  • Validates input data
  • Formats responses for MCP protocol
  • Handles errors in MCP format
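
A hypothetical handler for the octoperf_run_test tool could look like the sketch below; the Handler struct itself is sketched in the Dependency Injection section further down, and the exact parameter helpers vary between mcp-go versions:

// In handler.go (imports: context, github.com/mark3labs/mcp-go/mcp).
// HandleRunTest extracts and validates parameters, delegates to the business
// layer, and formats the outcome as an MCP result or error.
func (h *Handler) HandleRunTest(ctx context.Context, req mcp.CallToolRequest) (*mcp.CallToolResult, error) {
    scenarioID, err := req.RequireString("scenario_id")
    if err != nil {
        // Validation failures are reported as MCP tool errors, not Go errors.
        return mcp.NewToolResultError(err.Error()), nil
    }

    benchID, err := h.client.RunTest(ctx, scenarioID) // assumed business-layer signature
    if err != nil {
        return mcp.NewToolResultError("failed to start test: " + err.Error()), nil
    }

    return mcp.NewToolResultText("Test started, bench result id: " + benchID), nil
}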

3. Business Logic Layer (octoperf/adapter.go)

  • Handles OctoPerf API authentication
  • Builds HTTP requests
  • Handles API responses and errors
  • Implements business logic
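
A sketch of such an adapter method is shown below; the endpoint path and the Authorization header scheme are assumptions, so refer to the OctoPerf API documentation for the exact values:

package octoperf

import (
    "context"
    "fmt"
    "io"
    "net/http"
)

// Client adapts the OctoPerf REST API to the rest of the application.
type Client struct {
    baseURL    string // e.g. https://api.octoperf.com
    apiKey     string
    httpClient *http.Client
}

// RunTest starts a scenario and returns the raw response body.
func (c *Client) RunTest(ctx context.Context, scenarioID string) (string, error) {
    url := fmt.Sprintf("%s/runtime/scenarios/run/%s", c.baseURL, scenarioID) // path assumed
    req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, nil)
    if err != nil {
        return "", err
    }
    req.Header.Set("Authorization", "Bearer "+c.apiKey) // auth scheme assumed

    resp, err := c.httpClient.Do(req)
    if err != nil {
        return "", err
    }
    defer resp.Body.Close()

    body, _ := io.ReadAll(resp.Body)
    if resp.StatusCode >= 400 {
        return "", fmt.Errorf("octoperf API error %d: %s", resp.StatusCode, body)
    }
    return string(body), nil // in practice, decode the JSON bench result id
}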

Design Patterns Used

1. Adapter Pattern

The octoperf.Client adapts the OctoPerf REST API to our Go application interface.

2. Handler Pattern

Each MCP tool has a dedicated handler function that processes requests.

3. Dependency Injection

The Handler structure receives an octoperf.Client instance, making it testable.
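
One way to picture this (the interface, constructor, and stub below are illustrative, not necessarily the project's exact code): by depending on a small interface rather than the concrete client, unit tests can inject a fake implementation.

// In handler.go (imports: context).
// TestRunner is the slice of octoperf.Client behaviour the handlers need.
type TestRunner interface {
    RunTest(ctx context.Context, scenarioID string) (string, error)
}

// Handler receives its dependency through the constructor.
type Handler struct {
    client TestRunner
}

func NewHandler(client TestRunner) *Handler {
    return &Handler{client: client}
}

// In tests, a stub can replace the real OctoPerf client.
type stubRunner struct{}

func (stubRunner) RunTest(ctx context.Context, scenarioID string) (string, error) {
    return "fake-bench-id", nil
}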


Demo

Tools Used

We'll see how to use this MCP server through very simple prompts, using VS Code and Copilot (free version).

First, click the Copilot icon to open a conversation:

[Screenshot: copilot_icon]

Select agent mode, and then the AI model. Personally, I quite like using Claude 3.5 Sonnet as I find its responses relevant; feel free to test other models and form your own opinion!

[Screenshot: settings]

Once you’re ready, you can start the MCP server. There are different ways to do this: in VS Code, MCP servers appear in the Extensions view and you can start them from there, or you can press ⌘ + ⇧ + P (Ctrl + Shift + P on Windows), type "MCP: List Servers", select the OctoPerf server, and start it:

[Screenshot: list_server]

Once started, you can see that all 7 tools are successfully discovered and available:

[Screenshot: output_start_mcp]

It's time to test one of the first tools and list the workspaces we're connected to:

[Screenshot: workspaces_list]

Ok, let's check the 'Sandbox' workspace:

[Screenshot: project_list_result]

I'd like to see if a Runtime is available in this project:

[Screenshot: runtime_result]

Now let's execute this Runtime:

[Screenshot: ask_run]

I can even check the status of the test, let's try:

[Screenshot: status_run]

Ok, the test is complete, let's see what we got:

[Screenshot: get_report_metric1]

When I ask for more details:

[Screenshot: get_all_metrics1]

Ok! Let's run another test and compare the results of the last two tests we launched:

[Screenshot: retry_run]

Let's check the results:

[Screenshot: result_comparison]

And what if we ask for some recommendations?

[Screenshot: recommendation]

The MCP server, combined with AI, makes it possible to compare reports and get meaningful feedback, a bit like OctoPerf’s Insights: a really cool capability that brings a new dimension to analysis.

Data Transformation Flow

  1. User Input β†’ Natural language request
  2. AI Processing β†’ Determines required MCP tools
  3. MCP Request β†’ JSON-RPC 2.0 format
  4. Parameter Extraction β†’ Go types from JSON
  5. API Call β†’ HTTP request to OctoPerf
  6. Response Processing β†’ JSON to Go structures
  7. MCP Response β†’ Standardized format
  8. AI Interpretation β†’ Natural language response

Error Handling Flow

// Error propagation chain
OctoPerf API Error
    ↓
adapter.go (HTTP error handling)
    ↓
handler.go (MCP error formatting)
    ↓
MCP Protocol (standardized error)
    ↓
AI Assistant (user-friendly message)

Conclusion

This MCP OctoPerf Proof of Concept demonstrates how to bridge AI assistants with specialized APIs using the Model Context Protocol. The clean architecture separates concerns effectively:

  • MCP layer handles protocol communication
  • Handler layer manages request/response formatting
  • Business layer implements API communication
  • Dev Container ensures consistent development environment

The project serves as a solid foundation for extending AI capabilities into performance testing workflows, enabling natural language interactions with complex testing platforms.
