The question that changed Androma's architecture wasn't "how do we get more users?" It was "what if Claude could write a wiki page?"
Not generate text and paste it in. Actually interact with the platform -- search existing content, read prerequisite pages, draft a new article following the style guide, submit it for human review, and respond to feedback. That's what the Model Context Protocol makes possible, and building support for it turned out to be one of the best decisions I've made.
[illustration:devlog-mcp-flow]
## What MCP Actually Is
The Model Context Protocol is an open standard (originally from Anthropic, now broadly adopted) that lets AI assistants interact with external tools through a structured API. Instead of the assistant generating text that a human copy-pastes into an application, the assistant calls tools directly: `search_pages`, `get_theorem`, `create_page`, `submit_for_review`. The AI sees the platform's data in its native structure and operates on it with the same semantics as a human user.
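Concretely, an MCP tool is little more than a name, a JSON Schema describing its inputs, and a handler. Here is a minimal sketch of what a `search_pages` tool might look like -- the field shape follows the MCP tool-definition convention, but the handler body and page list are purely illustrative, not Androma's actual implementation:

```typescript
// A minimal MCP-style tool: name, input schema, async handler.
type Tool = {
  name: string;
  description: string;
  inputSchema: object;
  handler: (args: Record<string, unknown>) => Promise<unknown>;
};

const searchPages: Tool = {
  name: "search_pages",
  description: "Full-text search over wiki pages",
  inputSchema: {
    type: "object",
    properties: {
      query: { type: "string" },
      limit: { type: "number" },
    },
    required: ["query"],
  },
  // Hypothetical handler: a real server would query the wiki database.
  handler: async (args) => {
    const query = String(args.query).toLowerCase();
    const pages = ["Sobolev Space", "Banach Space", "Androma Notation"];
    return pages.filter((t) => t.toLowerCase().includes(query));
  },
};
```

The assistant never sees HTML or a rendered page; it sends arguments matching the schema and gets structured data back.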
For a mathematical wiki, this is transformative. An AI assistant that can read the existing definition of a [Sobolev Space](/page/Sobolev%20Space), check what notation conventions the [Notation Guide](/page/Androma%20Notation) uses, look up prerequisite theorems, and then draft a new page that integrates cleanly with the existing content graph -- that's qualitatively different from an AI that generates isolated text in a vacuum.
## Authentication: Two Paths
Building MCP support meant building a real API, which meant authentication. Two separate systems, because two different classes of client need access.
For programmatic access (Claude Desktop, local scripts, CI pipelines), I implemented Bearer token authentication. Users generate API keys in their account settings. The keys are SHA-256 hashed before storage -- the database never holds a plaintext key. Each key is scoped to specific capabilities: read-only, read-write, or full admin. Revocation is instant.
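The hash-before-store pattern is simple enough to sketch in a few lines with Node's built-in `crypto` module. Names like `generateKey` and the `andr_` prefix are illustrative assumptions, not Androma's real API:

```typescript
import { createHash, randomBytes, timingSafeEqual } from "node:crypto";

// Mint a key for the user; persist only its SHA-256 digest.
// The plaintext is shown to the user once and never stored.
function generateKey(): { plaintext: string; stored: string } {
  const plaintext = "andr_" + randomBytes(24).toString("hex"); // hypothetical prefix
  const stored = createHash("sha256").update(plaintext).digest("hex");
  return { plaintext, stored };
}

// On each request, hash the presented key and compare in constant time.
function verifyKey(presented: string, stored: string): boolean {
  const digest = createHash("sha256").update(presented).digest("hex");
  return timingSafeEqual(Buffer.from(digest), Buffer.from(stored));
}
```

Because only digests are stored, revoking a key is just deleting a row, and a database leak exposes no usable credentials.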
For third-party applications like ChatGPT's plugin system, I built OAuth 2.0 with dynamic client registration following RFC 7591. The client registers itself, gets credentials, and then goes through the standard authorization code flow. This was significantly more work than Bearer tokens, but it's the standard that OpenAI and other platforms expect.
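The registration exchange itself is a single POST. A sketch of the shape, with field names taken from RFC 7591 but endpoint, client metadata, and the minting logic all illustrative:

```typescript
import { randomBytes } from "node:crypto";

// Metadata a client sends to the registration endpoint (RFC 7591 field names).
const registrationRequest = {
  client_name: "Example Client",
  redirect_uris: ["https://client.example.com/callback"],
  grant_types: ["authorization_code"],
  response_types: ["code"],
  token_endpoint_auth_method: "client_secret_basic",
};

// Server-side sketch: validate minimally, then mint credentials.
function registerClient(req: typeof registrationRequest) {
  if (!req.redirect_uris.length) throw new Error("redirect_uris required");
  return {
    client_id: randomBytes(16).toString("hex"),
    client_secret: randomBytes(32).toString("hex"),
    client_id_issued_at: Math.floor(Date.now() / 1000),
    client_secret_expires_at: 0, // 0 = never expires, per RFC 7591
    ...req,
  };
}
```

After registration, the client holds a `client_id` and `client_secret` and proceeds through the ordinary authorization code flow.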
## Transport Layers
The MCP specification supports multiple transport mechanisms, and I ended up implementing two. Server-Sent Events (SSE) for Claude Desktop and local clients -- it's simple, works over HTTP, and handles the streaming response pattern naturally. Streamable HTTP for remote connections and web-based clients that need a more standard request-response model.
Getting both transports working simultaneously behind the same Express router was a week of careful middleware work. The tool definitions are shared; only the message framing differs. A single tool registry serves both transports.
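The shared-registry idea can be sketched as one map of handlers with two thin framing adapters on top. The function names and framing details here are illustrative, not the actual Express wiring:

```typescript
// One registry; each transport wraps the same handler in its own framing.
type Handler = (args: unknown) => Promise<unknown>;
const registry = new Map<string, Handler>();

// Stand-in tool so the registry has something to serve.
registry.set("search_pages", async () => ({ results: [] }));

async function call(toolName: string, args: unknown): Promise<unknown> {
  const handler = registry.get(toolName);
  if (!handler) throw new Error(`unknown tool: ${toolName}`);
  return handler(args);
}

// SSE framing: emit the result as a server-sent event.
function toSseFrame(toolName: string, result: unknown): string {
  return `event: tool_result\ndata: ${JSON.stringify({ toolName, result })}\n\n`;
}

// Streamable-HTTP framing: a plain JSON response body.
function toHttpBody(toolName: string, result: unknown): string {
  return JSON.stringify({ toolName, result });
}
```

Adding a tool means one `registry.set` call; both transports pick it up for free.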
## 40 Tools and Counting
The tool surface grew organically based on what felt limiting during actual usage. It started with the basics -- search pages, read a page, read a theorem. Then came write operations: create a page, edit a section, submit a change for review. Then the workflow tools: get the style rubric, check notation conventions, look up the prerequisite graph for a topic.
Today there are over 40 MCP tools. An AI assistant can perform a complete editorial workflow: search for pages on a topic, identify gaps, read the [notation guide](/page/Androma%20Notation) and [rubric](/page/Androma%20Rubric), draft new content that follows the conventions, upload diagrams to Backblaze B2 storage, submit the draft for review, and even respond to reviewer comments. The `get_rubric` tool is mandatory -- the server blocks all write operations until the AI has read the [content standards](/page/Androma%20Page%20Standards). This prevents the most common failure mode of AI-generated content: technically correct mathematics wrapped in formatting that doesn't match the platform's conventions.
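The rubric gate amounts to a small piece of per-session state checked before every tool call. A sketch of the idea -- the tool names match ones mentioned above, but the session tracking and error message are assumptions:

```typescript
// Write tools are rejected until the session has called get_rubric.
const WRITE_TOOLS = new Set(["create_page", "edit_section", "submit_for_review"]);
const rubricRead = new Set<string>(); // session IDs that have read the rubric

function checkToolCall(sessionId: string, toolName: string): void {
  if (toolName === "get_rubric") {
    rubricRead.add(sessionId);
    return;
  }
  if (WRITE_TOOLS.has(toolName) && !rubricRead.has(sessionId)) {
    throw new Error("Call get_rubric before any write operation.");
  }
}
```

The error message itself becomes the nudge: the model reads it, calls `get_rubric`, and retries with the conventions in context.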
## The Backblaze Migration
A side story that intersects with MCP: I migrated file storage from Cloudflare R2 to Backblaze B2 during this period. The motivation was cost predictability and the S3-compatible API that both human uploads and AI tool calls could use without abstraction differences. The migration itself was straightforward -- both services speak S3 -- but it forced me to standardize every file upload path in the codebase, which cleaned up a lot of inconsistency.
## The API-First Insight
Here's what I didn't expect: building for machine clients made the platform better for human users. When an AI tool calls `create_page`, it sends structured JSON with explicit fields for title, content, prerequisites, and notation declarations. This forced me to make the data model explicit in ways the web form never required. The web editor now uses the same structured API internally, which means the page creation flow is more consistent and less error-prone for everyone.
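A sketch of what that structured payload might look like, with validation shared by both clients. The exact field names beyond the four listed above are illustrative:

```typescript
// Explicit fields that the web form previously left implicit.
interface CreatePageRequest {
  title: string;
  content: string;                  // Markdown + LaTeX body
  prerequisites: string[];          // titles of pages this one depends on
  notation: Record<string, string>; // symbol -> meaning declarations
}

// Minimal validation used by both the MCP tool and the web editor.
function validateCreatePage(req: CreatePageRequest): string[] {
  const errors: string[] = [];
  if (!req.title.trim()) errors.push("title is required");
  if (!req.content.trim()) errors.push("content is required");
  for (const p of req.prerequisites) {
    if (!p.trim()) errors.push("empty prerequisite entry");
  }
  return errors;
}
```

One schema, one validator, two clients: the machine path and the human path can no longer drift apart.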
Citation autocomplete is another example. I built it initially so AI assistants could resolve `\cite{SobolevSpace}` references to actual wiki pages. Then I exposed the same resolution engine in the web editor as an autocomplete dropdown. Now human authors get instant citation suggestions as they type, powered by infrastructure that was built for machines.
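The core of such a resolver is small. This sketch matches `\cite{...}` keys against page titles by normalizing both sides; the matching strategy is an assumption, not necessarily how Androma's engine works:

```typescript
// Strip non-alphanumerics and lowercase, so "SobolevSpace" matches "Sobolev Space".
function normalize(s: string): string {
  return s.toLowerCase().replace(/[^a-z0-9]/g, "");
}

// Map each \cite key in the text to a page title, or null if unresolved.
function resolveCitations(
  text: string,
  pageTitles: string[]
): Map<string, string | null> {
  const index = new Map(pageTitles.map((t) => [normalize(t), t]));
  const resolved = new Map<string, string | null>();
  for (const match of text.matchAll(/\\cite\{([^}]+)\}/g)) {
    const key = match[1];
    resolved.set(key, index.get(normalize(key)) ?? null);
  }
  return resolved;
}
```

The autocomplete dropdown is the same lookup run on a prefix instead of a full key.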
The live editor preview with MathJax rendering, the notation glossary lookup, the prerequisite checker that warns you when you reference a concept without declaring it as a dependency -- all of these features originated in the MCP tool layer and migrated to the web UI because they were too useful to keep behind an API.
## What This Means
The bet I'm making is that mathematical content creation in the near future will be a collaboration between human mathematicians and AI assistants. The human provides insight, taste, and correctness verification. The AI handles the mechanical work of formatting, cross-referencing, notation consistency, and first-draft generation. Androma is built to be the platform where that collaboration happens, with review workflows that ensure nothing goes live without human approval.
Forty-plus tools later, Claude can do a full editorial review of a wiki page. It can't do original mathematics. But it can absolutely ensure that the mathematics a human writes is well-formatted, well-linked, well-cited, and consistent with the rest of the wiki. That's not a small thing.
*Teaching AI to Write Mathematics: The MCP Integration -- Part 3 of 6 in the Building Androma series.*