How an MCP tool is shaped
A tool is three things:
- Name — a short, snake_case identifier (`list_snippets`).
- Description — natural language. This is the only signal Claude has for deciding when to call it.
- Input schema — JSON Schema. The MCP SDK accepts Zod schemas and converts them.
Plus a handler that takes the validated input and returns a JSON-serializable result.
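Before wiring in the SDK, the shape is worth seeing as plain data. A dependency-free sketch — the `echo_text` tool and its schema are hypothetical, just here to show the pieces; the real tools below use the MCP SDK and Zod:

```typescript
// Dependency-free sketch of a tool's shape: name, description,
// JSON Schema input, and a handler returning MCP-style text content.
// The echo_text tool is hypothetical — it only illustrates the parts.
type Tool = {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema the client sees
  handler: (input: { text: string }) => { content: { type: "text"; text: string }[] };
};

const echoTool: Tool = {
  name: "echo_text",
  description:
    "Echo back the text the user provides. Use when the user asks to test the connection.",
  inputSchema: {
    type: "object",
    properties: { text: { type: "string" } },
    required: ["text"],
  },
  handler: ({ text }) => ({ content: [{ type: "text", text }] }),
};

console.log(JSON.stringify(echoTool.handler({ text: "hi" })));
```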
1. Replace the placeholder with a real MCP server
Update `supabase/functions/mcp/index.ts` so the actual SDK handles the request:

```ts
import { Hono } from "hono";
import { logger } from "hono/logger";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";
import { requireAuth, type AuthedUser } from "./auth.ts";
import { supabaseFor } from "./supabase.ts";
import { registerSnippetTools } from "./tools/snippets.ts";

const SUPABASE_URL = Deno.env.get("SUPABASE_URL")!;
const ISSUER = `${SUPABASE_URL}/auth/v1`;
const SELF_URL = Deno.env.get("MCP_SELF_URL")!;

const app = new Hono();
app.use("*", logger());

// Protected Resource Metadata (unchanged from step 5)
app.get("/.well-known/oauth-protected-resource", (c) =>
  c.json({
    resource: SELF_URL,
    authorization_servers: [ISSUER],
    scopes_supported: ["openid", "email", "profile"],
    bearer_methods_supported: ["header"],
  })
);

app.get("/health", (c) => c.json({ ok: true }));

// Mount the MCP RPC behind the auth middleware
app.all("/", requireAuth, async (c) => {
  const user = c.get("user") as AuthedUser;
  const supabase = supabaseFor(user);

  // Build an MCP server bound to this user's request context. We rebuild
  // per request because each request carries different auth, and the SDK's
  // server is cheap to construct.
  const server = new McpServer(
    { name: "shared-skills-mcp", version: "0.1.0" },
    { capabilities: { tools: {}, resources: {} } }
  );
  registerSnippetTools(server, { user, supabase });

  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: () => crypto.randomUUID(),
  });
  await server.connect(transport);

  // Hono's Request and the Web Fetch Request are compatible; pass them
  // straight through to the transport.
  return await transport.handleRequest(c.req.raw, c.res);
});

Deno.serve(app.fetch);
```

A few things to highlight:
- Per-request server. Each Streamable HTTP request gets its own server instance, scoped to one user. The SDK supports session reuse via `sessionIdGenerator`, but a per-request rebuild is simpler and fast enough at the volumes a team server sees.
- `registerSnippetTools(server, ctx)` is where we keep the tool definitions. Splitting them into their own module keeps `index.ts` readable.
- `transport.handleRequest(c.req.raw, c.res)` hands the request off to the SDK. It deals with framing, batching, and SSE streaming.
2. The tool module
Create `supabase/functions/mcp/tools/snippets.ts`:

```ts
import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import type { SupabaseClient } from "@supabase/supabase-js";
import { z } from "zod";
import type { AuthedUser } from "../auth.ts";

type Ctx = { user: AuthedUser; supabase: SupabaseClient };

export function registerSnippetTools(server: McpServer, ctx: Ctx) {
  const { supabase } = ctx;

  // ---------------------------------------------------------------------
  // list_snippets
  // ---------------------------------------------------------------------
  server.registerTool(
    "list_snippets",
    {
      description:
        "List prompt snippets the caller can see. " +
        "Use when the user asks for their snippets, the team's snippets, " +
        "or to find one by name or tag.",
      inputSchema: {
        workspace_id: z.string().uuid().optional()
          .describe("Filter to a specific workspace. Omit to list across all workspaces the user belongs to."),
        tag: z.string().optional()
          .describe("Filter to snippets carrying this tag."),
        search: z.string().optional()
          .describe("Substring match against title or body."),
        limit: z.number().int().min(1).max(50).default(20),
      },
    },
    async ({ workspace_id, tag, search, limit }) => {
      let q = supabase
        .from("snippets")
        .select("id, workspace_id, title, tags, visibility, updated_at")
        .order("updated_at", { ascending: false })
        .limit(limit);
      if (workspace_id) q = q.eq("workspace_id", workspace_id);
      if (tag) q = q.contains("tags", [tag]);
      // NB: .or() takes a comma-separated filter string, so raw user input
      // containing commas or parentheses can break the filter.
      if (search) q = q.or(`title.ilike.%${search}%,body.ilike.%${search}%`);

      const { data, error } = await q;
      if (error) throw new Error(error.message);

      return {
        content: [{
          type: "text",
          text: JSON.stringify(data, null, 2),
        }],
      };
    }
  );

  // ---------------------------------------------------------------------
  // get_snippet
  // ---------------------------------------------------------------------
  server.registerTool(
    "get_snippet",
    {
      description:
        "Fetch the full body of one snippet by id. " +
        "Use after list_snippets when the user picks one to read or edit.",
      inputSchema: {
        id: z.string().uuid(),
      },
    },
    async ({ id }) => {
      const { data, error } = await supabase
        .from("snippets")
        .select("id, workspace_id, created_by, title, body, tags, visibility, created_at, updated_at")
        .eq("id", id)
        .maybeSingle();
      if (error) throw new Error(error.message);
      if (!data) throw new Error("snippet not found or not visible to you");
      return {
        content: [{ type: "text", text: JSON.stringify(data, null, 2) }],
      };
    }
  );

  // ---------------------------------------------------------------------
  // save_snippet
  // ---------------------------------------------------------------------
  server.registerTool(
    "save_snippet",
    {
      description:
        "Create a new snippet, or update an existing one when `id` is provided. " +
        "Use when the user says 'save this', 'add a snippet', 'update the X snippet'.",
      inputSchema: {
        id: z.string().uuid().optional()
          .describe("Provide to update an existing snippet. Omit to create."),
        workspace_id: z.string().uuid()
          .describe("Target workspace. Required for new snippets; ignored on update."),
        title: z.string().min(1).max(120),
        body: z.string().min(1).max(60_000),
        tags: z.array(z.string().min(1).max(40)).max(20).default([]),
        visibility: z.enum(["private", "workspace", "public"]).default("workspace"),
      },
    },
    async (input) => {
      if (input.id) {
        // UPDATE — RLS lets the creator or a workspace owner write
        const { data, error } = await supabase
          .from("snippets")
          .update({
            title: input.title,
            body: input.body,
            tags: input.tags,
            visibility: input.visibility,
          })
          .eq("id", input.id)
          .select("id, workspace_id, title, visibility, updated_at")
          .maybeSingle();
        if (error) throw new Error(error.message);
        if (!data) throw new Error("snippet not found or you can't edit it");
        return {
          content: [{
            type: "text",
            text: `Updated snippet "${data.title}" (${data.id})`,
          }],
        };
      }

      // CREATE
      const { data, error } = await supabase
        .from("snippets")
        .insert({
          workspace_id: input.workspace_id,
          title: input.title,
          body: input.body,
          tags: input.tags,
          visibility: input.visibility,
          created_by: ctx.user.sub, // RLS will reject if this doesn't match auth.uid()
        })
        .select("id, workspace_id, title, visibility, updated_at")
        .single();
      if (error) throw new Error(error.message);
      return {
        content: [{
          type: "text",
          text: `Saved snippet "${data.title}" (${data.id}) in workspace ${data.workspace_id}.`,
        }],
      };
    }
  );
}
```

Worth noticing:
- All three tools rely on RLS, not app-level checks. A user trying to `save_snippet` into a workspace they aren't a member of will get a clean "row-level security violation" from Postgres, which we surface as the error message.
- Tool descriptions name the trigger phrases. The model uses the description as its routing signal. "Use when the user asks for their snippets..." beats a dry "lists snippets" — that's the difference between Claude calling the tool when needed and ignoring it.
- `save_snippet` upserts based on whether `id` is provided. A simpler tool surface than separate `create_snippet` + `update_snippet`.
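One caveat worth spelling out: PostgREST's `.or()` filter is a comma-separated string, so interpolating a raw `search` term that contains commas or parentheses can break (or change) the filter. A defensive sketch — `escapeForOrFilter` is a hypothetical helper, not part of supabase-js:

```typescript
// Hypothetical helper: strip PostgREST filter metacharacters before
// interpolating user input into an .or() filter string.
function escapeForOrFilter(term: string): string {
  return term.replace(/[(),]/g, " ").trim();
}

const search = "auth, onboarding (v2)";
const filter =
  `title.ilike.%${escapeForOrFilter(search)}%,body.ilike.%${escapeForOrFilter(search)}%`;
console.log(filter);
```

The remaining comma in `filter` is the intentional separator between the two `ilike` conditions; only the user-supplied term gets scrubbed.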
3. Test from the command line
Get a token (as in step 6) and make a JSON-RPC call directly:
```bash
TOKEN=$(curl -s -X POST \
  "https://<ref>.supabase.co/auth/v1/token?grant_type=password" \
  -H "apikey: <your-anon-key>" \
  -H "Content-Type: application/json" \
  -d '{"email":"you@example.com","password":"your-password"}' \
  | jq -r .access_token)

# List tools (Streamable HTTP requires Accept to allow both JSON and SSE)
curl -s \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' \
  http://127.0.0.1:54321/functions/v1/mcp/ | jq

# Call list_snippets
curl -s \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"list_snippets","arguments":{}}}' \
  http://127.0.0.1:54321/functions/v1/mcp/ | jq
```

If you see a tools array with three entries and a clean (probably empty) list back from list_snippets, the wiring is correct.
4. Test from the MCP Inspector
Anthropic ships a GUI inspector:
```bash
npx @modelcontextprotocol/inspector
```

In the inspector:
- Choose Streamable HTTP transport
- URL: `http://127.0.0.1:54321/functions/v1/mcp/`
- Authentication: Bearer Token, paste your `$TOKEN`
- Click Connect
You should see all three tools listed with their descriptions and input forms. Try a `save_snippet` — fill in the `workspace_id` (from your workspaces table), a title, and a body. Hit Send. Then `list_snippets` should show it.
5. The two failure modes to confirm
Insufficient permissions — try `save_snippet` with a `workspace_id` you don't belong to. RLS should reject:

```json
{ "error": "new row violates row-level security policy for table snippets" }
```

Tool unknown — call `tools/call` with `name: "drop_all_snippets"`. The SDK returns:

```json
{ "code": -32602, "message": "Tool not found: drop_all_snippets" }
```

Both are reassuring. The first proves RLS is gating writes; the second proves Claude can't invent tools you didn't expose.
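The first failure mode surfaces cleanly because the TypeScript SDK catches exceptions thrown inside tool handlers and returns them as a tool result flagged `isError`, rather than failing the whole RPC. A simplified, dependency-free sketch of that convention — `runTool` is illustrative, not SDK API:

```typescript
// Sketch of the SDK's convention: a handler exception becomes a tool
// result with isError: true, so the model sees the message as content.
type ToolResult = { content: { type: "text"; text: string }[]; isError?: boolean };

async function runTool(handler: () => Promise<ToolResult>): Promise<ToolResult> {
  try {
    return await handler();
  } catch (e) {
    const message = e instanceof Error ? e.message : String(e);
    return { content: [{ type: "text", text: message }], isError: true };
  }
}

const result = await runTool(async () => {
  throw new Error('new row violates row-level security policy for table "snippets"');
});
console.log(result.isError, result.content[0].text);
```

This is why `throw new Error(error.message)` in the handlers is enough — the Postgres message travels back to the model as readable text.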
Three tools down. Step 9 adds the sharing tools — share_snippet, list_workspaces, invite_to_workspace — and exercises the workspace-owner role checks.