design·ai
A design studio, drawn by two hands

A canvas both designers and agents can draw on.

Walls, rooms, furniture, materials, light — every edit is a single SceneOp, whether it comes from a cursor or a tool call. Humans and AI don't take turns. They share the plan.

Built on the Model Context Protocol
Fig. 1 — Plan of a two-bed apartment, 690 sq ft. Dimensions in feet. Walls auto-miter; openings mapped by arc-length.
  • Floor plans → 3D, instantly
  • AI drafts, you approve
  • Ops-based history
  • MCP-native API
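The caption above mentions that openings are mapped by arc-length. A minimal sketch of what that can mean for a curved wall, assuming an arc represented by center, radius, and angle range (the real wall model in design·ai may differ):

```typescript
// Sketch: placing an opening along a curved wall by arc-length.
// The Arc shape here is an assumption for illustration, not the
// product's internal representation.
type Vec2 = { x: number; y: number };

interface Arc {
  center: Vec2;
  radius: number;
  startAngle: number; // radians
  endAngle: number;   // radians, greater than startAngle
}

const arcLength = (arc: Arc): number =>
  arc.radius * (arc.endAngle - arc.startAngle);

// Map a distance s, measured along the wall, to a point on the arc.
function pointAtArcLength(arc: Arc, s: number): Vec2 {
  const t = Math.max(0, Math.min(s, arcLength(arc))); // clamp to the wall
  const theta = arc.startAngle + t / arc.radius;
  return {
    x: arc.center.x + arc.radius * Math.cos(theta),
    y: arc.center.y + arc.radius * Math.sin(theta),
  };
}
```

Storing a door's position as arc-length rather than a coordinate is why it stays put on the wall when the curvature changes.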
For designers

A studio that respects the gesture.

Chained polyline walls with 15° axis snap. Curved walls with a draggable control point. Doors that swing in or out. Zones you can color for legends. Every action goes through a scene-op ledger — so undo, redo, and an AI's proposal all ride the same track.

  • R to rotate, ⇧F to zoom to selection, ? for the rest
  • Catalog → drag → material swatches per wall
  • Share a link; client opens to walk mode
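A minimal sketch of the scene-op ledger idea: every edit, human or AI, lands on one invertible track, so undo is just "apply the inverse". The op shapes and the Ledger class are illustrative; the real SceneOp union lives in @design-ai/shared.

```typescript
// Two illustrative ops; every op has an inverse.
type Op =
  | { kind: "addItem"; itemId: string; sku: string }
  | { kind: "removeItem"; itemId: string; sku: string };

const invert = (op: Op): Op =>
  op.kind === "addItem"
    ? { kind: "removeItem", itemId: op.itemId, sku: op.sku }
    : { kind: "addItem", itemId: op.itemId, sku: op.sku };

class Ledger {
  private applied: { op: Op; actor: "human" | "ai" }[] = [];

  // Human edits and AI proposals ride the same track, tagged by actor.
  apply(op: Op, actor: "human" | "ai"): void {
    this.applied.push({ op, actor });
  }

  // Undo pops the last entry and returns its inverse, ready to apply.
  undo(): Op | undefined {
    const last = this.applied.pop();
    return last && invert(last.op);
  }
}
```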
draw · place · revise
For agents

A canvas that speaks tools.

The MCP server exposes the same scene model your UI uses. scene.addWall, catalog.search, layout.suggest — every tool is a Zod schema in @design-ai/shared. Diff-preview the proposal, accept per op, revert with one call.

  • Stdio + HTTP/SSE transports
  • Per-user API keys scope access
  • Audit trail with actor="ai"
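A sketch of the tool → op → scene path described above: a tool call arrives with a name and JSON arguments, is validated, and becomes a SceneOp tagged actor="ai" in the audit trail. The function and field names here are illustrative, not the real API surface (which validates with the Zod schemas in @design-ai/shared).

```typescript
// Abbreviated op shapes, for illustration only.
type SceneOp =
  | { kind: "addWall"; wallId: string; a: [number, number]; b: [number, number] }
  | { kind: "addItem"; itemId: string; sku: string };

interface Scene {
  ops: { op: SceneOp; actor: "human" | "ai" }[];
}

// A tool call carries a name plus untrusted JSON arguments, as over MCP.
function handleToolCall(scene: Scene, name: string, args: unknown): SceneOp {
  if (name !== "scene.addWall") throw new Error(`unknown tool: ${name}`);
  const raw = args as { wallId: string; a: [number, number]; b: [number, number] };
  if (typeof raw.wallId !== "string") throw new Error("wallId must be a string");
  const op: SceneOp = { kind: "addWall", wallId: raw.wallId, a: raw.a, b: raw.b };
  scene.ops.push({ op, actor: "ai" }); // audit trail tags the agent as actor
  return op;
}
```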
addWall · addItem · applyOps · tool → op → scene
Three vignettes

What an afternoon in the studio looks like.

  1. 01

    Sketch the shell


    Drop a 13×16 bedroom template. Nudge a wall into a bay. Toggle a window to sill 3'6". The joints re-miter as you drag.

  2. 02

    Furnish the thing

    Search the catalog by size: “fits in 7' 3" wall.” Favorite a sofa. Rotate with R. The 3D viewer shows real W × D × H — not boxes.

  3. 03

    Let the agent finish

+ SOFA-001 · + LAMP-PENDANT · + RUG-002 · + PLANT-003

    Prompt Claude with “cozy Scandi, under $3k.” Fifteen ops come back. Reject the pendant. Accept the rest. Version it as “Client review v1.”
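The accept/reject step in this vignette can be sketched as a simple filter over a proposed batch of ops, each one individually addressable. The ProposedOp shape and function name are illustrative, not the product's API:

```typescript
// A proposal is a batch of ops the agent wants to apply, each with an id
// so the designer can accept or reject them one by one.
interface ProposedOp {
  id: string;
  kind: string;
  sku?: string;
}

// Keep every proposed op except those the designer rejected.
function acceptProposal(
  proposal: ProposedOp[],
  rejectedIds: Set<string>,
): ProposedOp[] {
  return proposal.filter((op) => !rejectedIds.has(op.id));
}
```

Rejecting the pendant and accepting the rest is one call: `acceptProposal(ops, new Set(["pendant-op-id"]))`.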

The canvas is the API

One schema, four entry points.

Web, server, MCP server, render worker — all of it imports the same SceneOp union. No parallel type files, no “v2” adapters. When you add a wall on the canvas, the agent sees the same document a millisecond later.

  • scene.* : read, applyOps, versions, share
  • catalog.* : search, byId, bySkus
  • layout.* : suggest, apply (per-op diff)
  • render.* : enqueue, status, list
  • export.* : CSV, PDF, GLB
packages/shared/src/scene/ops.ts (Zod · strict)
const AddWallOp = z.object({
  kind:     z.literal('addWall'),
  floorId:  FloorId,
  wallId:   WallId,
  a:        Vec2,
  b:        Vec2,
  thickness: z.number().positive().optional(),
  height:    z.number().positive().optional(),
});

// tRPC, MCP, the worker — all import this.
export const SceneOp = z.discriminatedUnion('kind', [
  AddWallOp,
  AddItemOp,
  ReplaceMaterialOp,
  SetWallCurveOp,
  /* …twenty more, all invertible */
]);
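What the shared discriminated union buys each entry point can be shown with plain TypeScript narrowing, no Zod required. This is a hand-rolled sketch with abbreviated op shapes, not the real union:

```typescript
// Two abbreviated ops standing in for the full SceneOp union.
type SceneOp =
  | { kind: "addWall"; wallId: string }
  | { kind: "addItem"; itemId: string };

function describe(op: SceneOp): string {
  switch (op.kind) {
    case "addWall":
      return `wall ${op.wallId}`;
    case "addItem":
      return `item ${op.itemId}`;
    default: {
      // Exhaustiveness check: adding a new op kind to the union fails to
      // compile until every importer handles it.
      const _never: never = op;
      return _never;
    }
  }
}
```

Because web, server, MCP server, and worker all import one union, a new op kind is a compile error everywhere until it is handled everywhere.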
What it won't do

Straight talk, because the canvas is honest.

Colophon

Set in Fraunces and Inter. Rendered in react-three-fiber. Driven by the Model Context Protocol. Free to try. Open a project.