AI-Generated Content Hub

Artifact Hub

A small hub where teams publish AI-generated artifacts, get them auto-tagged and described, collect structured feedback, and share them via time-boxed links — with first-class MCP support for Claude Desktop.

Next.js 16 App Router · MCP (StreamableHTTP) · Drizzle + Postgres · Auth.js v5 / Google · Anthropic SDK
The Problem

AI output lives nowhere

Teams generate mockups, diagrams, notes, and documents with AI tools — but the outputs end up in:

  • Slack threads that vanish after 14 days
  • Random local folders on one person's laptop
  • Email attachments nobody can find later
  • Private chat histories, not shareable with clients

Artifact Hub is the smallest thing that demonstrates the full loop a coaching/AI-content team cares about:

produce → publish → review → share

Architecture

Two services, one data layer

┌─────────────────────────┐               ┌─────────────────────────┐
│ artifact-hub            │               │ artifact-hub-mcp        │
│ Next.js 16 App Router   │     fetch     │ Standalone MCP server   │
│ UI + REST API           │◀── on write ──│ (Express + SDK)         │
│ Google sign-in + API-key│               │                         │
│ :10000 (Render PORT)    │               │                         │
└──────────┬──────────────┘               └──────────┬──────────────┘
           │                                         │
           ▼                                         ▼
   ┌───────────────┐                      ┌─────────────────────┐
   │ Render disk   │                      │ Shared Postgres     │
   │ /var/data     │                      │ hub_* tables        │
   │ (uploads)     │                      │ (direct reads from  │
   │               │                      │  both services)     │
   └───────────────┘                      └─────────────────────┘

The web service is the sole owner of the uploads disk. The MCP server publishes by POSTing to the web API; reads go directly to Postgres.

Data Model

Four tables, all prefixed hub_

| Table | Purpose |
| --- | --- |
| hub_artifacts | id, title, type enum, file_path, tags[], author_email, timestamps |
| hub_feedback | author_name, content, rating 1–5, nullable parent_id (reply support) |
| hub_share_links | nanoid(21) token, expires_at, max_views, view_count |
| hub_api_keys | SHA-256 hashed personal access tokens for MCP clients |

The hub_ prefix is load-bearing — the local dev DB is shared with another project, so drizzle.config.ts uses tablesFilter: ["hub_*"]. Migrations are generate-only; db:push is banned because it tries to drop sequences it doesn't own.
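The prefix filter can be expressed in drizzle.config.ts roughly like this. A sketch, not the project's actual file: the schema path and output directory are assumptions, but tablesFilter is the documented drizzle-kit option for scoping to a table-name pattern.

```typescript
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  dialect: "postgresql",
  schema: "./src/db/schema.ts", // assumed path
  out: "./drizzle",             // assumed migrations directory
  dbCredentials: { url: process.env.DATABASE_URL! },
  // The shared dev DB contains other projects' tables; migrations
  // must only ever see (and touch) tables owned by this project.
  tablesFilter: ["hub_*"],
});
```

With the filter in place, `drizzle-kit generate` diffs only hub_* tables, which is why generate-then-migrate is safe where db:push is not.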

Product Decisions

What makes it feel right

Upload anything

HTML / image / PDF are first-class (render inline). Everything else degrades to a download button.

AI metadata by default

Only a title is required. Tags and description are inferred at upload time. Users can override — most don't.

Feedback lives with the artifact

Inline in the DB, not scattered across Slack threads. A single "Summarize" button condenses long discussions.

Time-boxed share links

Signed tokens with TTL + optional view cap. No account required for external reviewers.
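The TTL-plus-view-cap check is small enough to sketch. A minimal version, assuming a row shape like the hub_share_links table above (the function name and the null-means-unlimited convention are assumptions):

```typescript
interface ShareLink {
  token: string;           // nanoid(21) in the real schema
  expiresAt: Date;
  maxViews: number | null; // assumed: null = no view cap
  viewCount: number;
}

// True if the link may still be served. The caller is expected to
// increment viewCount atomically on each successful view.
function isShareLinkValid(link: ShareLink, now: Date = new Date()): boolean {
  if (now >= link.expiresAt) return false;
  if (link.maxViews !== null && link.viewCount >= link.maxViews) return false;
  return true;
}
```

Because the check needs only the row itself, an expired or exhausted link can be rejected before any artifact bytes are read from disk.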

MCP Integration

Seven tools, one identity

| Tool | Backend |
| --- | --- |
| publish_artifact | POST to web /api/artifacts with x-api-key |
| get_artifact | Postgres direct |
| search_artifacts | Postgres (ilike + tag filter) |
| list_my_artifacts | Postgres (scoped to token's user) |
| add_feedback | Postgres direct |
| summarize_feedback | Postgres + Anthropic Sonnet |
| create_share_link | Postgres (nanoid + TTL) |

The token is the identity. Artifacts published via MCP auto-attach to the Google account the token was created under — no authorEmail argument anywhere.
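The hashed-token flow from hub_api_keys can be sketched with node:crypto alone. The "ahk_" prefix and function names are illustrative, not from the source; only the SHA-256 hashing is stated above:

```typescript
import { createHash, randomBytes } from "node:crypto";

// Verification: hash the presented x-api-key, look the hash up in
// hub_api_keys, and use the owning user's account as the author identity.
function hashApiKey(raw: string): string {
  return createHash("sha256").update(raw).digest("hex");
}

// Issuance: persist only the hash; the raw token is shown to the user once.
function mintApiKey(): { raw: string; hash: string } {
  const raw = "ahk_" + randomBytes(24).toString("base64url"); // prefix is an assumption
  return { raw, hash: hashApiKey(raw) };
}
```

Storing only the digest means a leaked database dump does not leak usable tokens, and lookup stays a single indexed equality query.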

Uploads & Security

Three layers before anything touches disk

1. Size cap · 10 MB

Enforced in the browser before streaming, and enforced again in the route handler, which returns 413 if the client-side check was bypassed.
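The server-side half of the gate reduces to a pure check (handler wiring is assumed, not from the source). Content-Length is advisory, so the buffered byte count is what actually decides:

```typescript
const MAX_UPLOAD_BYTES = 10 * 1024 * 1024; // the 10 MB cap

// Returns an HTTP status to respond with, or null to continue
// into validation layers 2 and 3.
function sizeCapStatus(declaredLength: number | null, actualBytes: number): number | null {
  if (declaredLength !== null && declaredLength > MAX_UPLOAD_BYTES) return 413;
  if (actualBytes > MAX_UPLOAD_BYTES) return 413; // Payload Too Large
  return null;
}
```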

2. Content validation

MIME allowlist · magic-byte match against declared MIME · PE/ELF/Mach-O header block · EICAR test-string block.
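A sketch of the magic-byte layer, with a handful of signatures; the real table would cover every allowlisted MIME type, and the helper names are assumptions:

```typescript
function startsWith(buf: Uint8Array, sig: number[]): boolean {
  return sig.every((byte, i) => buf[i] === byte);
}

// Executable-header block: PE ("MZ"), ELF (0x7F "ELF"), Mach-O magics.
// Note 0xCAFEBABE also matches Java class files, which is fine here:
// neither belongs in an artifact upload.
function looksExecutable(buf: Uint8Array): boolean {
  const sigs = [
    [0x4d, 0x5a],             // PE / MZ
    [0x7f, 0x45, 0x4c, 0x46], // ELF
    [0xfe, 0xed, 0xfa, 0xce], // Mach-O 32-bit
    [0xfe, 0xed, 0xfa, 0xcf], // Mach-O 64-bit
    [0xca, 0xfe, 0xba, 0xbe], // Mach-O fat binary
  ];
  return sigs.some((s) => startsWith(buf, s));
}

// Declared MIME vs. magic bytes, for a couple of allowlisted types.
function magicMatches(mime: string, buf: Uint8Array): boolean {
  switch (mime) {
    case "image/png":       return startsWith(buf, [0x89, 0x50, 0x4e, 0x47]);
    case "application/pdf": return startsWith(buf, [0x25, 0x50, 0x44, 0x46]); // "%PDF"
    case "text/html":       return true; // text formats have no reliable magic
    default:                return false;
  }
}
```

The EICAR check is a plain substring match against the standard test string and is omitted here to keep the sketch short.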

3. VirusTotal scan · optional

When VIRUSTOTAL_API_KEY is set, every upload is posted to /api/v3/files and polled for up to 15 s. Any malicious or suspicious count rejects. Timeout / missing key falls through — VT is defense-in-depth, not the primary gate.
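The accept/reject decision boils down to a small function over the stats object a VirusTotal v3 analysis returns (the interface below mirrors last_analysis_stats; treating "no result" as accept encodes the fall-through behavior described above):

```typescript
// Shape of the stats block in a VirusTotal v3 analysis response.
interface VtStats {
  malicious: number;
  suspicious: number;
  harmless: number;
  undetected: number;
}

type Verdict = "reject" | "accept";

// null = scan timed out within the 15 s budget, or no API key configured.
// That falls through to accept: VT is defense-in-depth, not the primary gate.
function vtVerdict(stats: VtStats | null): Verdict {
  if (stats === null) return "accept";
  return stats.malicious > 0 || stats.suspicious > 0 ? "reject" : "accept";
}
```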

HTML artifacts render in a sandboxed iframe (sandbox="allow-scripts allow-same-origin"). That pairing is the remaining weak point, since a same-origin frame's scripts can reach the embedding page. DOMPurify on ingest is next, so we can drop allow-scripts.

Where LLMs Earn Their Keep

Three single-turn completions, zero agentic loops

Tag generation

At upload: title + first 2000 chars → JSON array of 3–6 lowercase tags. Parsed with /\[[\s\S]*\]/ to strip hallucinated prose.
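The parse step can be sketched as a pure function around that regex (function name and the cap at 6 tags are assumptions; the prompt asks for 3–6):

```typescript
// Pull the bracketed span out of the model reply, then normalize.
// Everything outside the brackets ("Sure, here are your tags:" and
// friends) is discarded by the /\[[\s\S]*\]/ match.
function parseTags(raw: string): string[] {
  const match = raw.match(/\[[\s\S]*\]/);
  if (!match) return [];
  try {
    const parsed = JSON.parse(match[0]);
    if (!Array.isArray(parsed)) return [];
    return parsed
      .filter((t): t is string => typeof t === "string")
      .map((t) => t.toLowerCase().trim())
      .slice(0, 6);
  } catch {
    return []; // malformed JSON: publish proceeds with empty tags
  }
}
```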

Description generation

1–2 sentence summary of what the artifact is. Fills in when the user left the field blank.

Feedback summarization

On demand, when ≥2 feedback entries exist. Formats rows into numbered lines with ratings, asks Claude for consensus / disagreement / actionable suggestions.

LLM output never blocks the write path. If the Anthropic API is down, publish still succeeds with empty tags and null description — cheap to backfill later.
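The keep-the-write-path-alive rule can be sketched as one guard that every enrichment call runs through (names here are illustrative, not the project's):

```typescript
// Convert any enrichment failure into a cheap fallback value, so the
// artifact row is written no matter what the Anthropic API is doing.
async function enrichOrFallback<T>(task: () => Promise<T>, fallback: T): Promise<T> {
  try {
    return await task();
  } catch {
    return fallback; // API down, rate-limited, bad JSON: all become the fallback
  }
}

// Usage sketch: publish succeeds with empty tags / null description,
// which is cheap to backfill later.
// const tags = await enrichOrFallback(() => generateTags(title, text), []);
// const description = await enrichOrFallback(() => describe(text), null);
```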

Deployment

Two Render services, auto-deploy from main

Web · artifact-hub

Build: npm install && npm run build
Start: npm start (prestart runs db:migrate)

next-server idles at 120–180 MB — fits the 512 MB tier. next dev spikes past the cap and OOMs.

MCP · artifact-hub-mcp

Build: npm install
Start: npx tsx src/mcp-server/index.ts

Reachable at /mcp over StreamableHTTP. Fresh McpServer + transport per request.

Uploads live on a 5 GB Render persistent disk at /var/data. Durable across deploys, but single-region and coupled to one service — first thing to swap for production.

With Another Week

What's next

Try It

Thanks.

Walkthrough

See WRITEUP.md § Walkthrough for a 9-step demo script.

