Selected Work · 01

AI product systems that make creative work scale

  • Hundreds
    Creatives using the product
  • Thousands
    AI variations per day
  • ~5×
    Faster than prior flow
  • Full
    Adoption across creative org

Epsilon · 2021 → present · Staff Software Engineer · Tech lead

AI Creative Production Platform

A production creative tool with AI threaded through concept, layout, and edit — practical UX tooling backed by an eval skill every feature plugs into.

The problem

Creative production didn't scale. Every ad variation was a human-in-the-loop craft act inside Illustrator and Photoshop. We needed a real product workflow where a small team could ship thousands of variations a day without sacrificing brand fidelity, usability, or creative control.

The approach

I led the product architecture for a web-based builder that turns creative intent into production-ready variations. A homegrown agentic orchestration loop over tiered Gemini models handles concepting, layout, and edits, while production-data bridges — like real-time font-ID lookups — keep output tied to the actual brand system.
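
The tiered-model loop described above can be sketched roughly like this — a minimal, illustrative TypeScript sketch, not the production code. All names (tiers, `callModel`, `validate`) are hypothetical stand-ins; a real version would call the Gemini API and validate against the brand system:

```typescript
// Illustrative tiered-orchestration sketch: try the cheapest model tier
// first, escalate only when the output fails validation, so most traffic
// never touches the expensive tier.

type Tier = "flash" | "pro";

interface Result {
  tier: Tier;
  output: string;
  valid: boolean;
}

// Stand-in for a real model call; a production version would hit the API.
function callModel(tier: Tier, prompt: string): string {
  return tier === "pro" ? `detailed:${prompt}` : `quick:${prompt}`;
}

// Stand-in validator; a real one might check schema or brand constraints.
function validate(output: string): boolean {
  return output.startsWith("detailed:");
}

function runTiered(prompt: string, tiers: Tier[]): Result {
  let last: Result = { tier: tiers[0], output: "", valid: false };
  for (const tier of tiers) {
    const output = callModel(tier, prompt);
    last = { tier, output, valid: validate(output) };
    if (last.valid) break; // stop escalating once a tier passes
  }
  return last;
}
```

The escalation order is just an array, which keeps the loop debuggable and easy to retune as model tiers change.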

The durable part

I built a reusable eval skill that any AI feature in our org can plug into. It scores outputs on layout accuracy, color fidelity, and pixel-diff against baselines — and a meta-optimizer reads those scores and tunes the prompts, model tier, pipeline steps, and schemas. Closer to DSPy-style automated pipeline tuning than critique-and-regenerate. Invest once; compound leverage across every AI surface downstream.
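
The score-and-select core of that loop can be sketched as below. This is a hypothetical illustration, not the real harness: the metric names match the prose, but the weights, config shape, and `pickBest` helper are invented for the example:

```typescript
// Illustrative eval-skill sketch: each candidate pipeline config is scored
// on several axes, and the meta-optimizer keeps whichever config scores
// best on the weighted aggregate.

interface Scores {
  layout: number;    // layout accuracy, 0..1
  color: number;     // color fidelity, 0..1
  pixelDiff: number; // pixel-diff vs. baseline, 0..1 (1 = identical)
}

interface Config {
  name: string;
  modelTier: string;
}

// Weighted aggregate; real weights would be tuned per AI surface.
function aggregate(s: Scores): number {
  return 0.4 * s.layout + 0.3 * s.color + 0.3 * s.pixelDiff;
}

function pickBest(candidates: { config: Config; scores: Scores }[]): Config {
  return candidates.reduce((best, c) =>
    aggregate(c.scores) > aggregate(best.scores) ? c : best
  ).config;
}
```

A full optimizer would mutate prompts, tiers, and pipeline steps between rounds; the selection step shown here is the part every feature shares.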

Where it is headed

Leading an agent-first refactor — migrating deterministic orchestration into fully agentic workflows, with the eval harness as the safety rail.

Selected Work · 02

Founded Core UI, Epsilon's design system

  • ~10 yrs
    In production
  • Company-wide
    Adoption across products
  • WCAG 2.1 AA
    Embedded, not audited
  • Storybook + Figma
    Surfaces in sync

Epsilon · 2017 → present · Founder & UX engineering lead

Core UI — a design system that became shared infrastructure

A design system bridging Figma and Storybook, with accessibility embedded as a constraint, in production across Epsilon's products for almost a decade.

The problem

Designers and engineers were on different sides of a wall. Designs shipped in Figma, components got rebuilt per product, accessibility was a checklist someone ran at the end, and every team rediscovered the same primitives. The gap between design intent and production reality was where quality leaked.

The approach

I founded the UX Engineering discipline at Epsilon and grew what started as a side project into the company's shared design system. I partnered with designers in Figma and engineers in Storybook so the same component had one canonical surface in both worlds. Tokens, patterns, and documentation lived alongside the code, not in a separate doc that drifted.

The durable part

I treated WCAG 2.1 AA as a design constraint, embedded into every component — not a checklist run at audit. Color contrast, focus order, ARIA semantics, keyboard paths: all baked into the primitives. Same skill-as-abstraction pattern I'd later apply to AI evals: invest once at the substrate, every product team downstream gets the leverage for free. Ship a button, ship accessibility.

Why it still matters

Core UI is the foundation everything else is built on — including the AI-assisted UI work that follows. The design system is where taste, accessibility, and system thinking compound; the AI tooling is where that compounding gets pointed at agents. Live at coreui.epsilon.com.

Selected Work · 03

Shipped solo, runs itself

  • 2 days
    Empty repo → live URL
  • 48
    Curated comedy styles
  • 80+
    Auto-generated SEO posts
  • Auto
    Trend-to-publish pipeline

Solo · 2026 → live · makemea.ai

MakeMeA — a small product shipped fast

An AI photo style-transfer app shipped in 2 days from empty repo to production, wrapped in an autonomous trend-to-publish content engine.

The build

Pair-programmed with agents from empty repo to live URL in 48 hours. Next.js · TypeScript · OpenAI, deployed straight to production. Every design decision optimized for shippability: tailored prompts per style, watermark as a single overlay step, minimal server infra.

The autonomous engine

The longer-term bet: MakeMeA runs a trend-to-publish pipeline without human intervention. A trend finder spots emerging visual styles; the pipeline spins up a new style (prompt + params), generates sample images, produces short-form videos, and auto-writes SEO/AEO blog posts — 80+ posts, one per style, every style its own organic discovery surface.
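
The stage-by-stage shape of that pipeline can be sketched as a fold over typed steps. All stage names and artifact fields below are hypothetical; the real stages call generation APIs rather than returning placeholder strings:

```typescript
// Illustrative trend-to-publish sketch: each stage takes the artifact
// produced so far and enriches it, so a full run is a left-to-right fold
// over the stage list.

interface StyleRun {
  trend: string;
  prompt?: string;
  images?: string[];
  post?: string;
}

type Stage = (run: StyleRun) => StyleRun;

// Turn a spotted trend into a style definition (prompt + params).
const defineStyle: Stage = (run) => ({
  ...run,
  prompt: `in the style of ${run.trend}`,
});

// Generate sample images for the new style (placeholders here).
const renderSamples: Stage = (run) => ({
  ...run,
  images: [`${run.trend}-sample-1.png`, `${run.trend}-sample-2.png`],
});

// Auto-write the SEO post that becomes the style's discovery surface.
const writePost: Stage = (run) => ({
  ...run,
  post: `New style: ${run.trend}`,
});

function runPipeline(trend: string, stages: Stage[]): StyleRun {
  return stages.reduce((run, stage) => stage(run), { trend } as StyleRun);
}
```

Keeping each stage a pure function over the run artifact makes it easy to add, reorder, or retry stages without human intervention.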

The honest result

Live, functional, not profitable. The market is crowded; competitors give away what I charge for. But the exercise validated AI-native velocity as a repeatable baseline, and the auto-content infra is more valuable than the product it happens to wrap. See it live →

What I learned

The system is the moat, not the UI. Taste differentiation only works at the speed you can keep leading it. Shipping in two days is a capability; product-market fit is a separate problem that velocity alone doesn't solve.

Lab

Small tools, agent skills & useful experiments

● Early · Live

Sprawl

Real-time AI generative art. Type a question; Voronoi-stippled dots resolve into SDXL generations through a custom 2D/WebGL renderer.

sprawl.place →

● Early · Published

OSS Contributor

Agent skill that discovers GitHub issues, forks, implements fixes, and opens PRs with full AI disclosure. Same skill-as-abstraction pattern as the Epsilon eval skill.

clawhub.com →

● DX · Published

Visual QA — agent-pipeline DX

Internal tooling for the moment agents started shipping visual breakage faster than humans could catch it. Playwright baselines, pixel-diff gates, and deployment thresholds wired directly into agent runs — so a regression fails the build before it ships, not after a designer notices. Workflow automation that gives every team that plugs in a quality floor for free.

clawhub.com →
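
The pixel-diff gate at the heart of that tooling can be sketched as below. This is a minimal illustration over raw pixel arrays; the actual setup uses Playwright screenshots and an image-diff library, and the threshold value here is an assumption:

```typescript
// Illustrative pixel-diff gate: fail the build when the rendered output
// drifts from the baseline by more than a small threshold.

// Fraction of pixels that differ between a baseline and a new render.
function diffRatio(baseline: number[], current: number[]): number {
  if (baseline.length !== current.length) return 1; // size change = full fail
  const changed = baseline.filter((px, i) => px !== current[i]).length;
  return changed / baseline.length;
}

// CI gate: a regression fails the build before it ships.
function passesGate(
  baseline: number[],
  current: number[],
  threshold = 0.01
): boolean {
  return diffRatio(baseline, current) <= threshold;
}
```

Wiring this check into agent runs is what turns visual QA from something a designer notices into something the build enforces.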

● Practical AI Utility · Custom GPT

Printable Social Stories

A custom GPT that helps parents and educators turn a plain-language topic into a child-friendly, printable social story. It starts from a real caregiving need: fast, supportive materials for transitions, routines, and unfamiliar situations.

The interesting part is the product judgment, not the complexity. It packages AI around a focused workflow with clear prompting, a gentle tone, optional customization, and offline-friendly output.

Open custom GPT →
  • Prompt design
  • Education
  • Accessibility
  • Human-centered AI

Foundation

Twelve years of depth behind the work

Civiq Smartscapes · 2013 – 2017

Creative Technology

Large-format interactive installations for major brands. Personalized animated photo experience for Coca-Cola across malls nationwide. Award-winning Kinect-gesture Photo Booth for PepsiCo. 3D interactive experience for Accenture at JFK Airport. The throughline: real-time systems under public load, shipped for brands that measured them.

Kinect · Unity3D · Real-time systems · Creative technology

How I Work

How I like to work

  • Measure the work

    Generation is table stakes. Teams that win in AI measure output quality and tune for it — the eval layer compounds across every feature that plugs in.

  • Package the useful parts

    Package cross-cutting AI work (evals, agents, tools) as composable skills. It's what turns a team into a platform.

  • Simple orchestration, smart meta-optimization

    Homegrown in-memory loops beat heavy frameworks when the AI landscape moves weekly. Keep the substrate debuggable; let the optimizer do the adapting.

  • Humans in the loop today, agents tomorrow

    Every AI moment I ship is a skill an agent can eventually orchestrate. Deterministic today, agentic as the safety rails mature.

  • Production over prototypes

    I write the code that ships and operate the systems that run. Twelve years of making that trade honestly.

  • Accessibility is a design constraint

    Not a checklist. WCAG 2.1 AA embedded into components, not bolted on at audit.

  • AI-assisted dev is the substrate

    Daily Claude Code, Cursor, and MCP — not as a buzzword, as the working surface. CLAUDE.md and AGENTS.md as committed primitives, agent skills published on ClawHub, and a point of view on where this is headed: the loop between designer, engineer, and agent gets tighter every quarter, and the people who treat their tools as a system get the compounding.