Mark Manfrey
Systems Design · 2025 — Present

Multi-Surface
Experience System

6 coordinated surfaces · Real-time state · UE5 simulation · AI interaction

Client

Glydways

Role

Principal Design Technologist

Year

2025 — Present

Tools

Unreal Engine 5 · Figma · Swift · Arduino · Python · ElevenLabs

01

The Problem

Glydways is building a new category of autonomous transit — small, on-demand pods operating in dedicated guideways. No prior design patterns exist for how a rider should be onboarded, guided, and comforted through an experience that has no driver, no conductor, and no established behavioral script.

The brief had to be written before the work could begin. That meant defining what the experience even was — across physical space, digital surfaces, acoustic environment, and thermal comfort — before designing any of it.

“No established design patterns. The brief had to be written before the work could begin.”

02

System Architecture

Coordinating across six surfaces simultaneously — each with its own timing, fidelity, and failure mode.

Kiosk

Access point · Ticketing · Wayfinding

VMS

Variable message sign · Platform status

In-Cabin Display

Onboarding · Journey state · Egress

Lighting

Interior + exterior state signaling

Audio

AVAS · In-cabin AI interaction

UE5 Sim

Real-time validation environment
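The coordination model behind these six surfaces can be sketched as a single journey-state event fanned out to per-surface renderers, each translating the same state into its own vocabulary. This is an illustrative sketch only; all names are hypothetical, not Glydways code:

```python
from dataclasses import dataclass
from enum import Enum, auto

class JourneyState(Enum):
    BOARDING = auto()
    EN_ROUTE = auto()
    ARRIVING = auto()

@dataclass(frozen=True)
class StateEvent:
    state: JourneyState
    station: str

# Each surface interprets the same event in its own vocabulary.
def kiosk(e: StateEvent) -> str:
    return f"Next pod: {e.state.name.lower()} · {e.station}"

def lighting(e: StateEvent) -> str:
    return {"BOARDING": "green", "EN_ROUTE": "soft white", "ARRIVING": "amber"}[e.state.name]

def cabin_display(e: StateEvent) -> str:
    return f"{e.state.name.title().replace('_', ' ')} · {e.station}"

SURFACES = [kiosk, lighting, cabin_display]

def broadcast(event: StateEvent) -> list:
    # One state event; every surface renders its own response.
    return [render(event) for render in SURFACES]
```

Keeping a single canonical state and pushing interpretation to the edges is what lets each surface keep its own timing, fidelity, and failure mode without drifting out of sync.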

03

Hardware Prototyping

Lab prototype — kiosk & station mockup

UXSync — prototype coordination (localhost)

Physical prototyping was the only way to understand the experience. Coordinating Arduino, Linux, and iOS devices via real-time WebSocket messaging allowed full multi-modal state testing — lighting, audio, and display surfaces responding simultaneously to a single event trigger.
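The fan-out logic of a coordination hub like UXSync can be modeled in a few lines. This is a synchronous sketch of the broadcast pattern only; the actual system used WebSocket framing across Arduino, Linux, and iOS clients, and the class and field names here are hypothetical:

```python
import json

class UXSyncHub:
    """Minimal model of a localhost coordination hub: one event in,
    the same frame delivered to every registered surface."""

    def __init__(self):
        self.clients = {}  # surface name -> list of received frames

    def register(self, name: str) -> None:
        self.clients[name] = []

    def broadcast(self, event: dict) -> str:
        frame = json.dumps(event)
        # Single event trigger; every connected surface gets the same frame.
        for inbox in self.clients.values():
            inbox.append(frame)
        return frame
```

In use, registering `lighting`, `audio`, and `display` clients and broadcasting `{"state": "arriving", "station": "Civic Center"}` delivers one identical frame to all three, which is exactly the single-trigger, multi-surface response described above.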

04

In-Cabin Intelligence

Designing an LLM-driven audio interaction system for a context where the rider may be anxious, disoriented, or non-verbal. The system needed to read journey state and respond appropriately — not just answer questions.

Multi-agent architecture with ElevenLabs TTS, tuned for acoustic comfort in a small enclosed space. Deliberation is audible — the system thinks out loud at a volume and cadence matched to the environment.
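One way to express volume and cadence matched to journey state is a small delivery-parameter table consulted before synthesis. The values and function below are hypothetical illustrations, not the ElevenLabs API or the shipped tuning:

```python
# Hypothetical delivery parameters per journey state (illustrative values).
PACING = {
    "boarding": {"rate": 0.95, "volume_db": -6,  "pause_s": 0.6},
    "en_route": {"rate": 0.90, "volume_db": -10, "pause_s": 0.9},
    "arriving": {"rate": 1.00, "volume_db": -4,  "pause_s": 0.4},
}

def delivery_for(state: str, rider_anxious: bool = False) -> dict:
    """Pick speech delivery parameters for the current journey state."""
    params = dict(PACING[state])
    if rider_anxious:
        # Slow down and soften when the rider seems uneasy.
        params["rate"] -= 0.05
        params["volume_db"] -= 2
    return params
```

The point of the sketch is that the voice system reads state before it speaks: the same answer is delivered more slowly and more quietly mid-journey in a small enclosed cabin than on an open platform.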

In-cabin display — journey state (Civic Center)

05

Outcomes

6

Coordinated surfaces

Kiosk · VMS · Display · Lighting · Audio · UE5 — all responding to a single state event

0→1

Zero prior patterns

No existing UX conventions for autonomous transit rider experience to reference or adapt

HW+SW

Fully integrated

Physical prototype delivery spanning Arduino, iOS, Linux, and cloud AI — not a simulation

Next project

Hyphen — Director of Design · Food automation