Mark Manfrey
Glydways · Research · Hardware Prototype

UXR Lab &
Hardware Prototype

How do you validate an experience that doesn't exist yet?

2025 — Present

Client

Glydways

Role

Principal Design Technologist

Type

UX Research · Hardware Prototype

Stack

Arduino · iOS · Linux · WebSocket · M5Stack · ElevenLabs

01

The Question

Autonomous transit is a new category. You can't run a usability study on something that doesn't exist yet, and you can't design a rider experience by looking at what competitors have built, because there are no competitors to look at. The research problem: how do you generate real data about a hypothetical experience?

The answer was to build a physical approximation — a multi-modal lab environment that could simulate the sensory conditions of an autonomous pod ride well enough to surface real rider responses.

“You can't usability-test something that doesn't exist. So we built a close-enough version of it.”

02

The Rig

iOS Controller

Orchestrates state transitions · Sends WebSocket events to all connected devices

Arduino

LED lighting control · Receives state and drives interior + exterior color/brightness

M5Stack Display

In-cabin display surface · Shows UI states synchronized to ride phase

Linux Audio

AVAS playback · AI voice interaction via ElevenLabs TTS pipeline

All four surfaces respond simultaneously to a single state event fired from the iOS controller — boarding, in-transit, arrival, emergency. The rig made it possible to run structured UXR sessions with real participants experiencing a coordinated multi-modal response, not a screen prototype.
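The fan-out pattern above can be sketched as a small controller that broadcasts one JSON state event to every connected surface. This is a hypothetical illustration, not the rig's actual code: in the real setup the subscribers would be WebSocket connections to the Arduino, the M5Stack, and the Linux audio host; here they are plain callbacks so the logic stands alone.

```python
import json

# Hypothetical sketch of the controller's single-event fan-out. Each
# "subscriber" stands in for one WebSocket connection to a device.

RIDE_STATES = ("boarding", "in-transit", "arrival", "emergency")

class RideStateController:
    def __init__(self):
        self.subscribers = []  # one send-function per connected device
        self.state = None

    def subscribe(self, send):
        self.subscribers.append(send)

    def fire(self, state):
        """Broadcast one state event to every connected surface."""
        if state not in RIDE_STATES:
            raise ValueError(f"unknown ride state: {state}")
        self.state = state
        event = json.dumps({"type": "state", "value": state})
        for send in self.subscribers:  # all surfaces get the same payload
            send(event)

# Example: three fake device endpoints record what they receive.
received = {"leds": [], "display": [], "audio": []}
controller = RideStateController()
for name in received:
    controller.subscribe(lambda msg, n=name: received[n].append(msg))

controller.fire("boarding")
controller.fire("in-transit")
```

The point of the single-event design is that no device ever interprets the ride phase on its own; lighting, display, and audio stay in lockstep because they all consume the same message.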

03

What We Learned

Acoustic environment dominates

Participants consistently cited sound before visual feedback when describing their comfort level. Audio sequencing became a primary design constraint, not a secondary one.

State transitions need anticipation

Sudden state changes — even positive ones like arrival — caused anxiety. A 2–3 second anticipatory cue before each transition measurably reduced stress responses.
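The anticipatory-cue rule lends itself to a simple orchestration pattern: pair every state change with a cue event a fixed lead time earlier. A minimal sketch, with an assumed 2.5 s lead and hypothetical event names (not the team's actual implementation):

```python
import json

# Hypothetical sketch: every transition is preceded by an "anticipate" cue
# a fixed lead time before the state itself fires. Events are returned as
# (delay_seconds, payload) pairs that a scheduler could dispatch.

CUE_LEAD_S = 2.5  # within the 2-3 s window the research pointed to

def schedule_transition(state, at_s):
    """Return the anticipatory cue and the state event for a transition at `at_s`."""
    cue = (max(at_s - CUE_LEAD_S, 0.0),
           json.dumps({"type": "anticipate", "next": state}))
    change = (at_s, json.dumps({"type": "state", "value": state}))
    return [cue, change]

# Example: an arrival scheduled 10 s from now gets its cue at 7.5 s.
events = schedule_transition("arrival", at_s=10.0)
```

Encoding the cue at the scheduling layer, rather than per device, keeps the anticipation window consistent across light, display, and audio.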

Physical props outperform screen prototypes

Participants suspended disbelief much more readily in the physical rig than in any screen-based simulation. Embodied experience generates qualitatively different feedback.

Next project

Hyphen — Director of Design · HMI for a robotic kitchen