Automotive / Concept · CMU × Ford · 2024
Ford inVision
An augmented reality helmet interface for semi-autonomous electric bicycles — designed to surface what riders need without pulling their eyes from the road.
Timeline
8 Weeks
Role
Lead Designer
Team
3 Designers
Type
Multimodal Design

Brief
Ford Motor Company asked us to reimagine rider-vehicle interaction — not for cars, but for electric bicycles. In 8 weeks, our 3-person CMU team explored a paradigm shift: what if the interface lived in the helmet, not on the handlebars?
Problem
"How do we display information to the rider without distracting them to an unsafe level?"
Distracted driving is a factor in 27% of all crashes. Existing e-bike interfaces force riders to look down, breaking their connection to the road and their surroundings.
Research
Three interviews. Three clear signals.
"Screens either don't show enough information, or are too crowded."
"Interface controls were nearly impossible to use while riding."
"Riders use e-bikes specifically to feel connected to their surroundings."



Solution
Information in your line of sight.
An AR interface embedded in the helmet visor. By putting information where riders already look — instead of on the handlebars — we could display more while demanding less attention. And because it requires wearing a helmet, it promotes safety by design.

Status bar · Unlocked

Status bar · Locked (riding)
Feature 01
Navigation
Turn-by-turn AR directions overlaid in the rider's field of view. Set a destination without stopping or reaching for your phone.

Feature 02
My Ride
Post-ride power consumption broken down by segment. Understand your battery usage on familiar routes and plan your next trip accordingly.
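The per-segment breakdown described above could be computed by integrating logged motor draw over time and bucketing by segment. This is a minimal sketch under assumed names and data shapes (`Sample`, `energy_by_segment`), not the actual system's code.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    segment: str    # hypothetical route-segment label, e.g. "Forbes Ave climb"
    watts: float    # instantaneous motor draw at this sample
    seconds: float  # time elapsed since the previous sample

def energy_by_segment(samples: list[Sample]) -> dict[str, float]:
    """Sum watt-hours per segment from a ride log (Wh = W * s / 3600)."""
    totals: dict[str, float] = {}
    for s in samples:
        totals[s.segment] = totals.get(s.segment, 0.0) + s.watts * s.seconds / 3600
    return totals

log = [Sample("flat", 150, 60), Sample("climb", 400, 60), Sample("flat", 150, 60)]
print(energy_by_segment(log))  # watt-hours per segment of the ride
```

Grouping by segment rather than by timestamp is what lets a rider compare battery cost on the same familiar route across rides.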

Feature 03
Battery
Range estimates per speed mode. Balance how fast you want to go against how far you need to get — before you're stranded.
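The speed-versus-range tradeoff reduces to a back-of-the-envelope calculation: remaining energy divided by an assumed consumption rate for each mode. The mode names and Wh/km figures below are illustrative assumptions, not Ford data.

```python
# Illustrative only: assumed consumption rates (Wh per km) per speed mode.
CONSUMPTION_WH_PER_KM = {"eco": 8.0, "normal": 12.0, "sport": 18.0}

def estimated_range_km(battery_wh: float, mode: str) -> float:
    """Remaining range = remaining battery energy / assumed draw for the mode."""
    return battery_wh / CONSUMPTION_WH_PER_KM[mode]

# With 250 Wh left, the rider sees one range figure per mode and picks the tradeoff.
for mode in CONSUMPTION_WH_PER_KM:
    print(mode, round(estimated_range_km(250, mode), 1), "km")
```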

Safety · Level 3 Autonomy
The Takeover Moment
When the bike's autonomy can't safely handle a situation, control has to return to the rider — immediately. We layered three channels: vibrating handlebars for haptic feedback, a flashing AR overlay with an obstacle outline, and a clear on-screen prompt for what to do next. No single channel is enough on its own.
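The layered takeover could be modeled as one event fanned out to redundant output channels at once. The channel interfaces here are hypothetical stand-ins for the haptic handlebars, the flashing AR outline, and the on-screen prompt.

```python
from typing import Callable

# Hypothetical channel renderers; in the concept these map to handlebar
# haptics, the flashing AR obstacle outline, and the on-screen instruction.
def haptic(e: dict) -> str:
    return f"haptic: pulse handlebars ({e['reason']})"

def ar_overlay(e: dict) -> str:
    return f"ar: flash outline around {e['obstacle']}"

def prompt(e: dict) -> str:
    return f"prompt: {e['instruction']}"

CHANNELS: list[Callable[[dict], str]] = [haptic, ar_overlay, prompt]

def request_takeover(event: dict) -> list[str]:
    """Fan a single takeover event out to every channel simultaneously,
    so no single modality is a point of failure."""
    return [channel(event) for channel in CHANNELS]

alerts = request_takeover({"reason": "sensor dropout",
                           "obstacle": "stopped car",
                           "instruction": "Brake and take the handlebars"})
```

The point of the fan-out is the redundancy itself: a rider with eyes elsewhere still feels the handlebars, and a rider wearing gloves still sees the overlay.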

Physical Design
Fewer buttons. More feel.
We redesigned the handlebar — curved for natural grip, reduced to just two buttons: [enter] and [back]. Menu navigation runs on eye-tracking. Physical controls are the fallback, not the primary input.
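The two-button-plus-gaze model sketched above can be expressed as a small state machine: eye-tracking moves the highlight, [enter] descends, [back] ascends. The menu contents and class below are illustrative assumptions.

```python
# Hypothetical menu tree; leaves (None) are actions rather than submenus.
MENU = {"Navigation": {"Set destination": None, "Cancel route": None},
        "Battery": {"Eco": None, "Normal": None, "Sport": None}}

class HelmetMenu:
    def __init__(self, tree: dict):
        self.tree = tree
        self.selected = next(iter(tree))
        self.path = []  # stack of (parent tree, selected key) for [back]

    def gaze(self, key: str) -> None:
        """Eye-tracking: highlight whatever item the rider looks at."""
        if key in self.tree:
            self.selected = key

    def enter(self) -> None:
        """[enter] button: descend into the highlighted submenu."""
        child = self.tree[self.selected]
        if isinstance(child, dict):
            self.path.append((self.tree, self.selected))
            self.tree = child
            self.selected = next(iter(child))

    def back(self) -> None:
        """[back] button: ascend one level, restoring the prior highlight."""
        if self.path:
            self.tree, self.selected = self.path.pop()

m = HelmetMenu(MENU)
m.gaze("Battery"); m.enter(); m.gaze("Sport")  # gaze selects, buttons commit
```

Keeping gaze as selection and buttons as commitment means a stray glance never triggers an action, which matters when the rider's eyes are also doing the steering.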

Companion App
When you're not riding.
The AR interface handles everything while in motion. The mobile app handles everything else: GPS tracking, ride history, speed-mode and light controls, and navigation input on a real keyboard.

Reflection
Designing for a medium you can't fully test is a forcing function.
We built most of this project without access to AR hardware, which meant committing to information hierarchy and interaction patterns before we could validate them in the actual medium. When we finally got our hands on a Microsoft HoloLens 2, the constraints we'd imposed on ourselves turned out to be the right ones. If this shipped, the real next step would be contextual testing: real routes, real riders, real motion.