TRACKPAD HAPTICS

Guiding navigation through haptic feedback, without relying on visuals.

Role: Interaction Designer

Context: University Project

Duration: 8 Weeks

Tools: Microcontrollers, JavaScript, User Testing

This project explored whether haptic feedback alone could help users understand and navigate a graphical user interface.

THE GOAL

To understand how touch communicates structure, boundaries, and direction.

RESEARCH QUESTION

How far can a trackpad-based vibrotactile system enhance the spatial experience of a GUI?

Early Exploration

I created a simple experiment with four shapes, each with a distinct vibration. Participants explored visually first, then by touch, and finally with eyes closed. This revealed how vibrations communicate shape and boundaries.

Physical Setup: Haptic Trackpad

Digital Setup: Shape GUI

The Shape Language

Each geometry was paired with a haptic signature to explore what people can actually perceive.

SQUARE

Strong, rough vibration → Sharp boundaries

CIRCLE

Smooth, fluid pulses → Continuous edges

BLOB

Irregular, mixed rhythms → Ambiguous perception

PATTERN

Rhythmic "bumps" → Textured surface
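As a rough sketch, the shape language above can be modeled as a lookup from geometry to vibration parameters. The names and values below are illustrative, not the project's actual firmware settings:

```javascript
// Hypothetical shape → vibration signature mapping (illustrative values only).
const SHAPE_SIGNATURES = {
  square:  { amplitude: 0.9, frequencyHz: 40,  pattern: "rough" }, // strong, rough → sharp boundaries
  circle:  { amplitude: 0.5, frequencyHz: 120, pattern: "pulse" }, // smooth, fluid pulses → continuous edges
  blob:    { amplitude: 0.6, frequencyHz: 80,  pattern: "mixed" }, // irregular rhythms → ambiguous perception
  pattern: { amplitude: 0.7, frequencyHz: 60,  pattern: "bumps" }, // rhythmic bumps → textured surface
};

// Look up the signature for the shape currently under the cursor.
function signatureFor(shape) {
  const sig = SHAPE_SIGNATURES[shape];
  if (!sig) throw new Error(`Unknown shape: ${shape}`);
  return sig;
}
```

On a microcontroller, a signature like this would be translated into a PWM duty cycle and pulse timing for the vibration motor.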

“I thought I was creating the texture, but it was creating me.”

The feedback didn’t just add texture; it changed how the shapes were understood. When it matched what users expected, it felt natural. When it didn’t, they leaned in. Haptics became something you interpret, not just something you feel.

Exploration Insights

- Haptic feedback changed how people experienced the shapes.
- More ambiguous vibrations actually drew people in.
- Even low-resolution haptics helped with recognizing shapes.

From Shapes to Systems

Next, I tested a GUI with structured haptics: participants navigated with limited visuals and relied on the haptic feedback instead, so I could study how touch can guide interaction.

Physical Prototype: Full Setup

Close-Up of Trackpad Setup

Four interface types

Each interface asked a different question about the feedback:
- Could it guide movement?
- Could it support navigation without visual cues?
- Could it communicate interaction through touch alone?

1. Intro
2. Guidance
3. Context
4. Full Tactile

1. Intro: Familiarize Users

Establishing the 'vocabulary.' Before navigating, users had to learn what specific frequencies felt like.

Shared Haptic Language

Each interface element had its own vibration pattern to help users build a mental map of the system.

HEADERS (H1/H2)

Continuous, deep waves → Signals structural weight

BODY TEXT / SECONDARY

Sharp, rapid pulses → Reflects granular detail

INTERACTION ANTICIPATION

Scaling intensities → Builds anticipation as the cursor approaches a clickable item

BOUNDARY FEEDBACK

Tactile notch rhythm → Signals interface edges
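The "interaction anticipation" behavior above can be sketched as a distance-to-intensity function: the closer the cursor gets to a clickable target, the stronger the motor output. This is a minimal illustration; the function name and the 200 px range are assumptions, not the project's tuned values:

```javascript
// Hypothetical anticipation curve: motor intensity rises linearly as the
// cursor nears a target, clamped to the [0, 1] duty-cycle range.
function anticipationIntensity(distancePx, maxDistancePx = 200) {
  if (distancePx >= maxDistancePx) return 0;      // beyond range: no feedback
  const t = 1 - distancePx / maxDistancePx;       // 0 at range edge, 1 on target
  return Math.min(1, Math.max(0, t));             // clamp for safety
}
```

An easing curve (e.g. quadratic) instead of a linear ramp would make the final approach feel more pronounced; the linear version keeps the sketch simple.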

Mental Models

After each session, participants sketched the interfaces from memory, revealing how well touch alone had built their mental models of the system.

System Insights

- People often moved the cursor around to scan the interface and feel where elements started and ended.
- When the feedback was inconsistent, people started to lose trust in it.
- Longer sessions led to “haptic fatigue,” showing that touch has its limits.

Working with haptics felt like shaping a language of rhythm, intensity, and duration: one that isn’t seen, but gradually understood through touch.