Dot Score Assistant – Helping Athletes Start Smarter

The Challenge: Confusing Inputs, Miscalculated Scores, and Lost Users

The original Dot Score calculation flow lacked guidance, clarity, and educational context—leaving new users uncertain about what inputs were needed, what the numbers meant, or how to progress.

TriDot personalizes training using a proprietary fitness metric called the Dot Score—a benchmark that calibrates daily workouts based on an athlete’s swim, bike, and run performance. But the original onboarding experience asked users to manually input recent race times—such as their best 5K or 10K.

For experienced athletes, this wasn’t an issue. But for new users and fitness-curious runners, this screen became a major blocker. Many didn’t know their times, hadn’t done a recent race, or felt unsure what data to enter. Their options were:

  • Guess and risk inaccuracy
  • Abandon the flow
  • Input incorrect data and get a miscalibrated training plan (which could lead to injury)

Foundational wireframes explored flow, content, and guidance—establishing the Dot Score Assistant’s value through testing

Our Insight: Low Conversion Wasn’t About Motivation—It Was About Clarity

This input step created friction and decision fatigue, and led to either drop-off or bad data. The experience lacked flexibility, guidance, and assurance: three key principles of modern onboarding UX. Analytics showed users were willing to provide data, but only if they understood it.

When we looked at this step through a user empathy lens, we realized the deeper issue:

We were asking for expert-level input from people who were just starting out.

The Solution: Meet the Dot Score Assistant

To address the high friction around inputting performance metrics, I led the creation of the Dot Score Assistant—a built-in, interactive guide that helps users complete onboarding with less uncertainty and more confidence.

Rather than asking users to recall exact race times or estimate fitness metrics from memory, we introduced:

  • Predefined answer choices, so users could select without typing or guessing
  • Inline education to reinforce why each input mattered to their training

If a user got stuck, the assistant nudged them forward with gentle, helpful feedback—reducing friction while maintaining momentum. This shifted the tone of onboarding from interrogative to supportive and user-centered, improving both trust and completion rates.

Examples of the Experience

The finished flow: a streamlined, supportive onboarding experience with future-ready AI foundations

The Assistant wasn’t just functional—it was human. It gave the app a voice, a smile, and a sense of support that users could feel.