The Challenge: Confusing Inputs, Miscalculated Scores, and Lost Users


TriDot personalizes training using a proprietary fitness metric called the Dot Score, a benchmark that calibrates daily workouts based on an athlete's swim, bike, and run performance. But the original onboarding experience asked users to manually enter recent race times, such as their best 5K or 10K, with no guidance, clarity, or educational context, leaving new users uncertain about what inputs were needed, what the numbers meant, or how to progress.
For experienced athletes, this wasn’t an issue. But for new users and fitness-curious runners, this screen became a major blocker. Many didn’t know their times, hadn’t done a recent race, or felt unsure what data to enter. Their options were:
- Guess and risk inaccuracy
- Abandon the flow
- Input incorrect data and get a miscalibrated training plan (which could lead to injury)

Foundational wireframes explored flow, content, and guidance—establishing the Dot Score Assistant’s value through testing
Our Insight: Low Conversion Wasn’t About Motivation—It Was About Clarity
This step created friction and decision fatigue, leading either to drop-off or to bad data. The experience lacked flexibility, guidance, and assurance, three key principles of modern onboarding UX. Analytics showed users were willing to provide data, but only if they understood what was being asked and why.
When we looked at this step through a user empathy lens, we realized the deeper issue:
We were asking for expert-level input from people who were just starting out.
The Solution: Meet the Dot Score Assistant

To address the high friction around inputting performance metrics, I led the creation of the Dot Score Assistant—a built-in, interactive guide that helps users complete onboarding with less uncertainty and more confidence.
Rather than asking users to recall exact race times or estimate fitness metrics from memory, we introduced:
01. Simplified prompts with clear, approachable language
02. Predefined answer choices so users could select without typing or guessing
03. Helpful tips and tooltips to guide decisions and explain context
04. Inline education to reinforce why each input mattered to their training
If a user got stuck, the assistant nudged them forward with gentle, helpful feedback—reducing friction while maintaining momentum. This shifted the tone of onboarding from interrogative to supportive and user-centered, improving both trust and completion rates.
Examples of the Experience
The finished flow: a streamlined, supportive onboarding experience with future-ready AI foundations
Impact

- Increased activation rate and onboarding completion
- Reduced drop-off from performance metric input screens
- Improved data accuracy and user trust in their personalized training
- Created a framework for future GenAI-based in-app assistants
- Introduced a delighter: a friendly, supportive presence that added emotional value to onboarding
The Assistant wasn’t just functional—it was human. It gave the app a voice, a smile, and a sense of support that users could feel.
What Made This Work
This wasn’t just a usability improvement—it was a strategic UX pivot rooted in onboarding best practices:

- A user-paced flow with a clear fork in the road: manual entry vs. assistant-guided
- Reduced cognitive load through design, not more explanation
- Contextual feedback instead of static screens
- An immediate emotional win by showing users we understood their needs
Looking Ahead
The success of the Assistant laid the groundwork for AI-guided onboarding across the TriDot ecosystem. Its tone, logic, and structure are now being extended into a next-gen generative AI training guide—ensuring users feel supported not just at signup, but every step of the way.