p°Motion · iOS · 2024

One dashboard connected two problems no one had linked

That shift changed the trajectory of the vertical.

Role: Sole Designer
Timeline: 1 month
Platform: iOS
Impact: 80 → 250+ athletes
Context

p°Motion expanded into youth athletes and families after three years of serving professional teams. Parents were handed a coach-facing admin view never designed for them. Athletes had no reason to open the app. The youth vertical was at risk before it had a chance to establish itself.

The tension

Giving parents more visibility risked making athletes feel monitored. Giving athletes more ownership risked leaving parents in the dark. One dashboard had to solve both without breaking either. The design element that answered the parent's question became the motivation mechanic athletes didn't want to break.

My role

Sole designer · 1 month · iOS
Collaborated with PM, engineering lead & senior designer. Owned design QA and regression testing through to launch.

Outcome
  • 80 → 250+ active athletes
  • +19pt mid-season retention
  • 0 parents asking kids directly
  • One design solved two problems
Final Guardian parent and athlete dashboard side by side
01 — Context

A three-year-old platform expanding into a vertical it wasn't built for

p°Motion is a precision movement platform that uses biomechanics data to identify injury risk and generate personalized corrective programs.

Built by coaches, clinicians, and data scientists with over 30 years of experience, the platform had spent three years serving professional teams before expanding into youth athletics and families.

That shift introduced a fundamentally different dynamic. Professional teams had coaches and performance staff interpreting the data. Families did not.

Parents were given default access to the coach-facing admin view: a roster list showing their child's name, a partially filled progress circle, a compliance percentage in red, and a "Missed" status label.

No context, no history, and no way to tell whether that number was alarming or normal.

Original coach-admin view that parents were given access to — roster list with red compliance percentage and Missed labels
What parents actually saw: a coach-facing admin view never designed for families
Week 1
Most athletes dropped off before completing their first week. Parents still relied on asking directly.

This project was scoped as a parent-facing tool from the start: a progress dashboard built from scratch to give families the visibility the admin view had never provided. Athletes were in scope, but secondary.

I was the sole designer, working within a one-month sprint alongside a PM, an engineering lead, and a senior designer.

02 — The tension

Two users. One screen. Competing needs.

The dashboard had to serve two people with fundamentally different needs. Solving for one carelessly would undermine the other.

Parents needed

Visibility

Not data, just a clear answer to one question: did my kid do their workout today? Every parent was already asking their child directly. The dashboard needed to make that conversation unnecessary.

Athletes needed

Ownership

Motivation, not monitoring. Streaks, progress, a sense of momentum. A dashboard that felt like surveillance would reinforce exactly the dynamic parents were trying to move away from.

Too much detail on the parent side and athletes would feel watched. Too little and parents would keep asking their kids directly, or churn. The dashboard had to answer the parent's question without making the athlete feel like the answer was being reported on them. This tension became the filter for every design decision that followed.

03 — Research & insights

What parents needed to see, athletes needed to feel.

By the time athlete interviews ran, the parent dashboard was nearly finished. The CEO asked whether athletes could use it too, and before committing, we ran research to validate whether the design would actually work for them.

I led both streams end-to-end: writing discussion guides, running sessions, and synthesizing findings back to the team.

Parent research confirmed the direction. Athlete research changed one thing, and it was the most important thing.

Parent research · 5 participants

High-level visibility, not data
All five wanted clean, scannable dashboards, not reports. They wanted to see overall progress at a glance and drill down only if something seemed off.
“I don’t need to see every detail. Just show me if he's improving.”
The app wasn't replacing the conversation
Every parent confirmed their main method of knowing whether their child had completed a workout was asking directly.
“I just ask him if he did it.”

Athlete research · 4 participants

Streaks were the strongest motivator
Streak-based progress was the clearest signal. The Duolingo comparison came up unprompted across all four sessions: something they would want to maintain, not break.
“A streak like Duolingo would be awesome. I wouldn’t want to break it.”
Workouts felt purposeless without context
All four said they did not understand what a stretch was targeting or where they should feel it. Without context, exercises became something to get through, making it easier to stop doing them.
“Sometimes I don’t know what my body’s supposed to do. What should I be feeling?”
Key finding
The calendar strip could not stay a simple progress check for parents. Research showed it also needed to motivate athletes, turning the same element into visibility for one user and momentum for the other.

04 — Design question

Not what data can we show. What does a parent need to know right now?

Before the design had a clear direction, it had too many competing ones. The PRD outlined a comprehensive compliance dashboard: progress bars, monthly summaries, session logs, insights reports, data export. Thorough, and the wrong starting point for this audience.

Early explorations kept landing on calendar-based layouts: monthly grids with color-coded days, status icons for completed, missed, and in-progress sessions. On paper it made sense. In practice, it meant scanning 30 cells to answer a question parents could have just asked their kid.

Early calendar-based layout explorations — monthly grids that felt evaluative rather than encouraging
Early explorations: calendar-based layouts that felt evaluative rather than encouraging
The guiding question

Did they complete their workout today?

The guiding question also created an unexpected alignment between two users. A streak strip at the top answered "did they do it today?" for parents instantly. For athletes, that same strip was something worth maintaining. One element, two motivations.

05 — Key decisions

Every decision filtered through one test: does this motivate or monitor?

Three decisions defined the final design. Each was evaluated against the same question: does this motivate, or does it monitor?

01 — Lead with the streak, not the calendar
The problem
Early designs used monthly calendar grids, too much to scan for a parent checking in quickly. The layout felt evaluative before it felt reassuring. Parents needed an answer, not a report.
The decision
Simplified to a weekly calendar strip: focused, scannable, built around the parent's question. Did they do it today? Each day showed completed, missed, or in-progress. Parents could see the answer in a glance without reading a month of history.
The shift
When the CEO asked whether athletes could use the dashboard too, research validated the direction and sharpened it. Athletes didn't see completed and missed days. They saw a streak: something worth protecting, something they wouldn't want to break. The visual language shifted from reporting completion to building momentum.
Before
Early monthly calendar grid explorations
After
Final weekly streak strip
02 — Demote the consistency score
The problem
Early iterations placed the consistency score as the dominant element. In review, the CEO asked directly: should that really be the first thing they see? A percentage leading the screen made opening the app feel like checking a grade, not checking in.
The decision
Returning to the guiding question made the answer clear: a parent's first need is "did they complete their workout today?" The consistency score does not answer that. The streak strip does. The score moved below the streak, and "compliance" was renamed to "consistency." The data was the same, but the word a parent reads first shapes how the whole screen feels. Compliance is a grade. Consistency is progress.
The thinking
I considered removing the score entirely, as some parents might not need it. But research showed 5/5 wanted the ability to drill down if something seemed wrong. The score earned its place; it just didn't earn the hero position. Demoting it rather than deleting it was the right call.
Explorations
Consistency score gauge and dial explorations
Final
Final consistency score in demoted position below streak
03 — Add focus areas for context
The problem
Athletes consistently said workouts felt repetitive and purposeless. Without understanding what a session was targeting or where they should feel it, exercises became something to get through.
Alternatives explored
We explored embedding context in the program page, tooltips during workouts, and post-session reflections. The program page felt cluttered, and athletes there were in task mode: there to start, not to read. Context needed to arrive before that moment.
The decision
A Focus Area section on the dashboard lets athletes see which body regions their current week targets before opening their program. The data was already available with no additional engineering required. Focus areas answer "why am I doing this?" before the workout begins.
Athlete dashboard with focus areas section highlighted showing targeted body regions
Focus areas give athletes context before they open their workout, answering "why am I doing this?"
The decision I'd revisit first

Streak freeze state: advocated for, not shipped

I pushed for a streak freeze mechanic — a way for athletes to protect their streak if they completed most, but not all, of their weekly workouts.

The risk was clear. If someone loses everything for missing one day, they are more likely to stop than start over.

Engineering pushed back on the timeline, so the streak shipped without it.

If I revisited this, I would scope the backend work earlier so the streak could ship with the protection it needed to actually work.
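The freeze mechanic never shipped, but the logic itself is small. Here is a hypothetical TypeScript sketch of how it might work (the app is native iOS; `applyDay`, `endOfWeek`, and the one-freeze-per-week rule are my assumptions for illustration, not shipped behavior):

```typescript
type DayStatus = "completed" | "missed";

interface StreakState {
  streak: number;  // consecutive completed days
  freezes: number; // banked freezes; each absorbs one missed day
}

// Apply one day's result. A miss consumes a banked freeze if one
// exists; otherwise the streak resets to zero.
function applyDay(state: StreakState, status: DayStatus): StreakState {
  if (status === "completed") {
    return { streak: state.streak + 1, freezes: state.freezes };
  }
  return state.freezes > 0
    ? { streak: state.streak, freezes: state.freezes - 1 }
    : { streak: 0, freezes: state.freezes };
}

// At week's end, bank a freeze (capped at one) when the athlete
// completed most, but not all, of their scheduled workouts.
function endOfWeek(state: StreakState, completed: number, scheduled: number): StreakState {
  const earned = completed >= scheduled - 1 && completed < scheduled;
  return earned
    ? { streak: state.streak, freezes: Math.min(state.freezes + 1, 1) }
    : state;
}
```

The cap of one banked freeze is arbitrary; the point is that a single missed day degrades gracefully instead of zeroing the streak, which is exactly the disengagement risk the research flagged.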

06 — The expansion

From parent visibility tool to athlete landing screen

Athletes were not in scope for most of the project. Parents were the underserved user, and that was the right place to start.

Turning point

Late in the project, the CEO asked a simple question: could this work for athletes too?

Athletes had no dedicated home screen. They landed directly on their workout list with no context and no sense of where they were in their program. The dashboard could fix that.

Before committing, we ran athlete research to validate the design would work for them, and that is where the streak direction came from.

The adaptation was lighter than expected. The main structural difference was at the top of the screen: where parents saw their child's name and progress, athletes saw a personalized welcome and a motivational prompt. The rest carried over.

Before: athlete experience
Original athlete experience showing bare workout list with no context
After: athlete experience
New athlete landing screen with streak strip, focus areas and personalized welcome
Athlete home
Athlete home screen
Completion
Streak completion state
Parent dashboard
Final parent dashboard
07 — Outcome & impact

Retention up. Users up. Conversations down.

The dashboard launched to the youth vertical over the summer.

The vertical had been difficult to scale precisely because there was no purpose-built experience for families. Onboarding new athletes meant handing parents a tool never designed for them.

  • 250+ active youth athletes · up from 80 at the start of the season
  • +19pt mid-season retention · first measurable increase since the vertical launched
  • 0 parents asking kids directly · confirmed in follow-up research
  • 2 users served by one design · parents and athletes, problems nobody had connected

A dashboard built for parents became a motivational entry point for athletes. One design solved two problems that nobody had connected before.

08 — What I learned

Run research before design begins, not alongside it

Findings validated late-stage work rather than shaping early decisions. The instincts were right, but that was partly fortunate. Earlier research means sharper constraints and more defensible decisions.

Ship the forgiveness logic or do not ship the streak

Post-launch data confirmed the risk: athletes who lost their streak to one missed day were more likely to disengage than to restart. The fix was not a UI change. It was a backend scope decision that needed to happen earlier in the project.