OVERVIEW
Building a two-sided platform for students to find care and therapists to find freedom
Evexia is a digital platform connecting university students seeking mental‑health support with licensed therapists who value autonomy and flexibility. In 2024 my team set out to design a two‑sided marketplace that could address two converging problems: students face long wait times and stigma when seeking therapy, while therapists struggle to find work arrangements that support flexible schedules and fair pay.
As the lead product designer, I drove research, strategy, and design from concept to implementation. This case study walks through our process, the decisions we made, and how I balanced the needs of two distinct user groups under real‑world constraints.
ROLE
Lead Product & UX Designer
TIMEFRAME
Ongoing
PROJECT TYPE
HealthTech
METHODOLOGY
Mixed‑methods research, weekly critiques, iterative prototyping, tight loops with engineering
KEY SKILLS
User research
Information architecture
Product strategy
UI/UX design
Prototyping
Design systems
Accessibility
Metrics analysis
Cross‑functional collaboration
The Problem
University students struggle to access affordable mental‑health services due to a national shortage of providers and average wait times of around six weeks. Many drop off before receiving help, which exacerbates crises. Meanwhile, therapists face burnout and lack flexible, supportive work arrangements; administrative overhead and marketing tasks limit their time with clients and reduce satisfaction.
The Solution
Evexia is a two‑sided platform that matches students with vetted therapists quickly through a simple intake and matching process. It’s built to make the early steps easier: finding someone, feeling like it’s a good fit, and booking without second-guessing. It also gives therapists a way to present themselves clearly, without needing a personal website or a clinic behind them.
Impact
We delivered the MVP on schedule using a reusable component set. Usability sessions showed that a simplified intake led to faster first‑time completion and fewer hesitations. Users typically created a plan in about two minutes and reported knowing what to do on day one.
The app ran on Google Play for about eight months and was acquired. The project reinforced that thoughtful UI and system thinking can help a small team build trust quickly in a sensitive space.
CONTEXT & PROBLEMS
Mental‑health access is limited
Mental‑health services remain difficult to access in North America. Research shows that Canada will be short about 31,000 behavioral‑health practitioners by 2025. As a result, the average wait time to access behavioral‑health services is six weeks and can stretch to months for specialists.
The same report highlights that as many as 50–70% of people living with a mental‑health condition do not receive treatment. Long waiting lists and lack of providers especially affect university students, whose mental‑health crises are often time‑sensitive.
Students and therapists face different pain points
From preliminary desk research and conversations with campus counselling services, we learned that students often:
Struggle to find therapists who understand campus life and identity‑specific challenges.
Cannot afford private therapy and thus rely on limited school resources.
Face waiting lists of 6–8 weeks and may drop off before receiving help.
At the same time, many therapists are burned out and desire more control over their schedules. Traditional employment often requires fixed hours and heavy administrative work, leaving little time for client care. Platforms like Zencare show that therapists value services that streamline marketing and administrative tasks and allow them to acquire motivated clients. Therapists want to focus on therapy, not on paperwork or marketing.
Strategy
Start Small, Prove Fast
We intentionally scoped the MVP to something we could test quickly. We wanted to validate three things: that there was real demand, that people trusted the experience enough to start, and that we could deliver value without overbuilding. Keeping the build lean reduced risk and gave us proof points for future partners or investors.
Success Hypotheses
We aligned on four signals that would indicate we were on the right track:
Quick value
Users generate a plan in under two minutes.
Actionable
60% begin their plan within 24 hours.
Sustainable
40% complete at least 3 sessions in week one.
Clarity
Average 4/5 or higher on “I know what to do today.”
Who We Were Building For
From the start, we knew we weren't building this for pro athletes with entire medical teams, but for everyday lifters and amateur athletes.
Scope of MVP
What was in scope
Short onboarding: pain area, tolerance, equipment, time.
AI-generated plan in a clear, digestible format.
Basic progress signals and reminders.
Linked exercise and recovery content.
What we intentionally left out
Complex analytical dashboards or direct communication channels
Features involving complex medical diagnostics or clinical interventions to avoid regulatory complexities
Social & community features such as peer-to-peer messaging, forums, or groups
With that lean scope, our success wasn't going to be about feature count. It was all about user behavior. We landed on a few key hypotheses that would tell us if we were truly creating value for both our users and the business.
We were essentially asking ourselves:
01
Can people get value instantly?
Our goal was a median time from install to a completed plan of under two minutes. If we couldn't do that, the experience was too complicated.
02
Are the plans compelling enough to act on?
We targeted a plan start rate of at least 60%, meaning that of all the people who received a plan, the majority would start their first session within 24 hours.
03
Is it engaging enough to stick with?
Early engagement was key. We aimed for at least 40% of starters to complete three or more sessions in their first week. This would show us the plan was doable and effective.
04
Do people feel confident and clear?
This was our "perceived trust" metric. Through quick post-session surveys, we were looking for a score of 4 out of 5 or higher on the statement, “I know what to do today.”
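The four signals above can be computed from a simple event log. This is an illustrative sketch, not our production analytics; the event names, timestamps, and helper functions are hypothetical.

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical event log: (user_id, event_name, timestamp)
events = [
    ("u1", "install", datetime(2024, 3, 1, 9, 0)),
    ("u1", "plan_created", datetime(2024, 3, 1, 9, 1, 40)),
    ("u1", "session_started", datetime(2024, 3, 1, 18, 0)),
    ("u2", "install", datetime(2024, 3, 1, 10, 0)),
    ("u2", "plan_created", datetime(2024, 3, 1, 10, 3)),
]

def first_time(user, name):
    """Earliest timestamp of a given event for a user, or None."""
    times = [t for u, e, t in events if u == user and e == name]
    return min(times) if times else None

users = {u for u, _, _ in events}

# Signal 1 (quick value): median install-to-plan time, target under 120s.
durations = [
    (first_time(u, "plan_created") - first_time(u, "install")).total_seconds()
    for u in users
    if first_time(u, "plan_created") and first_time(u, "install")
]
median_seconds = median(durations)

# Signal 2 (actionable): share of plan holders who start a session
# within 24 hours of receiving their plan, target 60%.
with_plan = [u for u in users if first_time(u, "plan_created")]
started = [
    u for u in with_plan
    if first_time(u, "session_started") is not None
    and first_time(u, "session_started") - first_time(u, "plan_created")
    <= timedelta(hours=24)
]
start_rate = len(started) / len(with_plan)
```

The remaining two signals (week-one session counts, survey scores) follow the same pattern: a threshold applied to per-user aggregates.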
Clarity on vision, audience, and scope gave us alignment from day one. For me, it was a set of sharp constraints: design for speed, simplicity, and trust. Those constraints shaped every UI choice I made, and they set up the challenges I’ll unpack in the next section.
CHALLENGES
Navigating Trust, Speed, and Scale
We had three months, a small team, and a fixed launch date. Because this was a health-focused product, we had to balance speed with building trust. Desk research helped us set guardrails: studies of digital health and wellness tools show that poor usability and slow setup drive people away, and users often cite sign‑in or setup problems as a reason they stop using a health app.
At the same time, ethical guidance for digital health products stresses the need for clear communication (using plain language, good readability, and keeping information brief) to help users understand what’s happening. Those insights shaped the rules we designed within: keep the first‑time experience short, limit the number of questions, speak plainly, and present safety cues clearly.
01
Time & Scope
A short timeline and a lean team meant we had to make tough choices. We deliberately kept the scope narrow: mobile only, English first, no diagnostics, and no community features.
02
Question Budget
People want plans that feel personal but don’t want to feel like they’re filling out paperwork. We set ourselves a strict “question budget” of under two minutes. Everything else moved to optional depth.
03
Tone & Safety
We couldn’t sound like doctors or imply a diagnosis. Trust comes from clarity, so all copy used everyday language and was kept as short as possible. We also added obvious “stop” points and red‑flag cues so users knew when to seek professional care.
04
Coherence While Shipping Fast
Work was happening in parallel, and content could vary. The risk was a patchwork feel. To keep things coherent, we used consistent terminology, predictable layouts, and a simple action hierarchy.
These guardrails turned the brief into day‑to‑day decisions. With those rules in place, the next section will show how I worked within them during design and handoff.
PROCESS
Designing the Recovery Experience
As a UI designer on the team, my main job was to translate a set of research findings and UX flows into clear, high‑fidelity screens. I wasn’t leading the research, but I read through the lead designer’s notes and absorbed the core user needs. My focus was on layout, color, typography, and interaction details that would make the experience feel calm, trustworthy, and easy to navigate. I built prototypes, defined tokens, and delivered design specs that engineers could implement without endless questions.
Visual Design
I paid close attention to accessibility: colors met contrast guidelines, tap targets were large enough for comfortable use, and text always had enough breathing room, drawing on guidelines that stress the importance of accessibility and language for diverse user groups.
High-Fidelity Screens
Using Figma, I produced over 40 screens covering onboarding, daily plans, progress tracking, and resource pages. Early prototypes helped us see where users hesitated or got lost. For example, we tried an onboarding flow with voice and free-text inputs but found that many testers paused—so we moved those options behind an “Add more detail” link to keep the main path under two minutes. High-fidelity prototypes also helped us refine microcopy, showing users “You’re done for today” or “Add a gentle stretch” instead of clinical or abstract labels.
Component Library & Tokens
To maintain consistency across this many screens, I created a lightweight component library. We defined tokens for colors, spacing, typography, and state changes (e.g., active vs. disabled buttons). This allowed us to update colors or spacing in one place and have it propagate across the entire design. Engineers appreciated that the handoff included clear naming and sizes, which reduced confusion later on.
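To make the token idea concrete, here is a minimal sketch of a base-token plus alias-token layer. The names and values are hypothetical, not the project's actual library; the point is that components reference semantic aliases, so a change to one raw value propagates everywhere.

```python
# Base (primitive) tokens: raw values defined once.
base_tokens = {
    "color.teal.600": "#0F766E",
    "color.gray.100": "#F3F4F6",
    "color.red.600": "#DC2626",
    "space.2": "8px",
    "space.4": "16px",
}

# Semantic (alias) tokens: what components actually reference.
alias_tokens = {
    "color.action.primary": "color.teal.600",
    "color.surface.default": "color.gray.100",
    "color.feedback.danger": "color.red.600",
    "space.inset.card": "space.4",
    "space.stack.tight": "space.2",
}

def resolve(token: str) -> str:
    """Follow alias chains until a raw value is reached."""
    seen = set()
    while token in alias_tokens or token in base_tokens:
        if token in seen:
            raise ValueError(f"circular alias: {token}")
        seen.add(token)
        if token in base_tokens:
            return base_tokens[token]
        token = alias_tokens[token]
    raise KeyError(token)
```

With this split, re-theming means editing `base_tokens` only; `resolve("color.action.primary")` always yields the current raw value.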
Usability Testing
Key Findings
We ran a handful of remote usability sessions using clickable prototypes. The sessions were short, usually under 15 minutes, to see if people could complete onboarding and start a plan without guidance. We learned that:
Users skipped open-text or voice input unless they had specific concerns, validating our choice to make those optional.
The order of questions mattered; when we asked about time commitment first, people were more likely to back out. Moving it later kept them engaged.
Participants didn’t always realize the plan was personalized until they saw a “Today” card with their own chosen session length and equipment. Adding a brief explanation (“Here’s your plan for today”) increased confidence.
Iterations from Feedback
Based on those insights, we:
Simplified the question order and moved optional inputs behind a secondary link.
Added a short note on the Today page explaining why certain exercises were suggested.
Made the primary action button persistent on small screens so users never had to hunt for it.
Kept copy friendly and free of jargon, echoing recommendations to use plain language.
SOLUTION
Introducing Recovery Hub
Recovery Hub helps people move from uncertainty to action in just a few minutes. After a short assessment, the app generates a day-by-day plan tailored to the user’s injury area, movement tolerance, equipment, and available time. The “Today” view shows a single, clear action with a gentle nudge—such as “10-minute mobility session”—and offers the option to edit or skip without penalty. Progress indicators help users see what they’ve completed, and built-in reminders encourage them to come back.
Core Flows
01
Onboarding
Users answer a handful of questions: where it hurts, what movements they can currently tolerate, what equipment they have, and how much time they want to spend. Each question uses plain language and simple selections, with optional depth behind a “Tell us more” link. We kept this flow under two minutes, reflecting research that long or complex onboarding drives attrition.
02
Personalized Plan
From those inputs, the app composes a personalized plan. Each day includes a single action with guidance on movements and suggestions to stop or seek care when necessary. Users can edit, skip, or reschedule without penalty. The plan automatically tracks completed sessions and integrates reminders, features shown to support engagement.
03
Workouts & Resources
For each exercise in the plan, users can tap into short videos and how‑to guides. These resources are linked directly to the plan so users don’t have to search. We kept descriptions concise and used everyday terms to explain movements and recovery techniques, following plain‑language recommendations.
Accessibility
We designed Recovery Hub with accessibility in mind:
Contrast & Readability: Colors meet WCAG AA guidelines; type is large and legible.
Flexible Inputs: Users can tap, type, or speak to answer questions; voice input is optional.
Safety Cues: Red‑flag messages appear when a recommended action might not be safe, with links to professional guidance.
Plain Language: All copy avoids jargon and keeps sentences short.
Minimal Data Entry: Sessions log automatically, reducing the need for manual entry.
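The AA contrast check mentioned above can be automated. This is a sketch using the WCAG 2.x relative-luminance formula, not the actual tooling we used; the color values are examples.

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG 2.x relative luminance of an sRGB hex color like '#1A2B3C'."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def linearize(c: float) -> float:
        # Undo sRGB gamma encoding per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Ratio between the lighter and darker color, from 1:1 to 21:1."""
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg: str, bg: str, large_text: bool = False) -> bool:
    """AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Running a check like this over every token pairing in the palette catches contrast regressions before they reach a screen.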
OUTCOMES
What Happened Next?
We finished design at the three‑month mark, handing off a polished system of over 40 screens and a tokenized component library. Engineers built the Android app from these specs, and we launched on Google Play. Users could create a personalized plan in under two minutes and see exactly what to do each day.
My involvement ended at handoff, so I can’t share adoption metrics, but early signs were encouraging: testing showed faster onboarding completion and fewer drop‑offs after we simplified the flow.
In Market
Recovery Hub stayed live for about eight months. Early prototypes suggested that simplifying onboarding increased completion rates and reduced hesitation. Participants said the plan felt doable, and most created a plan quickly and reported knowing what to do on day one. Moving optional inputs behind a secondary link and clarifying primary actions reduced mis‑taps on smaller devices.
Acquisition
Around eight months post‑launch, the startup behind Recovery Hub was acquired by a larger health‑tech company that integrated its features into a broader health platform.
Quick Reflection
01
What I Learned
This project taught me the power of simplicity, especially in health tech where trust is fragile. Limiting onboarding to essential questions and using plain language built confidence and encouraged immediate action. Building a design system early kept our small team aligned and sped up development.
On a personal level, I learned how to collaborate under tight timelines and how to translate research into UI decisions without overselling my role.
02
What I’d Do Differently
If I could do it again, I’d push for earlier and more frequent usability testing with a broader audience. Our prototype sessions were valuable but limited. Including users with lower health or digital literacy could reveal additional accessibility needs and personalization opportunities.
I’d also document design rationale more thoroughly so future teams can understand decisions after handoff.



