OVERVIEW

FinPath Dashboard - Fintech Platform Redesign for AI Adoption

A financial wellness platform used by public sector employees across the U.S., providing financial education, advice, and guidance to help them work toward financial freedom.

CLIENT/COMPANY

TCG, a HUB International company

ROLE

Lead Product Designer

TIMELINE

Jun 2024 - Mar 2025

TOOLS

Figma, Framer, WordPress

PROBLEM

Two Redesigns, One Mission: Build Clarity and Trust.

Phase 1
cluttered experience

Initially, the dashboard offered a wide range of valuable tools: coaching, calculators, articles, and financial education. But the interface was chaotic, and nothing felt connected.

Problems → UX Insights

  • Coaching was underbooked → No clear CTA or path to book support

  • Users dropped off early in onboarding → The quiz felt like a barrier, not a help

  • Most tools were ignored → No mental model or progress logic

  • Dashboard lacked narrative → No sense of stages or journey

Design Goal: Bring order to the chaos with personalization, structure, and simplicity.

phase 2
the ai pivot

Midway through, we adopted TIFIN @Work, a real-time financial AI tool. It changed both the strategy and the emotional stakes.

New Problems → Trust-Centered Design

  • AI felt unfamiliar or intimidating → Introduce softly, with benefit-first copy

  • Users wanted to keep human options → Keep coaching highly visible and accessible

  • STEPScore quiz became redundant → Shift quiz logic into AI onboarding flow

  • Coaching & courses needed recontextualizing → Align tools under 3 clear paths of help

the idea

Our mission was to transform FinPath from a tool buffet into a focused financial guidance platform.

Our core objectives were to:

  • Clarify the experience: Make it obvious what steps users could take and why they mattered

  • Drive engagement with coaching: Help users see the value of scheduling sessions and joining live guidance

  • Make personalization feel seamless: Use data to suggest tools without overwhelming the user

  • Increase retention: Create a reason for users to return and track progress

  • Adapt to evolving tools: Future-proof the design and build trust with our audience for emerging features, including AI

DESIGN PROCESS

Empathize → Define → Ideate → Prototype → Test & Iterate → Validate

AUDIENCE

Designing for Time-Strapped Educators Seeking Financial Peace

WHO THEY ARE

FinPath serves public sector workers, primarily K–12 school district employees, including teachers, administrators, and support staff. These users often juggle multiple responsibilities and face complex financial realities.

They’re not “tech-first” users. They’re pragmatic, protective of their personal information, and often unfamiliar with or hesitant toward AI-powered experiences.

They’re not looking to become financial experts; they want clear, trustworthy guidance that respects their time and helps them feel more in control of their financial future.

Personas

JANELLE, 34, TEACHER

Pain Points:

  • Doesn’t know which tools apply to her situation

  • Feels overwhelmed by financial terms and jargon

  • Hesitant to talk to a coach out of fear of judgment

Goals with FinPath:

  • Get quick, relevant help based on her financial stage

  • Learn how to pay off loans and save for retirement

  • Use tools that feel supportive, not intimidating

RON, 45, DISTRICT ADMINISTRATOR

Pain Points:

  • Rarely has time to explore the platform during work

  • Finds the dashboard cluttered and hard to navigate

  • Doesn’t trust AI to give him the same depth as a coach

Goals with FinPath:

  • Quickly find what’s relevant to him and his goals

  • Get answers fast, ideally without booking a call

  • Recommend tools to staff with confidence

TIFFANY, 29, SUBSTITUTE TEACHER

Pain Points:

  • Doesn’t feel “financially literate” enough to use the platform

  • Has bounced between part-time jobs with little stability

  • Sees FinPath but doesn’t know what to click or why

Goals with FinPath:

  • Understand her financial baseline

  • Get small, personalized recommendations

  • Avoid feeling judged or overwhelmed

WHAT THEY NEED

Across personas, three needs came up consistently:

  • Clarity on what the tool is and why it’s helpful

  • Control over how they engage with AI vs. human support

  • Confidence that the guidance is safe, personalized, and designed for them

Our design needed to make FinPath feel like a financial mentor, not a maze.

RESEARCH

Turning Confusion Into Insight and Insight Into Structure

APPROACH

To move beyond surface-level fixes, we needed to deeply understand why users weren’t engaging, not just what they were clicking (or skipping). We combined platform analytics with user sentiment to uncover where the experience broke down.

We asked two key questions:

  • “What stops a user from taking confident action when they land in FinPath?”

  • "How would they respond to a new AI-driven experience they didn’t ask for?"

METHODS

Our research blended qualitative feedback and behavioral data:

  • Past platform analytics: Used to map drop-offs and underutilized tools

  • User surveys: Gathered direct feedback on friction points and trust gaps

  • Stakeholder interviews: Captured business goals from product, marketing, and coaching leads

  • Competitive analysis: Benchmarked against SoFi and other fintech experiences

  • Design audits: Evaluated the old dashboard’s visual hierarchy, flow, and affordances

Due to privacy constraints, all research data was anonymized and kept internal.

key insights

Insights → Design Decisions

  • “I don’t know what these tools are or how they help me.” → Introduced 3-card navigation with descriptions and icons for each financial path

  • “I have to take a long quiz to access the dashboard.” → Deferred the STEPScore quiz to the AI tool’s onboarding and reduced question load

  • “I don’t trust new tech right away.” → Introduced the AI tool through soft language and optional entry, with human help nearby

  • “I don’t know what’s changed since I last visited.” → Added a persistent STEPScore banner and logic for returning users vs. new users

reframing the problem

Early on, we realized this wasn’t just a navigation issue; it was a mental model issue. Users weren’t failing to complete tasks because the UI was broken. They were failing because the experience didn’t align with how they think about financial help.

  • We weren’t just redesigning a dashboard.

  • We were helping people choose their financial path with confidence.

That meant shifting from tool delivery to decision empowerment and from technology-first messaging to trust-centered storytelling.

For AI to work, users had to feel like they were choosing help, not being handed a black box.

IDEATION

From Tool Overload to Path-Driven Clarity

phase 1
personalization design

In the early phase of the project, our goal was to bring order to the chaos. We needed to introduce hierarchy, help users understand their financial stage, and personalize the experience.

We anchored our strategy around the STEPScore quiz, a guided assessment that would evaluate a user’s financial well-being and surface tailored tools and courses. To support this, we:

  • Redesigned the dashboard UI with improved spacing, visual hierarchy, and modern components

  • Mapped user flows for first-time vs. returning users

  • Introduced micro-moments during onboarding to educate without overwhelming

  • Built a coaching logic model that matched users with dedicated coaches based on quiz results (see the sketch below)
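
To make the matching idea concrete, here is a minimal TypeScript sketch of how quiz results could drive coach assignment. The types, thresholds, and names are hypothetical illustrations, not FinPath’s production logic.

```ts
// Minimal sketch: pairing a user with a dedicated coach based on quiz results.
// All names and thresholds below are illustrative assumptions.

type FinancialStage = "stabilizing" | "building" | "optimizing";

interface QuizResult {
  stepScore: number;                               // 0-100 financial wellness score
  topConcern: "debt" | "budgeting" | "retirement"; // surfaced by the quiz
}

interface Coach {
  id: string;
  specialties: Array<QuizResult["topConcern"]>;
  acceptingNewClients: boolean;
}

// Translate a raw score into a plain-language stage the dashboard can speak to.
function toStage(score: number): FinancialStage {
  if (score < 40) return "stabilizing";
  if (score < 75) return "building";
  return "optimizing";
}

// Match the user with the first available coach whose specialty fits their top concern.
function matchCoach(result: QuizResult, coaches: Coach[]): Coach | undefined {
  return coaches.find(
    (c) => c.acceptingNewClients && c.specialties.includes(result.topConcern)
  );
}

// Example: a score of 52 with debt as the top concern lands in the "building"
// stage and pairs with a debt-focused coach.
const stage = toStage(52);
const coach = matchCoach(
  { stepScore: 52, topConcern: "debt" },
  [{ id: "c-01", specialties: ["debt", "budgeting"], acceptingNewClients: true }]
);
console.log(stage, coach?.id); // "building" "c-01"
```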

turning point

Midway through the project, we adopted TIFIN @Work, an AI tool capable of delivering personalized financial guidance in real time. This changed everything.

The platform no longer needed to guess what tools a user might want; the AI could now suggest them with context. That allowed us to remove entire layers of complexity and rethink how we presented FinPath as a whole.

We quickly realized the AI tool couldn’t be introduced like a feature drop. It had to feel gentle, optional, and supportive, not like a system taking over.

That meant:

  • Avoiding the term “AI” in initial microcopy

  • Designing warm, outcome-driven language (“Get personalized help”)

  • Ensuring coaching stayed visible as a complementary path

  • Embedding personalization gradually, not forcing it upfront

phase 2
design around trust

We took a step back and reframed the dashboard’s job:
Instead of surfacing tools, it needed to help users choose how they wanted to receive guidance.

We redesigned the homepage into a 3-card layout, with each card representing a distinct, trust-calibrated path:

  • AI Guidance – for fast, intelligent suggestions

  • Coaching – for human support

  • Courses – for self-paced financial learning

Each path was:

  • Clearly described in plain language

  • Framed around outcomes, not technology

  • Tied to different comfort levels, from automated to human

We also replaced mandatory onboarding steps with optional, embedded entry points — like a shortened quiz inside the AI assistant, and clear live coaching CTAs.
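
As an illustration of the structure behind this layout, the sketch below models the three paths as a small TypeScript configuration. The identifiers, comfort levels, and copy are assumptions for demonstration, not the shipped microcopy.

```ts
// Minimal sketch of the 3-card homepage model. Copy and field names are illustrative.

type ComfortLevel = "automated" | "blended" | "human";

interface GuidancePath {
  id: "ai" | "coaching" | "courses";
  title: string;        // outcome-first, plain-language headline (no "AI" jargon up front)
  description: string;
  comfort: ComfortLevel; // which comfort level the path is calibrated to
  cta: string;
}

const paths: GuidancePath[] = [
  {
    id: "ai",
    title: "Get personalized help",
    description: "Fast, private suggestions tailored to your situation.",
    comfort: "automated",
    cta: "Start now",
  },
  {
    id: "coaching",
    title: "Talk to a coach",
    description: "One-on-one human support, live or scheduled.",
    comfort: "human",
    cta: "Join or book a session",
  },
  {
    id: "courses",
    title: "Learn at your own pace",
    description: "Short, self-paced lessons on debt, budgeting, and retirement.",
    comfort: "blended",
    cta: "Browse courses",
  },
];

export default paths;
```

One benefit of treating the paths as data like this is that labels and descriptions can be reworded as testing feedback comes in without touching the layout itself.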

IDEAS EXPLORED

  • STEPScore Quiz-First Onboarding: Required up front to personalize tools. Status: Kept initially, later deferred to AI onboarding.

  • 3-Card Dashboard Layout: AI, Coaching, Courses as distinct paths. Status: Kept; became the core structure.

  • Live Coaching Access Point: Added a real-time option alongside scheduling. Status: Kept; improved engagement.

  • Personalized Banner for Financial Stage: Displayed the user’s current quiz results. Status: Cut; made redundant by the AI tool’s memory.

  • Financial Tool Platform: Broad label for calculators, resources, etc. Status: Cut; too vague, unclear purpose.

ideation outcome

The redesign evolved from “make it modern and personalized” to “make it feel safe, supportive, and effortless.”

Phase 1 laid the groundwork: onboarding, quiz logic, coaching flows.
Phase 2 reframed the experience: clear paths, contextual guidance, and emotional trust.

phase 1
user flow

phase 2
user flow

TESTING

Validating Two Redesigns in Two Very Different Worlds

phase 1
approach

During the first redesign, our focus was on improving engagement through personalization, better onboarding, and clearer access to coaching.

We ran a series of internal usability tests and stakeholder walkthroughs to validate the effectiveness of our early ideas.

Phase 1
what we tested

Focus Area → What We Looked For

  • STEPScore Onboarding → Could users complete the quiz without drop-off or fatigue?

  • Coaching Assignment Flow → Did users understand how they were matched with a coach?

  • Dashboard Hierarchy → Could users distinguish between tools, courses, and coaching?

  • Returning User Logic → Did users know what to do when they came back?

  • Banner System → Did the STEPScore banner feel helpful or ignorable?

phase 1
iteration from feedback

Feedback → Iteration

  • Users liked the concept of a financial “score,” but felt the quiz was too long → Reduced quiz length

  • Coaching flow worked well, especially for first-time users → Streamlined coaching logic to better handle rebookings

  • Tools and courses were hard to differentiate; many skipped over them entirely → Flagged tool sprawl as a deeper design issue

  • The persistent banner helped returning users re-engage, but became repetitive for power users → Set the stage for a future shift toward clearer pathways

phase 2
approach

With the integration of TIFIN @Work, we had to test not just interactions, but emotions. Would users trust this new way of receiving guidance?

We ran moderated tests using prototypes of the 3-path layout and AI entry flow, focusing on tone, clarity, and confidence.

phase 2
what we tested

Focus Area → What We Looked For

  • 3-Card Navigation → Could users understand the differences between AI, Coaching, and Courses?

  • Microcopy & Labels → Did users understand what they were clicking into and why?

  • AI Entry Flow → Did users feel in control, not forced into automation?

  • Quiz Timing → Was the lighter, embedded quiz experience easier to complete?

  • Live Coaching Access → Did users notice and prefer “Join Now” over “Schedule Later”?

phase 2
iteration from feedback

Feedback → Iteration

  • “I’m not sure which card to start with.” → Added microcopy under each card to describe the experience (e.g., “Smart financial help from AI”)

  • “This assistant feels cold, is it secure?” → Reframed AI with warm, non-technical language

  • “Do I have to talk to a bot?” → Emphasized choice: human coaching always visible

  • “Quiz still feels like a lot.” → Deferred quiz deeper into AI flow, made it skippable

testing takeaways

Hierarchy wins over novelty.
Users defaulted to the clearest option. The 3-card layout helped them orient quickly.

Live coaching needs to feel immediate.
“Join Now” had a stronger conversion than “Schedule Later”, so we gave it visual priority.

AI needed a soft entry.
Framing, not features, was the unlock. By shifting focus to outcomes and using less technical language, trust improved.

Onboarding still had to earn attention.
Whether via quiz or assistant, users appreciated frictionless entry, so we made everything feel lightweight and optional.

SOLUTION

Designing a Financial Guide Users Could Actually Trust

Phase 1
solution

Our first redesign wasn’t about AI; it was about clarity.

We took a dashboard overflowing with tools and restructured it into a clear, guided experience centered around the user’s financial readiness. At the heart of this version was the STEPScore, a quiz designed to assess a user’s current financial health and recommend the right next steps.

Key elements included:

  • STEPScore Quiz Integration
     Users began their journey by taking a financial readiness quiz. Their score informed what tools, courses, and coaching pathways we suggested.

  • Coaching Flow Logic
     First-time users were auto-matched with a coach; returning users saw rebooking prompts (see the sketch after this list).

  • Persistent Guidance Banner
     The user’s score and recommended actions were visible across the dashboard to create a sense of continuity and momentum.

  • Tool Reorganization
     Calculators, articles, and courses were grouped by goal (retirement, debt, and budgeting) to reduce decision paralysis.
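
The sketch below illustrates, in hypothetical TypeScript, how the STEPScore, coach matching, and persistent banner could fit together for first-time versus returning users. It is a simplified assumption of the logic described above, not the platform’s actual code.

```ts
// Minimal sketch of the Phase 1 dashboard logic: quiz state drives the banner
// and the primary action. Names and copy are illustrative assumptions.

interface UserState {
  stepScore?: number;   // undefined until the STEPScore quiz is completed
  hasCoach: boolean;    // whether the user has already been matched with a coach
}

interface DashboardPlan {
  banner: string;       // persistent guidance banner copy
  primaryAction: "take-quiz" | "meet-your-coach" | "rebook-coach";
}

function planDashboard(user: UserState): DashboardPlan {
  // First-time users: prompt the quiz so personalization has something to work with.
  if (user.stepScore === undefined) {
    return {
      banner: "Take a quick check-in to get your STEPScore.",
      primaryAction: "take-quiz",
    };
  }
  // Scored but not yet matched: auto-match with a dedicated coach.
  if (!user.hasCoach) {
    return {
      banner: `Your STEPScore is ${user.stepScore}. Here's where to start.`,
      primaryAction: "meet-your-coach",
    };
  }
  // Returning users: keep the score visible and nudge a rebooking.
  return {
    banner: `Welcome back. Your STEPScore is ${user.stepScore}.`,
    primaryAction: "rebook-coach",
  };
}

console.log(planDashboard({ hasCoach: false }));               // take-quiz
console.log(planDashboard({ stepScore: 52, hasCoach: true })); // rebook-coach
```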

This version increased coaching sign-ups and added structure, but as the product evolved, it became clear that STEPScore alone couldn’t scale personalization in the long term.

phase 2
solution

Halfway through the project, we adopted TIFIN @Work, a white-labeled AI tool capable of delivering real-time, personalized financial advice. This marked a major shift not just in functionality, but in how we framed the platform’s purpose.

We weren’t just giving people tools anymore.
We were giving them a guide.

the new experience

To keep the experience approachable, we replaced the dense homepage with three simple, outcome-driven paths:

Path → Description

  • AI Financial Guidance → Real-time, smart help via conversational assistant, positioned as quick, private, and adaptive

  • Coaching (Live or Scheduled) → For those who wanted human support, now easier to access and differentiated by user type

  • FinPath University → Self-paced learning modules, anchored by quiz or AI data to guide discovery

why the changes worked

Feature → Redesigned Version

  • Quiz Onboarding → Folded into AI onboarding with fewer, easier questions (see the sketch below)

  • Tool Discovery → No longer needed; the AI handled recommendation logic

  • Microcopy & Labels → Shifted to soft, benefit-driven language (“Smart help made for you”)

  • Coaching Entry → Added a “Join Live” button with urgency cues

  • Navigation Hierarchy → Users landed directly on the 3-card layout; no menu required
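
To illustrate the quiz change called out above, here is a hedged TypeScript sketch of a shortened, skippable quiz step inside the AI onboarding, with answers seeding the assistant’s first suggestions. The step structure, copy, and suggestion lists are assumptions; TIFIN @Work’s actual interface is not shown.

```ts
// Minimal sketch: the quiz becomes an optional step inside AI onboarding,
// and whatever answers exist seed the first recommendations. Illustrative only.

interface OnboardingStep {
  id: string;
  prompt: string;
  optional: boolean; // skippable steps never block entry to the assistant
}

const aiOnboarding: OnboardingStep[] = [
  { id: "welcome",   prompt: "Hi! I can suggest next steps for your money goals.", optional: false },
  { id: "goal",      prompt: "What matters most right now: debt, budgeting, or retirement?", optional: false },
  { id: "mini-quiz", prompt: "Want a quick three-question check-in to sharpen suggestions?", optional: true },
];

// Build first suggestions from whichever answers the user chose to give.
function seedSuggestions(answers: Record<string, string>): string[] {
  const byGoal: Record<string, string[]> = {
    debt: ["Debt payoff planner", "Ask a coach about consolidation"],
    budgeting: ["Monthly budget builder", "Spending check-in course"],
    retirement: ["Retirement readiness course", "Contribution calculator"],
  };
  const goal = answers["goal"] ?? "budgeting"; // sensible default if the user skipped ahead
  return byGoal[goal] ?? byGoal["budgeting"];
}

console.log(aiOnboarding.filter((s) => s.optional).length); // 1 skippable step
console.log(seedSuggestions({ goal: "debt" }));             // debt-focused suggestions
```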

what made it work

What made the final version successful wasn’t complexity; it was simplicity with intention.

  • Clarity replaced confusion

  • Choice replaced assumption

  • Momentum replaced fragmentation

And by presenting AI as one of many equally valid paths (rather than a forced new default), we honored the real emotional and cognitive needs of our users.

RESULTS

Clarity Increased. Coaching Rose. Trust in AI Took Root.

big picture

This wasn’t just a UI polish. It was a full transformation of FinPath’s experience, across two major redesigns, that reshaped how users interacted with financial tools, coaching, and new AI technology.

Both redesigns delivered outcomes. But the second one proved what happens when UX removes friction and gives users agency.

phase 1
outcome

  • Significant increase in coaching sessions scheduled

  • More users completed financial tasks after onboarding

  • Higher completion of the STEPScore quiz when presented upfront

  • Noticeable uptick in usage of self-guided tools and calculators

“I finally understood what I was supposed to do next. The quiz made it feel like someone was paying attention.”
— User feedback, early 2025

phase 2
outcome

  • High engagement with the new AI guidance feature within weeks of launch

  • Sharp decrease in quiz drop-off after moving it into AI onboarding

  • Strong interest in live coaching surfaced directly on the dashboard

  • Higher retention among returning users due to simplified layout and contextual prompts

  • Reduced bounce rate from homepage and improved time-to-action for core tasks

“I didn’t even realize it was AI at first, it just felt like someone was helping me.”
— User feedback, post-AI launch

what it validated

  • Simplicity drives trust, especially with unfamiliar tech

  • Personalization feels more powerful when it’s invisible

  • The right entry point changes everything — from AI to coaching

  • Clear design turns passive users into confident participants

REFLECTION

Designing With Empathy, Pivoting With Purpose

big picture

This case study wasn’t just a UX challenge; it was a lesson in adaptability, trust-building, and knowing when to let go of our assumptions.

what i learned

1. Design Is a Conversation, Not a Deliverable

The first redesign gave us clarity and structure. But when the product pivoted to AI, I had to let go of what was working and listen again to the users, the tech, and the business.

2. AI Isn’t a Feature, It’s a Relationship

Rolling out AI wasn’t a UI challenge; it was a trust challenge. I learned that introducing AI starts with emotion, not tech specs.

3. Success Is Often Subtractive

After months of building, testing, and refining… what I’m proudest of is the three cards. That simplification turned the dashboard from a maze into a mentor.

what i'd do differently

  • Advocate for Leaner Testing Earlier:
    The second redesign moved fast due to partner timelines. More lightweight testing upfront could’ve validated copy tone and AI positioning even faster.

  • Push Harder for Emotional Framing at Launch:
    I wish we had introduced the new experience with a stronger emotional hook, like a story-driven walkthrough, not just UI updates. Trust grows through tone as much as structure.

closing thought

This project reminded me that product design is never just about pixels; it’s about people navigating uncertainty.

In an industry that often races to implement the newest tool, we slowed down long enough to ask, “How does this actually feel?”

And that made all the difference.