Proactive Interventions
How I reduced DIY filer abandonment by 14% and increased contextual help engagement from 12% to 18% across 45M+ TurboTax users.
- Configurable component system shipped across TurboTax, QuickBooks, and Mint
- Product vision and phased UX roadmap
- Proactive intervention self-help solution for over 45 million users



VEP, the Virtual Expert Platform, connects customers and SMBs to live help support within the TurboTax ecosystem.
Overview
TurboTax's proactive help experience relied on static pop-ups that users often ignored or distrusted, resulting in low engagement and abandonment of the tax flow.
My Role
I joined the team in 2021 as a named contributor on the Integrated PIM creative brief alongside PM and content leads. Over two years, I designed a context-aware intervention system that surfaces help based on user behavior signals. The system increased engagement from 12% to 18% and reduced abandonment by 14% across 45M+ DIY filers.
Help Existed, But Users Didn't Engage
Proactive chat was surfaced through a pop-up module (PIM) with 12% engagement. Contextual experiments raised that to 18%. But 81% of surveyed users perceived the banner as an upsell. The problem was trust and timing, not visibility.
Insights
The banner looked like an ad. Users treated it like one. Internally we called it FUD — fear, uncertainty, and doubt. That's the actual user state we were designing against.
When help showed up at the right moment — right screen, right words — engagement jumped from 12% to 18%.

Engagement
12%
Proactive chat engagement rate through pop-up module (PIM)
Perceived as Upsell
81%
Of surveyed users perceived the banner as an upsell, not help
When Contextualized
18%
Engagement when help appeared in direct response to user confusion
Customer Problem Statement
I am a DIY TurboTax filer mid-return.
I am trying to complete my taxes without second-guessing every number.
But when I hit a confusing screen, the help that appears feels like an ad, not assistance.
Because it's generic, untimed, and doesn't reference what I'm actually looking at.
Which makes me feel fearful, uncertain, and doubtful — the exact state that makes filers abandon.
DEFINE — Narrowing to the Right Problem
Clarifying an Incomplete Product Strategy
The creative brief had the right instinct — if help is introduced in context, customers will see it as more credible. But it stopped short of defining what context meant or how to measure credibility. The PRD that followed assumed placement was the answer without asking why users ignored the existing banner in the first place.

Problem statement was unclear
The document focused on where to place the intervention rather than defining the underlying user problem.
Solution was predetermined
The PRD assumed the Help Panel placement was the answer without exploring alternative UX patterns.
Limited exploration of trust and credibility
The proposal did not address why the existing intervention felt promotional or untrustworthy to users.
I mapped three customer types from research archives. Each type had different blockers and emotional states, and needed a different intervention tone.

Same component structure, different messaging per customer type.
DISCOVER — Building Empathy Before Solutions
Creating a Strategy and Roadmap
The creative brief had the right hypothesis but no definition of what "context" actually meant. I pitched a half-day Design for Delight workshop — six people, three hours, one agreed output before anyone opened Figma. I opened with the raw data, ran Customer Problem Statements, Ideal State, How Might We, and Crazy 8s. Three people independently sketched the same solution without talking to each other. We closed by mapping leap-of-faith assumptions (LOFAs) on the 2x2 and writing hypothesis statements with defined success metrics. A one-page summary, written the same day, replaced the PRD.

Design Principles
We defined four core intervention principles that would guide every design decision across the proactive help system.
Speak to the moment
Copy should reflect exactly what the user is seeing — not a generic offer for help.
Intervene at the signal
Behavioral cues like looping and stalling are the trigger. Help responds, it doesn't interrupt.
Help should feel native
Interventions use the same visual language and tone as the product — no modal, no jarring shift.
Earn the click
Lead with credibility and context first. The CTA comes after trust is established, not before.
Target State
Contextual Intervention
Every user who hesitates receives a response specific to their screen, their data, and their moment in the flow. The goal: users feel like the software anticipated the hard parts, not that it noticed them struggling.
"Success isn't engagement rate. It's whether users felt the product was on their side."
Mapping Where Users Hesitate
PM and I were aligned on the problem, not the solution. I pushed the framing from placement to timing: when does a user actually need help, not where should the UI live. We built a strategy around clickstream signals — loops, stalls, and time-on-page — so help surfaces at the moment of friction, not as a static UI element.

DEVELOP — Going Broad on Solutions
How Might We Design For Scalability Across Intuit?
PM and I were aligned on the problem, not the solution. I pushed the framing from placement to timing — when does a user actually need help, not where should the UI live. I laid out three models — modal, toast, and card — and stress-tested each against credibility, friction, and scalability across product teams. The card aligned best with IDS and showed the lowest friction in early testing. Design systems signed off before I moved to hi-fi.
Leap of Faith Assumptions we tested
- Users will trust an avatar they've never seen if it appears at the right moment
- Behavioral signals (loops, stalls, time-on-page) are reliable enough to trigger interventions without false positives
- "Included at no cost" messaging removes the upsell perception — the brief called this "no strings attached" and it became our content north star
- Users will choose to engage with help rather than dismiss it when it appears at the exact moment of doubt

Scaling Across the Enterprise


The brief scoped this to TurboTax first with other platforms as a stated future requirement. The Fixed / Optional / Editable framework wasn't over-engineering — it was building to a requirement already on the roadmap.
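One way to picture the Fixed / Optional / Editable framework is as a slot policy that product teams merge overrides against. The slot names, defaults, and policy values below are hypothetical illustrations, not the shipped schema:

```python
# Hypothetical slot policy for the configurable intervention card.
# "fixed" slots are locked across products, "optional" slots can be
# toggled off, "editable" slots accept per-product overrides.
SLOT_POLICY = {
    "avatar": "fixed",
    "availability_indicator": "optional",
    "headline": "editable",
    "body_copy": "editable",
    "cta_label": "fixed",
}

DEFAULTS = {
    "avatar": "expert-avatar",
    "availability_indicator": True,
    "headline": "Need a hand with this screen?",
    "body_copy": "An expert can walk you through it.",
    "cta_label": "Connect now",
}

def build_card(overrides: dict) -> dict:
    """Merge a product team's overrides against the slot policy."""
    card = dict(DEFAULTS)
    for slot, value in overrides.items():
        policy = SLOT_POLICY.get(slot)
        if policy == "editable":
            card[slot] = value
        elif policy == "optional" and value is False:
            card[slot] = False
        # "fixed" slots silently keep their defaults
    return card
```

This is what "same component structure, different messaging per customer type" means operationally: TurboTax, QuickBooks, and Mint each supply their own editable copy, but the fixed slots keep the pattern recognizable everywhere.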
How Might We Increase Credibility In The Moment?
With the card pattern set, I tested variations: agent avatars, availability indicators, "Included At No Cost" messaging, and clear CTAs. The goal was a flexible framework that supports different intervention moments while staying consistent with IDS.



Designing Persistent Access with FABs
If users dismissed the intervention card, a FAB let them re-open support. Blue = free DIY help. Red = paid expert support. The number badge shows active conversations so users can continue filing without losing their place in line.
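The FAB state described above reduces to a small mapping. The tier names and return shape are assumptions for the sketch; only the color coding and badge behavior come from the case study:

```python
def fab_style(tier: str, active_conversations: int) -> dict:
    """Map support tier to FAB presentation.
    Per the case study: blue = free DIY help, red = paid expert support;
    the badge shows active conversations so users can keep filing
    without losing their place in line."""
    color = {"diy": "blue", "expert": "red"}[tier]
    return {"color": color, "badge": active_conversations or None}
```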


DELIVER — Rapid Experiments in Context
Content Strategy — How Might We Apply Context?
Hypothesis
If we surface a contextual card after two failed W2 import attempts, users will engage with expert help rather than abandon — measured by intervention engagement rate on the W2 trigger specifically. Success = engagement above 15% on triggered sessions.
W2 Import
New Customer risk: High
Technical errors or mismatched data during import make users doubt the software's accuracy, leading to "fear of the IRS" abandonment.
Content Goal: Reassurance and technical resolution.
- Contextual Trigger: Trigger after two failed import attempts or if the user stays on the "Review your W-2" page for >90 seconds.
Refund amount
Existing Customer risk: High
Users often abandon when the refund is lower than expected or the tax due amount is shockingly high.
Content Goal: Explain the "Why" and pivot to "What's Next."
- Contextual Trigger: Trigger when the "Tax Due" or "Refund" amount is first displayed, or if the user clicks "Why is my refund lower?" multiple times.
After hours
Lost Customer risk: High
Users working late feel alone. If they hit a wall when experts aren't live, they close the tab and often never come back.
Content Goal: Set expectations and secure a "save for later" commitment.
- Contextual Trigger: Trigger automatically when a user spends significant time on a complex section (like Business Income) outside of live-chat hours.
What We Shipped
Q3: We shipped a behavior-triggered intervention system for three high-risk moments: W2 import failures, refund volatility, and after-hours abandonment.
Hesitation signals (repeated imports, refund confusion, late-night sessions) trigger contextual support. Messaging, tone, and escalation path adjust based on user state.
Results & Impact
We defined success before shipping: contextual engagement above 15% and a measurable reduction in flow abandonment. Each result maps back to a specific LOFA we tested. Here's what we found.
The 14% abandonment reduction tracks directly to the engagement lift from 12% to 18%: users who engaged with contextual help were less likely to drop off.
14%
Reduced abandonment
Among DIY TurboTax filers who engaged with proactive interventions during filing.
At TurboTax's scale, 14% fewer drop-offs among engaged users = millions more completed returns.
18%
Engagement rate
Contextual help engagement when surfaced at the right moment in the user workflow, up from 12%.
45M+
Users impacted
Scalable proactive intervention system deployed across the TurboTax DIY experience.
+26%
Confidence score
Improvement in self-reported user confidence for customers who received contextual help.
Confidence is the strongest predictor of return-to-file rate the following year.