What Happens When Students Say No to Everything You Built
2025
Role
Product Strategist
Company
MU20
Category
Product Strategy

TL;DR
The Challenge
The EdTech market has thousands of course platforms and almost no career guidance that actually works. MU20 wanted to change that for Gen Z high school students. We had 50+ features, zero user research, and 4 weeks to a pre-seed funding meeting. The real challenge wasn't the deadline. It was that we were building solutions before we understood the problem.
My Output
Feature prioritization across 50+ backlog items, psychometric onboarding design, Micro Trials concept and system, Explorer/Focus architecture, Silver Lining feedback mechanic, AI Pathway Builder with three pathway variants, Opportunity Vault as a B2B surface, and a full Atomic Design System handed off to engineering
The Impact
Pre-seed funding secured in January 2026. The prototype convinced investors not just with screens but with a fundable thesis: students re-engage when career exploration feels like doing, not learning. The design system enabled engineering to begin building immediately without a redesign sprint.
The Honest Truth
This case study shows I can work at the intersection of product strategy and design craft under real pressure. I pushed for user research when the team wanted to skip it. That one decision changed the entire product. The $500k outcome is the proof it was the right call.
01 — THE MESS I WALKED INTO
MU20 was my first company.
I did my first design work there. So when they came back to me with a new problem, asking me to help define a product that could reshape how Gen Z students find career direction, I said yes without hesitation. Different time zones, contract basis, none of that mattered. I wanted to solve this for them.
What I walked into was messier than the brief suggested.
The team had ideas. Lots of them. There were 50+ features mapped across ten modules: onboarding, discovery, clubs, opportunities, profiles, progress tracking, an AI coach, a compete layer, college tools, account settings. The energy in the room was high. The direction was not.
This is a pattern I have seen in early-stage startups. When the deadline is a funding meeting and the pressure is real, teams move toward output. Features get named, flows get sketched, solutions get momentum, all before anyone has defined the actual problem. It is not a failure of intent. It is what happens when the clock is running.
I wanted to slow down just enough to do one thing right.
I made the case for user research. Not a long study, not a formal process, just real conversations with real students before we committed to a direction. My argument was simple: if the MVP does not land, we need to know whether we were solving the wrong problem or just solving it wrong. Research gives us that. It is not a delay. It is insurance.
The team agreed. We talked to twelve high school students. I was not running the sessions, but I made sure they happened.
What those students told us changed everything.
Turns out, they didn't want what we were building.
02 — THE INSIGHT THAT BROKE THE BRIEF
We went into user research thinking we were building a smarter course platform. We came out knowing we were not building a course platform at all.
The students were clear. Not hostile, just honest. They did not need more courses. They already had school for that. They said things like "I don't know what I'm actually good at beyond my marks." They said "I don't know if I'll like a career until I try it." They said "counseling and aptitude tests feel like a generic advice scam."
Not one student said they needed another course.
That was the moment everything shifted. We had been designing a smarter way to package and deliver educational content. What students actually needed was something closer to an action engine. A way to try things, not just learn about them. Advice, we realized, does not change behavior. Courses are advice. Clubs, challenges, real opportunities with real stakes: those are action.
The pivot was this: stop teaching. Start doing.
We went from course-led to opportunity-led. From passive consumption to active participation. The platform stopped asking "what do you want to learn?" and started asking "what do you want to become?" Every feature across those 50-odd backlog items got filtered through one question: does this help a student do something, or does it just tell them something? If it only told them something, it got cut or pushed to a later version.
The three modules that survived that filter became the spine of the MVP: Discover, Do, Show. Know yourself. Try something real. Make your effort visible.
MU20 is an action engine, not a guidance engine.
That one line became the filter for every design decision that followed.
03 — THE GOALS
Once the pivot was clear, we needed to align on what success actually looked like before touching Figma again. Not features. Not screens. Three questions that had to be true for the MVP to work.
Does a student understand what to do on day one without anyone explaining it to them?
Does the platform help them actually do something, not just browse?
Do they come back on their own?
Everything we designed had to serve at least one of those three. If it did not, it did not ship.
User Goals
Students needed to feel less lost, not more overwhelmed. The platform had to meet them wherever they were: the kid with no idea what they wanted and the kid who thought they did but had never tested it. It had to feel like trying something, not taking a test. And it had to give them something to show for their time, a visible record of effort that was not just a grade.
Design Goals
The experience had to be legible in under a minute. Exploration had to feel low-stakes, almost like play. Reflection had to happen naturally, built into the flow rather than bolted on at the end. And the profile had to feel like something a student would actually want to open again, not a form they filled out once and forgot.
04 — KEY DESIGN DECISIONS
Decision 01: The Psychometric Onboarding
What it is
Before a student touches any feature, the platform asks a set of questions designed with a child therapist's input. Not "what is your dream job?" but "what does the view from your dream room look like?" Not "what are you good at?" but "when you are doing something and time disappears, what is it?" The questions are indirect, imaginative, and deliberately low-pressure. The answers feed a profile vector that shapes every recommendation the platform makes from that point forward.
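To make the mechanism concrete, here is a minimal sketch of how tagged answers could nudge a profile vector. The field names, tags, and weights are illustrative assumptions on my part, not MU20's actual schema.

```typescript
// Hypothetical sketch: how an indirect onboarding answer could feed a
// profile vector. Field names, tags, and weights are illustrative, not
// the actual MU20 schema.

type ProfileVector = {
  natureAffinity: number;      // 0..1, inferred from imagery in answers
  urbanAffinity: number;
  solitudePreference: number;
  flowDomains: string[];       // activities where "time disappears"
};

type Answer = { questionId: string; text: string; tags: string[] };

// Each answer is tagged (by a reviewer or a classifier), and the tags
// nudge dimensions of the vector rather than setting hard labels.
function updateProfile(profile: ProfileVector, answer: Answer): ProfileVector {
  const next = { ...profile, flowDomains: [...profile.flowDomains] };
  for (const tag of answer.tags) {
    if (tag === "nature") next.natureAffinity = Math.min(1, next.natureAffinity + 0.2);
    if (tag === "city") next.urbanAffinity = Math.min(1, next.urbanAffinity + 0.2);
    if (tag === "alone") next.solitudePreference = Math.min(1, next.solitudePreference + 0.2);
    if (tag.startsWith("flow:")) next.flowDomains.push(tag.slice(5));
  }
  return next;
}
```

The point of the shape is that no single answer decides anything; every recommendation downstream reads the accumulated vector.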
The design logic
Standard onboarding questions produce standard answers. A student who has been asked "what do you want to be when you grow up?" for twelve years has a rehearsed answer. It is usually what their parents want or what sounds impressive. Indirect questions bypass that performance layer and surface something more honest. The dream room question tells us whether a student is drawn to nature, to cities, to solitude, to crowds. That is real signal. A checklist of interests is not.
The product logic
The psychometric layer is what makes every downstream recommendation feel personal rather than generic. It also separates MU20 from every EdTech competitor that opens with a multiple choice quiz. For investors, it signaled that the team had thought seriously enough about the intake problem to bring in external expertise and design around it. That level of intentionality in the first screen of the product sets the tone for everything that follows.
Decision 02: Micro Trials
What it is
Micro Trials are short, hands-on experiences that let students do the actual work of a career for two to four hours, without commitment, stakes, or formal enrollment. A student exploring UX design gets a simplified usability task: which button placement reduces friction, which contrast ratio is more accessible, which layout guides the eye more naturally. A student exploring law gets a mini case analysis. A student exploring entrepreneurship gets a problem brief and has to pitch a solution. The AI analyzes how they engage, not whether they got it right, and surfaces skill signals. "You showed strong systems thinking here. Want to try a product management micro trial next?"
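For illustration, a rough sketch of the skill-signal idea: score behavior during a trial, never correctness. The event types, thresholds, and signal names here are my assumptions, not the production logic.

```typescript
// Hypothetical sketch of skill-signal extraction from trial behavior.
// Event kinds, thresholds, and signal names are assumptions.

type TrialEvent = { kind: "revise" | "compare" | "annotate" | "submit"; atMs: number };

type SkillSignal = { skill: string; evidence: string };

function extractSignals(events: TrialEvent[]): SkillSignal[] {
  const signals: SkillSignal[] = [];
  const revisions = events.filter(e => e.kind === "revise").length;
  const comparisons = events.filter(e => e.kind === "compare").length;

  // Iterating before submitting reads as a tendency to refine, not a grade.
  if (revisions >= 3) {
    signals.push({ skill: "iteration", evidence: `revised ${revisions} times before submitting` });
  }
  // Weighing options side by side reads as systems thinking in this sketch.
  if (comparisons >= 2) {
    signals.push({ skill: "systems thinking", evidence: "compared alternatives before deciding" });
  }
  return signals;
}
```

Note what is absent: there is no score field. The output is only ever a statement about demonstrated tendency.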
The design logic
This came from a simple observation: the most-watched career content on the internet is "day in my life as a UX designer at Google." Students do not want to be told what a job is like. They want to feel what it is like. Micro Trials are the closest a platform can get to that experience without actual job placement. The key design choice was removing pass/fail entirely from the evaluation layer. The system does not tell you what you got wrong. It tells you what your behavior revealed about your strengths. That reframe changes the emotional experience of the task from a test to a discovery.
The product logic
Micro Trials solve the single biggest drop-off problem in career exploration platforms: students browse, feel overwhelmed, and leave without doing anything. A trial forces a decision point in the best way possible. It is low enough stakes that students will start it, and engaging enough that they will finish it. The behavioral data it generates is also the most valuable signal in the entire platform. It is not self-reported interest. It is demonstrated tendency.
The gate mechanic connects directly here. After completing trials and selecting a major, students do not get added to a club automatically. They complete a short take-home assessment first. Not a course exam. Just enough friction to confirm genuine interest. If you are not willing to spend thirty focused minutes on this, the club is not right for you yet. That selectivity makes club membership feel earned, which keeps students more engaged once they are in.
Decision 03: The Explorer/Focus Toggle
What it is
At the start of onboarding, students choose their mode. Explorer drops them into an open, curiosity-driven feed of micro trials, quizzes, and career content. Focus fast-tracks students who already have a hypothesis directly to a domain deep dive, skipping the discovery layer entirely.
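The branch itself is one line of logic; everything interesting happens downstream. A trivial sketch, with hypothetical route names:

```typescript
// A trivial sketch of the mode branch. Route names are hypothetical.
type Mode = "explorer" | "focus";

function firstDestination(mode: Mode): string {
  // Explorer lands in an open discovery feed; Focus skips straight
  // to choosing a domain for a deep dive.
  return mode === "explorer" ? "/discover/feed" : "/deep-dive/choose-domain";
}
```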
The design logic
A single onboarding flow that treats all students as undecided creates immediate drop-off for students who already have direction. Forcing an exploratory experience on a student who knows they want to study architecture feels patronizing. Forcing a commitment flow on a student with no idea what they want feels terrifying. The toggle is one question that branches into two completely different emotional experiences downstream. The Explorer side leans into play and serendipity. The Focus side leans into structure and momentum.
The product logic
Drop-off in onboarding is where most EdTech products lose users permanently. The toggle meant that regardless of where a student was in their self-awareness journey, the first experience felt right-sized for them. For investors, it also signaled product maturity: we understood our users were not a monolith, and we had designed for that from the very first screen.
Decision 04: The Wizard of Oz Architecture
What it is
The front-end prototype was designed to feel like an intelligent, personalized AI system. The backend for the MVP was rule-based matching. No real recommendation engine. No trained model. Opportunity lists were manually curated by the team and mapped to user inputs. The experience felt personalized. The mechanism powering it was human judgment dressed as machine intelligence.
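To show the shape of that mechanism, here is a minimal sketch of rules dressed as intelligence. The fields and rules are illustrative; in the actual MVP the lists were curated by hand.

```typescript
// Minimal sketch of rule-based matching behind an "AI" surface.
// All field names and rules are illustrative, not the MVP's actual data.

type Student = { interests: string[]; city: string; budget: "low" | "mid" | "high" };
type Opportunity = { id: string; tags: string[]; remote: boolean; cost: number };

// Hand-written rules stand in for a model: filter by interest overlap,
// then by feasibility, then present the result as a personalized pick.
function recommend(student: Student, pool: Opportunity[]): Opportunity[] {
  return pool
    .filter(o => o.tags.some(t => student.interests.includes(t)))
    .filter(o => o.remote || student.budget !== "low" || o.cost === 0)
    .slice(0, 5); // a short, confident list reads as more personal than an exhaustive one
}
```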
The design logic
We had four weeks and no engineering runway for a real AI backend. The Wizard of Oz approach let us validate the value of personalization before investing in the technology to automate it. The question we were actually testing was not "can AI do this?" It was "do students respond differently when recommendations feel personal?" Answering that question mattered more than the mechanism behind it.
The product logic
Investors at pre-seed are not funding a tech stack. They are funding a thesis. Our thesis was: students disengage when advice feels generic and re-engage when it feels made for them. The prototype proved that thesis experientially without a single ML model behind it. Naming the strategy explicitly, calling it Wizard of Oz, demonstrated that we understood the difference between validating value and building infrastructure. That is a product thinking signal, not a confession.
Decision 05: The AI Pathway Builder
What it is
A student selects a North Star goal: "I want to work at a top design studio," "I want to launch my own company," "I want to get into a competitive university program." The system generates three personalized pathways to reach that goal, one optimized for cost, one for time, one for skill-building. Each pathway is pre-populated with real, vetted opportunities in sequence. Students can edit, swap, and commit to a pathway, which becomes a sticky progress bar throughout the entire platform. The path updates as they complete milestones and can be changed at any time, with a soft nudge from the AI when they try to drop it.
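A rough sketch of the three-variant idea: the same vetted pool, ranked under three different weightings. The weights and fields are illustrative assumptions, not the shipped logic.

```typescript
// Hypothetical sketch of the three pathway variants: one scoring
// function, three weight profiles. Weights and fields are illustrative.

type Step = { id: string; cost: number; weeks: number; skillGain: number };

type Weights = { cost: number; time: number; skill: number };

const VARIANTS: Record<"cheapest" | "fastest" | "deepest", Weights> = {
  cheapest: { cost: -1.0, time: -0.2, skill: 0.3 },
  fastest:  { cost: -0.2, time: -1.0, skill: 0.3 },
  deepest:  { cost: -0.2, time: -0.2, skill: 1.0 },
};

// Rank the vetted pool under one weighting and take a short sequence.
function buildPathway(pool: Step[], w: Weights, length = 5): Step[] {
  const score = (s: Step) => s.cost * w.cost + s.weeks * w.time + s.skillGain * w.skill;
  return [...pool].sort((a, b) => score(b) - score(a)).slice(0, length);
}
```

The same pool, three orderings: that is the whole equity argument expressed as a data structure.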
The design logic
Opportunity platforms fail because they show you everything and help you decide nothing. A student landing on a page of 200 programs does not know where to start. The Pathway Builder inverted that model: you declare a direction first, and the system builds the path toward it. The three pathway variants respected a real constraint: different students have genuinely different circumstances. A student in a smaller city with a tight budget needs a different path than a student in a major metro with time and resources. Designing for those differences explicitly was a choice about equity as much as UX.
The product logic
This was the feature that made investors lean forward. Not because of technical sophistication (it was still Wizard of Oz under the hood), but because it answered the question every EdTech investor is actually asking: does this change what a student does tomorrow? A student who commits to a pathway has made a micro-contract with the platform. That commitment is the foundation of retention, re-engagement, and the B2B monetization model, because every opportunity slotted into a pathway is a potential placement for a paying partner. The pathway is not just a UX feature. It is the commercial spine of the product.
05 — BUILT VS. DESIGNED
The gap between what was designed and what was built is not a failure list. It is a prioritization record. Every item below that did not ship for MVP was a deliberate cut, not an oversight.
Built for MVP | Designed but not developed
Psychometric onboarding with therapist-informed indirect questions | Deep formal assessments: MBTI, OCEAN, aptitude tests
Explorer vs. Focus mode toggle | Personalized recommendation engine
Micro Trials with AI skill signal feedback | Full AI matching of opportunities to student profiles
Silver Lining feedback mechanic | Gamified leaderboards and XP system
Domain deep dives with multimedia sequencing | Shared quizzes and friend challenges
Club major/minor selection with take-home gate assessment | Clubs community feed, posts, collaboration projects
AI Pathway Builder with three pathway variants | Full AI-generated roadmaps with hundreds of opportunities
Opportunity Vault with manual admin curation | Automated partner upload and targeting engine
GitHub-style consistency tracker on profile | Endorsements, rewards, and level system
Growth log chronology | College tools, portfolio builder, application guidance
Atomic Design System handoff | Parent dashboard, counselor tools
The cuts followed one rule: does this need to exist to validate the core loop? Discover, Do, Show. If it did not serve that loop directly, it went to V2.
For every cut, I can articulate exactly what I would build first in V2, why, and how I would measure whether it worked.
06 — THE OUTCOME
In January 2026, the founders walked into a pre-seed funding meeting with our high-fidelity prototype and early beta data. They walked out with investment.
The design did specific work in that room. The psychometric onboarding reframed what intake could feel like for a teenager: not a form, not a test, but a conversation. The Micro Trials demonstrated a new category of career exploration that did not exist anywhere else in the market. The AI Pathway Builder made the abstract concrete: here is your goal, here is the path, here is what you do Monday morning. And the Opportunity Vault gave investors a tangible picture of the revenue model.
I concluded the contract by handing off a scalable Atomic Design System, with Quests, Opportunity Cards, Profile components, and Pathway modules fully componentized. The engineering team could begin building immediately without a redesign sprint.
The work I am most proud of is not any single screen. It is that the product we shipped to investors was fundamentally different from the product we started designing, because we stopped to ask twelve teenagers what they actually needed.
This project taught me something I will carry into every engagement after it: the most dangerous moment in a product sprint is when the team mistakes momentum for direction.
We had energy. We had features. We had a deadline. What we almost did not have was a reason to build any of it. Thirty minutes of honest conversation with twelve teenagers changed the entire product. Not because the research was sophisticated. But because we actually listened to what students said instead of what we assumed they needed.
The Micro Trials concept came from watching reels. The psychometric onboarding came from refusing to accept a standard form as good enough. The Pathway Builder came from asking what a student would actually do the next morning after using the platform. None of those ideas came from the backlog. They came from staying curious about the person on the other side of the screen.
Designing for funding is a unique constraint. You are simultaneously solving for the user sitting in front of the product and the investor sitting across the table evaluating it. The best decisions I made on this project were the ones that served both without compromising either.
The goal was never to build a platform. It was to build a mirror, one that showed a teenager something true about who they were becoming.