Smash.technology · 2023 · 5 months

Salvaging the Smash MVP.

A stalled LMS, an offshore team on a waterfall contract, and five months to make it ship. I renegotiated the vendor SOW, led a platform migration, and built the product from scratch as the only designer on the team.

Dev costs saved: $220K+ · SOW renegotiation + Open edX migration
NPS swing: −17 → +43 · 60-point improvement, beta to GA
MVP delivered: 3 months · on time, on budget; 2 months of testing and iteration followed

What I did

  • Renegotiated the vendor SOW: moved from waterfall to sprint-based delivery with explicit acceptance criteria.
  • Led migration to Open edX (open-source LMS), eliminating $220K+ in custom build costs.
  • Cut three in-flight features (Smash Score, native chat, interviews) before they shipped broken.
  • Built a token-based design system on Ant Design before redesigning any screens.
  • Overhauled the IA and redesigned events discovery from form-based search to a browsable card grid.
  • Ran usability sessions throughout with active students; incorporated findings directly into shipped decisions.

Salvaging the Smash MVP: five months, one cut list.

Client: Smash.technology
Year: 2023
Role: Founding Designer
Tags: Product Strategy · Design Systems · UX Research
The setup

A learning-management system aimed at corporate L&D teams. Eighteen months of offshore development, a waterfall contract, three engineering teams who had never seen each other's tickets, and an MVP that beta users described (politely) as confusing. I joined as founding designer with five months of runway and a single brief: make it shippable.

$220K+ Dev costs saved

SOW renegotiation and Open edX migration together. Not by cutting corners. By cutting scope and rebuilding on stable ground.

+43 NPS at GA

From a beta NPS of −17 to a general-availability NPS of +43, a 60-point improvement in six weeks post-launch.

3 mo. MVP delivered

Shipped on time and on budget. The remaining 2 months of the contract went to testing and iteration.

Three-stage design evolution: wireframe, first redesign, and final Open edX product shown at an angle
Fig. 01 · The design arc: wireframe through first redesign to the final Open edX build. Three distinct phases across two years.
§ 01

The problem

Three things were true at once. The product had real ambition and a thoughtful pedagogy underneath. The build was bloated with half-finished features no one could remember commissioning. And the offshore team, capable but under-briefed, had been shipping against tickets that lived in a spreadsheet without context.

Beta users weren't bouncing because the product was bad. They were bouncing because they couldn't tell what the product was. The platform was trying to be a learning tool and a hiring pipeline simultaneously, communicating neither clearly.

"I'm not sure what to do here. There aren't any real interviewers. Do I have to learn something to get a real interview?"

That quote, from a first-time user navigating the platform alone, was the clearest diagnosis I had. The product wasn't legible.

§ 02

Constraints

These shaped every decision I made.

  • No internal engineering team; all development outsourced to an offshore shop under a waterfall contract. I couldn't walk over to an engineer's desk.
  • Budget fixed and shrinking. Every week of delay cost real money against a finite runway.
  • No existing design system, no design debt documentation, no component inventory. Starting from zero artifacts.
  • Two interns, no other designers. I was the entire design function.
  • Active users already in live cohorts. I couldn't throw everything out and start over. Migration and continuity mattered.
§ 03

Research & discovery

Before touching a wireframe, I spent the first two weeks in discovery, understanding what was actually broken from the user's perspective, not the roadmap's.

  • Stakeholder interviews with the CEO and team leads to map the gap between what was promised to partners and what the product actually did.
  • User interviews with 5–10 active students, focused on their mental models: what did they think Smash was for, and where did it fall short?
  • Heuristic evaluation of the existing product: every flow documented for context-switching failures, IA inconsistency, and misleading UI.
  • Competitive audit of eleven comparable platforms across three axes: learning & upskilling, real-world projects, and hiring enablement.

The research produced a clear primary persona: a student who understood she needed career help but had no idea how to navigate the industry landscape or what platforms like this were even for.

'Sanya Student' user persona — 22, undergrad, humanities major, wants to begin a satisfying career within 2 years

One of the personas we used

Part of the discovery was understanding what students were used to seeing in LMS platforms, specifically what "normal" looked like to them before they arrived at Smash.

Canvas LMS dashboard
Canvas
Docebo LMS dashboard
Docebo

The competitive audit revealed a genuine opening: no competitor was doing all three (meaningful learning, real industry projects, and direct hiring pipelines) in a single integrated platform. That framing became the foundation for every scope decision. Features that didn't serve the learning-to-hiring pipeline got cut. Features that strengthened it got prioritized.

§ 04

The strategy

Success required more than new screens. It required a structural overhaul of how the product was built, starting with the contract.

I spent the first two months functioning as a technical product manager as much as a designer. The design problems were downstream of process problems, and nobody else was positioned to fix them.

  • SOW renegotiation. Rewrote the Statement of Work with the offshore dev shop, replacing the open-ended waterfall arrangement with an agile framework, sprint-based delivery milestones, and strict acceptance criteria.
  • First redesign attempt. In parallel, I began a full visual redesign on the existing custom platform, building a proper design system, overhauling the IA, and laying the groundwork for a browsable events experience. This work proved the direction was right, even if the platform ultimately couldn't carry it.
  • Platform migration. Partway through the first redesign, I advocated for and led migration to Open edX (the open-source LMS) instead of continuing to build custom infrastructure. A stable, extensible foundation at a fraction of the cost, but one that brought its own visual and technical constraints.
  • Financial impact. The contract restructure and migration together saved over $220,000 in development costs and reduced time-to-market by approximately three months.
§ 05

What I killed

At a startup with a fixed runway, every yes is a no to something else. Some of the most important design work I did at Smash was deciding what not to build.

  • The Smash Score. A personalized metric comparing a student's skills against hiring requirements, controlling access to projects and introductions. User research killed it. Students didn't trust it ("What happens if it's wrong?") and a mysterious number making decisions about their futures felt fundamentally unfair. I removed it entirely and replaced access eligibility with explicit, transparent criteria.
  • Native chat. Too technically unstable to ship reliably. Cut it and pivoted to Microsoft Teams as the communication layer, adopting an existing tool students already had rather than competing with it.
  • Interviews and mentors. Both planned, both deferred explicitly. We couldn't deliver them at a quality level that matched what they promised. Shipping a half-built mentorship feature would have damaged trust more than not shipping it at all.

The pattern across all three: I wasn't cutting because they were bad ideas. I was cutting because shipping a focused, trustworthy product was worth more than shipping an ambitious, unreliable one.

§ 06

The events redesign

The events discovery flow is where the design evolution is most visible. It went through three distinct states across two years.

The original. The inherited events page required students to fill out a search form — event name, category, state, city, date — before seeing any results. "Fill out any of the fields below to start searching." One student described the experience exactly: "I only see what I search for, but how do I see everything to know what to search?" It was a search-first paradigm applied to a browsing problem. Students didn't know what kinds of events existed, so they didn't know what to search for, so they found nothing and gave up.

Original Smash events page showing a blank search form with fields for event name, category, state, city, and date — no results visible
Fig. 02 · The original events page. "Fill out any of the fields below to start searching." Students had to know what they were looking for before they could find anything.

The first redesign (2022). I replaced the form with a card grid — all available, eligible events displayed upfront, browsable without any prior search input. Clicking a card opens a popup overlay with event details and the registration action, without navigating away from the grid. The background blurs behind the open card, signaling that clicking outside will dismiss it. This interaction pattern was established during the first redesign and carried forward into the final product.

First redesign events page showing a card grid of events with a popup detail overlay for Microsoft Azure 101, blurred background behind the card
Fig. 03 · First redesign: card grid with popup detail overlay. The blur signals dismissal by clicking outside. Microsoft, Netflix, and Amazon visible as hiring partners.

The Open edX build (shipped). The platform migration to Open edX brought component and layout constraints that required adapting the first redesign's visual direction. The core interaction logic (card grid, popup detail, dismiss by clicking outside) remained. The aesthetic changed to reflect both the Open edX framework and the expectations of the corporate partnerships we were building toward. It shipped in 12 weeks, on time and on budget.

Final shipped Open edX events page showing a card grid of corporate events including Microsoft and Amazon
Fig. 04 · Final shipped product: Open edX events page. Same browsing logic as the first redesign; visual direction shaped by platform constraints.
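The interaction logic is simple enough to sketch in code. Below is a minimal, hypothetical React/TypeScript illustration of the pattern, not the shipped Open edX code; the SmashEvent type, layout values, and styling are all assumptions made for the sketch.

```tsx
import React, { useState } from "react";

type SmashEvent = { id: string; title: string; partner: string };

export function EventsGrid({ events }: { events: SmashEvent[] }) {
  // The card currently open in the detail overlay, if any.
  const [open, setOpen] = useState<SmashEvent | null>(null);

  return (
    <>
      {/* All eligible events are visible upfront: browse, don't search. */}
      <div style={{ display: "grid", gridTemplateColumns: "repeat(3, 1fr)", gap: 16 }}>
        {events.map((e) => (
          <button key={e.id} onClick={() => setOpen(e)}>
            {e.title} · {e.partner}
          </button>
        ))}
      </div>

      {open && (
        // Full-screen backdrop: blurs the grid and dismisses on outside click.
        <div
          role="presentation"
          onClick={() => setOpen(null)}
          style={{ position: "fixed", inset: 0, backdropFilter: "blur(6px)" }}
        >
          {/* stopPropagation keeps clicks inside the card from dismissing it. */}
          <div role="dialog" onClick={(ev) => ev.stopPropagation()}>
            <h2>{open.title}</h2>
            <p>Hosted with {open.partner}</p>
            <button>Register</button>
          </div>
        </div>
      )}
    </>
  );
}
```

The detail worth noting: the overlay never navigates away, so dismissing it returns the student to exactly the grid position they left.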
§ 07

How it shipped

We shipped the MVP in 12 weeks, on time and on budget. The following 8 weeks went to testing and iteration: product demos and sessions with a limited set of students to pressure-test what we'd built.

Six weeks after GA, NPS was +43 against an internal target of +20, up from a beta score of −17. One student's post-launch response captured what the numbers reflected: "I really like that I only see what works and that it's clear now."

§ 08

What I'd do differently

The Open edX pivot should have happened sooner. The time and cost savings were so significant that an earlier move could have materially changed the company's trajectory. I understood the value; I didn't push hard enough, fast enough. The first redesign work wasn't wasted (it proved the design direction), but it could have been applied directly to the right platform from the start.

I'd make the cut list a public artifact earlier. Half the founder's reluctance to cut features came from never having seen them stack-ranked against each other. Once the doc existed, the conversation took two days; without it, it had been stuck for months.

I'd also start usability sessions in week one, not week three. The screens I redesigned in weeks 1–3 were the ones I redesigned again in weeks 8–10. I hadn't yet watched anyone use them.

[ Case study № 01 ]

Salvaging the Smash MVP. Five months. One cut list.

Client: Smash.technology
Year: 2023
Role: Founding Designer
Duration: 5 months · 2 mo. strategy, 3 mo. build
Stack: Figma · Ant Design · Open edX · Linear · Notion
Outcome: $220K+ saved · NPS −17 → +43 · 3 contracts

Problem

Eighteen months of offshore development. Three engineering teams. A waterfall contract written before there was a product. An MVP that beta users described as confusing. Active students in live cohorts who couldn't describe what the platform was for.

Needs

Ship in five months. Recover credibility with the founder's first three corporate prospects. Stop the dev cost bleed. Make it possible for the offshore team to finish without rework. Don't break the live cohorts mid-semester.

Solution

Renegotiate the contract before writing a line of design. Migrate to Open edX before designing a component. Cut three in-flight features before building any new ones. Design system before screens. Card grid before search forms.

Less product, more clearly explained.

Three-stage design evolution shown at angle: wireframe, first redesign with Smash branding, and final Open edX product
Fig. 01 · Wireframe → first redesign → Open edX final. Three distinct phases; the design direction was established in phase two and executed on a different platform in phase three. Smash.technology · 2022–2023

Competitive audit

11 platforms · 3 axes
Platform | Learning & upskilling | Real-world projects | Hiring enablement | Notes
Smash | Integrated coaching & progress tracking | Live industry projects with companies | Direct pipelines to hiring partners | Full-stack integration
Riipen | Depends on partner institutions | Course-based company projects | Indirect; visibility, not direct matching | Needs educator buy-in
Forage | Structured virtual experiences | Simulated employer-branded tasks | Brand awareness, not placement | Asynchronous + self-paced
Acadium | Apprenticeship training | Micro-internships with small businesses | Focused on freelance & early-career roles | Niche, limited in scale
SV Academy | Full-time training cohorts | Internal or simulated projects | Placement into partner companies | Specific to sales roles
Handshake | Light skill development | Few real-world experiences | Job and internship board | Widespread adoption, weak on depth
Piazza Careers | Light engagement, mostly forum-based | Minimal project integration | Employer discovery & outreach | Recruitment layer
Coursera / edX | Robust credentialed learning | Occasional capstones or simulations | No direct job placement | Strong academic partnerships
NovoEd | Project-based learning | Mostly internal learning simulations | No external hiring connections | Used in corporate settings
Turing / Karat | No learning component | Simulated coding challenges | Job matching for vetted candidates | Focused on engineering
LinkedIn | Passive skill building | No structured projects | Massive hiring network | Broad but shallow integration

No competitor was doing all three — meaningful learning, real industry projects, and direct hiring pipelines — in a single integrated platform. That gap became the foundation for every scope decision I made.

Decision 01 · Contract first

Renegotiate the SOW before designing anything.

Replaced the open-ended waterfall arrangement with sprint-based delivery and explicit acceptance criteria. The design problems were downstream of process problems. Those had to move first.

Trade-off

Two months before a pixel moved. Hard sell to a founder watching runway burn. But every design decision I made later was only possible because the contract allowed it.

Decision 02 · Platform migration

Migrate to Open edX mid-redesign instead of finishing on the custom platform.

I was partway through a first full redesign when the case for migrating became undeniable. Open edX: open-source, with comprehensive LMS features already built. The choice freed engineering hours for features that actually differentiated the product and saved $220K+ in build costs.

Trade-off

Constrained the visual design to Open edX's component model. The first redesign's visual direction had to be adapted, not transplanted. The final aesthetic was also shaped by an in-progress Microsoft Learning partnership: we gave two product demos to their team, came close to landing the contract, and never heard why we didn't. Some constraints aren't design decisions at all. Worth every trade-off, though I should have pushed for this move three months earlier.

Decision 03 · Kill the Smash Score

Remove the personalized eligibility metric before it shipped.

User research made the call. "What happens if it's wrong?" Students didn't trust a number they couldn't interrogate making decisions about their access to real opportunities. I replaced it with explicit, transparent eligibility criteria.

The Smash Score wheel showing 78 with a 10% progress indicator — the feature that was cut
The killed feature
Trade-off

Lost the product's marquee differentiator. Gained trust. An eligibility system that's legible and fair matters more than one that's impressive but opaque.

Decision 04 · System before screens

Built on Ant Design before redesigning any product screens.

Comprehensive component library, strong data display patterns, thorough documentation that reduced ambiguity in remote handoffs. Customized color, type, and card patterns on top. Adapted again for Open edX's native React components after the migration. A minimal token sketch follows this decision.

Design system preview showing IBM Plex Sans type scale, teal color ramp, tag components, and dashboard widgets
Type · color · components
Trade-off

Visually constrained to Ant's model early on. A more bespoke system would have been more impressive, and would not have shipped in the window we had.
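To make "token-based on Ant Design" concrete, here is a minimal, hypothetical sketch using Ant Design v5's ConfigProvider token API (a v4-era setup would express the same tokens as Less variables instead). The token values are placeholders, not Smash's actual palette, though the shipped system did use IBM Plex Sans and a teal ramp.

```tsx
import React from "react";
import { ConfigProvider, Card, Tag } from "antd";

// Design tokens defined once and consumed by every Ant component.
// Values are illustrative placeholders, not the production palette.
const tokens = {
  colorPrimary: "#0d9488",
  fontFamily: "'IBM Plex Sans', sans-serif",
  borderRadius: 8,
};

export function ThemedApp({ children }: { children: React.ReactNode }) {
  return <ConfigProvider theme={{ token: tokens }}>{children}</ConfigProvider>;
}

// Usage: any Ant component below picks up the tokens automatically.
export const EventCard = () => (
  <ThemedApp>
    <Card title="Microsoft Azure 101">
      <Tag color="processing">Hiring partner</Tag>
    </Card>
  </ThemedApp>
);
```

The point of the token layer is that a palette or type change happens in one object, not across every redesigned screen, which is what made the later Open edX re-theming survivable.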

Decision 05 · Card grid

Replaced form-based event search with a browsable card grid and popup detail overlay.

Students didn't know what kinds of events existed, so they didn't know what to search for. The original page led with a blank form. The first redesign replaced it with all eligible events displayed upfront; clicking a card opens a detail overlay with a blurred background, dismissible by clicking outside — without losing grid context. The Open edX version carried this interaction pattern forward within platform constraints.

First redesign showing events card grid with Microsoft Azure 101 popup overlay and blurred background
First redesign: interaction pattern
Trade-off

More surface area to design and maintain. Required building a real filtering system to replace the old search form. Worth it: the core flow went from opaque to instantly legible, and the interaction pattern held across two platform generations.

Decision 06 · Geo-filtering

Filtered in-person events to within 100 miles of the student's location.

Goal: reduce noise. Show students events they could actually attend, not everything that existed. Remote-only events show no map; in-person events show a venue thumbnail. A sketch of the distance filter follows this decision.

Trade-off

May have hidden opportunities some students would have wanted regardless of distance. A judgment call that deserved more data than I had at the time.
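For illustration, a minimal sketch of how such a filter might work: great-circle (haversine) distance between the student and each venue, with remote events always passing. All names here are hypothetical; the shipped implementation may have differed.

```ts
type Coords = { lat: number; lon: number };

const EARTH_RADIUS_MI = 3958.8;
const EVENT_RADIUS_MI = 100;

// Haversine great-circle distance between two points, in miles.
function distanceMiles(a: Coords, b: Coords): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_MI * Math.asin(Math.sqrt(h));
}

// Remote events (no venue) always pass; in-person events must be nearby.
function visibleEvents<E extends { venue?: Coords }>(
  events: E[],
  student: Coords
): E[] {
  return events.filter(
    (e) => !e.venue || distanceMiles(student, e.venue) <= EVENT_RADIUS_MI
  );
}
```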

If I were to do it again

  1. Push for the Open edX migration earlier. The savings were so significant that an earlier move could have materially changed the company's trajectory. I understood the value; I didn't advocate hard enough, fast enough. The first redesign work wasn't wasted, but it could have been applied to the right platform from day one.

  2. Make the cut list a public artifact from day one. Half the founder's reluctance to cut features came from never having seen them stack-ranked next to each other. Once the doc existed, the conversation took two days. Without it, it had been stuck for months.

  3. Start usability sessions in week one, not week three. The screens I redesigned in weeks 1–3 were the ones I redesigned again in weeks 8–10. I hadn't yet watched anyone use them. Even two sessions earlier would have changed my starting assumptions.

Up next

№ 02 / MoodiBoard.
