CASE STUDY
Project Overview
KEY METRICS
28% increase in on-time course completions after launch
45% faster resolution of overdue training
4.6/5 average rating in post-launch feedback
Status: Shipped
Role: Lead Product Designer (solo)
Year: 2023
Product: BuildWitt (B2B LMS Platform)
Audience: Construction admins and field crews
Skills: UX & UI Design, Information Architecture, User Research, Prototyping, Growth Strategy
Background
BuildWitt Training is a construction-focused Learning Management System (LMS) that helps crews upskill and stay compliant. It offers Learning Plans, Courses, and Lessons focused on workforce readiness and safety. However, the platform lacked any meaningful analytics—making it difficult for admins to monitor progress, identify issues, or take corrective action.
The Problem
BuildWitt Training provided almost no analytics or reporting, limiting the ability of administrators to track and evaluate employee progress.
Admins needed:
A way to monitor training progress across teams
Insight into how learners were performing on quizzes
Visibility into expiring or expired certifications
A fast way to identify employees who were falling behind
The Goal
Design and deliver a robust but easy-to-use analytics experience that would empower admins to make data-driven decisions, improve training outcomes, and reduce time spent manually piecing together reports.
Research and Insights
To understand admin pain points in depth, I conducted user interviews and task-based research sessions with BuildWitt customers. I also audited competing LMS tools to evaluate standard reporting patterns and emerging best practices.
Key Findings
Admins were managing training progress with spreadsheets due to a lack of built-in reporting
Quiz performance data was only accessible through raw exports
Monitoring expired or expiring certificates was time-consuming and error-prone
Admins wanted a central dashboard to surface risk and take quick action
User Flows & Journey Maps
I created detailed user flows and a journey map to outline the admin experience—from logging in to identifying and addressing training gaps. This helped align the product team and prioritize which analytics views were most critical.
Success Metrics
To measure impact, we aligned on the following KPIs:
Feature adoption: Total sessions on analytics screens within 90 days of launch
Course completion rate: Pre- vs. post-launch comparison
Task efficiency: Time to identify overdue learners
User satisfaction: Measured via post-launch survey and qualitative interviews
Ideation
Using the insights from research, I sketched multiple dashboard concepts that emphasized scalability and actionability. These ideas were refined into wireframes, then high-fidelity designs, which I prototyped and tested with real users.
User Testing
I conducted usability testing with admins across different construction companies. The feedback was overwhelmingly positive and led to improvements in copy clarity, filter controls, and table sorting.
Final Design
Conclusion
After launch, the analytics experience delivered a 28% increase in on-time course completions, 45% faster resolution of overdue training, and an average feedback rating of 4.6/5, giving admins the visibility into progress, quiz performance, and certifications they had previously pieced together by hand in spreadsheets.