Delivering on Education

Everfi

2022-2024
From kindergarten to retirement, Everfi builds a compelling digital curriculum that demands thorough research and design planning to engage users. Everfi had a storied list of bespoke educational courses, but most patterns and processes were not well documented. Many inefficiencies and gaps in the cross-functional team process caused delays in onboarding and execution.
Design Lead
Art Direction
UX Design
Product Design
Problem
  • Platform and course product teams used an extensive library of bespoke layouts and rules that were referenced through tribal knowledge.
  • 'Styles' were built with entirely different design systems despite serving oddly similar user needs.
  • Unmoderated user testing was the only insight into prototype usage, and it was built on dated third-party tools that provided no end-user data.
Discovery
Cross-Comparison
As part of my onboarding, I audited the courses and their varied design systems. Teams had been solving similar problems by borrowing solutions without fully adopting templates.
Stakeholder interviews
Many courses and platforms were revamps of previous builds, updated to meet modern design styles. Designers have unique access to SMEs, PJMs, and external resources, so I took full advantage of this to build out learner profiles.
Process
Pain points
Younger users were speed-running courses and not retaining information, while elective learners wanted less 'gamified' content. The solution was finding common ground: cohesive patterns that were engaging without overwhelming users.
Structure & Iterate
Previous teams had built bespoke patterns to solve outdated user goals, and most solutions were iterations of the same concept. My goal was to identify deprecated patterns and rebuild them with auto-layout formatting.
Research
Actionable data usage
Depending on the course, a user's age could range anywhere from 12 to 75. Testing was previously limited to unmoderated sessions on Trymata. I built a business case for developing in-house testing that would give current and future builds real-world data we could act on. Persistence paid off: my team cut 4 hours from our intake process, which freed up product hours and let us secure moderated testing with end users. This gave us a better gauge of how the product was being implemented and shared, so we could update current builds and apply insights to future ones.