A technology and data-first approach to impact and education
While at EVERFI, our team wanted clearer direction for design priorities in upcoming quarters. To provide it, I was tasked with conducting a detailed audit and competitive analysis of the learner and teacher journeys on our platform. Afterward, I researched and created initial designs to guide future direction and exploration.
Survey & Assessment Journey for Learners
We know that we present learners with a variety of surveys, pre-assessments, and confirmation screens before they start our courses. Within that variety are inconsistencies in display and interaction. This work is about understanding the current learner journey from beginning of course to finish, exposing opportunities for improvement, and creating design recommendations.
I first conducted a thorough audit of the current learner flow from beginning to end in at least five courses to better understand journey segmentation, how flows and click counts differ across courses, and the issues with each. Next, I took a visual inventory of how our surveys and assessments present, summarizing opportunities: what extra clicks, inconsistencies, and interaction patterns might we address in future iterations? Finally, I designed recommendations and deltas for future exploration, which I presented to the team.
While mapping the current learner flow for our courses from start to finish, we learned that a student could interact with as many as 24 pages outside of their course, including a redirect page, pre-survey notice, pre-survey, pre-assessment, post-assessment, post-survey, and post-survey redirect.
With this in mind, I dissected each segment to find ways to decrease the average number of pages a learner interacts with outside of their course. Recommendations include making the first page a learner sees a dismissable modal that explains the pre-survey, using a pop-up to explain the pre-survey, and adding a finish button in place of an extra completion page for all pre- and post-assessments. Click here for the full flow map.
Through a visual inventory of our current surveys and assessments, we found many opportunities to update our designs to best align with our current and future design systems.
Moving through the full user experience reveals visual dissonance: each section follows a separate set of design rules. Ensuring all surveys and assessments use our design system's components and best practices will create a more cohesive course and platform experience for learners.
Pre-survey landing page before and after
We identified issues with legibility, outdated design components, and designs uninviting to K12 and teacher audiences.
Unanswered pop-up question page before and after
By using Unified Design Language components, left-aligning the text, implementing a split-screen layout, and incorporating course imagery, we resolved the previously identified issues.
Pre & post survey page before and after
To improve the survey experience for students, we should update the design to align with the platform's current design system and streamline the process. This means using our updated design system styles, fonts, components, and survey template. Additionally, to spare students long stretches of scrolling, they should be able to "return to survey" when they finish, in case they want to recheck any answers. Replacing the separate pre-survey completion page with a pop-up modal would further streamline the process.
Future areas for exploration include a mobile-friendly, responsive survey and assessment experience; due to time constraints, I was unable to pursue these further.