ABC MOUSE ASSESSMENT CENTER

The Assessment Center allows children to take quick tests in several grade levels to assess their aptitude in Math and Literacy skills. If a child needs help in a skill, activities in ABCmouse are instantly assigned to the child to help them improve. The parent is given data on how the child performed that can facilitate further action to help the child along in the learning process.

 

PROBLEM

The Assessment Center aimed to be a product families would use on a regular basis, constantly gauging how their children were progressing. However, engagement by both adults and children was not where we wanted it to be. Many users started but never completed an assessment, let alone multiple assessments or a reassessment when the child did not do well. Parents couldn’t find enough time in the day; children became easily distracted and lacked motivation to complete the assessments or recommended activities.

 

SOLUTION

Creating a more guided experience by breaking the assessment process into three simple steps (a formative cycle of assess, review, practice), as well as giving parents more actionable data about how the child did, was essential. Adding icons that indicated whether an assessment could be conducted with or without adult supervision helped parents gauge how hands-on they would need to be.

For the child, we applied an overall facelift, making things bolder and more colorful to hold their attention. Additionally, simply removing the distracting side nav made it easier to understand what options were available. We used the child’s avatar as a progress indicator to make the experience more personal. Most importantly, we created a completion screen that gave the child a greater sense of accomplishment and was more actionable.

These design changes drove a 95% increase in assessment completions and a 65% increase in reassessments. In short, repeat visits grew dramatically, and the formative-cycle approach made the process more intuitive.

 

DISCOVERY


USER SURVEYS, INTERVIEWS, AND TESTING

Quantitative questionnaires allowed us to understand the “who”: who makes up our audience, including age, income, and other pertinent demographics.

Qualitative interviews revealed the “why”: why they purchased the product and why they used it the way they did. A surprising finding was that a percentage of our users wanted to use the assessment product within their homeschooling environment. This reinforced our desire to make the Assessment Center a ubiquitous product in a parent’s or teacher’s toolbox. This use case was then integrated into one of our three personas.

Preliminary usability tests were conducted on the current design to see where the child experience could be improved. I developed scripts that were handed to child specialists, who moderated the child sessions. The feedback reiterated our assumption that the process needed to be more engaging and game-like for the child.

 

PERSONAS

Three personas were generated based on the common goals and characteristics revealed in our previous research. Personas included both adult and child overviews since both were so tightly knit. The personas were informed by the amount of engagement a parent was able to have in their child’s process as well as the age and challenges of the child.

 

USER JOURNEYS

The journey map allowed us to explore our users’ motivations throughout the assessment process. Again, unique to our process, we formulated motivations and thoughts for both child and parent. Another aspect of our journey was that there were two different ways of approaching a series of assessments: a user could take an individual skill assessment or a full grade-level assessment. Each path demanded slightly different steps from the user but ultimately led to similar goals. It became apparent from this document that we needed to merge these two paths into one simplified process, making it faster and more efficient to get results.

 

DESIGN

The assessment UI went through several incremental changes focused on improving both the parent and child experiences. Ultimately this yielded a more time-efficient, easier-to-use product. These improvements resulted in a doubling of assessments completed and a substantial increase in lessons taken. Simply put, they yielded higher user engagement and more time spent in the product.

USER FLOW

Referring to our journey-mapping exercise, we were able to prioritize and simplify the user’s path to completion, stripping out redundancy and potential confusion. Easing friction along the user flow created a product that parents were more apt to use, not only at key times of the year (end of semester…) but anytime a parent wanted to learn more about their child’s progress in a subject.

 

CONCEPTUAL WIREFRAMES

Although we took an incremental approach to upgrading the interface, I also generated forward-thinking wireframes to help guide us to the next phase of design. These emphasized a more guided approach to starting an assessment, then provided the parent with actionable feedback after assessing to help complete the formative cycle.

Paper or InVision prototypes were tested to verify that the improvements were easy to understand and benefitted our users.

 

COMPLETION SCREEN

Here you can see several variations on the completion-screen upgrade. The far-left version is the legacy design. New versions of this screen and others went through simple paper prototyping to reveal whether information placement and hierarchy were clear and scannable for both child (during self-guided assessments) and parent. Special attention was paid to communicating how a child did without making it too stressful for the child. In addition, we wanted next steps to be as obvious as possible. The design on the far right is the one we went with: a “recommended” next step in the lower right of the screen helped guide users to where we thought was best for them, depending on the score.

DATA VISUALIZATION EXPLORATION

A variety of visualization techniques were developed and tested with users to see which conveyed data most directly. Card-sorting exercises were performed to understand which data mattered most to users; we then explored different ways to efficiently convey the most important information. Interestingly, the most “no frills” approaches attracted users the most: simply showing color-coded numbers made for an easy-to-parse interface.


REPORTING UI EXPLORATION