Redefining the assessment
experience for the masses

Product Redesign · User Research · Testing
Introduction

In early 2022, after two years of consistent effort, we shipped a brand-new experience for the Mercer Assessment Platform, which is now used by more than 400 universities, colleges, and schools across 100+ countries.
This involved a complete rework of the assessment flow, candidate registration, proctoring, and accessibility tools, while addressing the unique challenges identified through our research.

My Role
Responsible for research, conceptualisation, design, user testing & delivery of key modules and feature areas.
The Team
2 designers, 3 product managers & 10+ engineers
Timeline
May 2020 to March 2022
Introducing

Mercer Assessment Platform

The Product

The Mercer Assessment Platform is an online examination system for conducting virtual exams and assessments. It offers a seamless, end-to-end assessment experience by integrating various evaluation methods, such as cognitive tests, behavioral assessments, simulations, and video interviews, to name a few.

It is a core product in the ever-expanding Mercer Assessment Suite.

WHY IS IT NEEDED

The entire assessment suite was to be re-created to achieve essential product goals and enable the assessments of tomorrow to be conducted in a more cohesive and productive way.
Because of the sub-optimal experience it offered and its business criticality, the Assessment Platform became the first product in the suite to be redesigned for the next generation.

Starting point

We started with an expert evaluation to compile a list of usability issues, identify the sections we needed to focus on, and learn how we could use that knowledge to optimise our platform and make it user-friendly and accessible to all.

WHERE WE CONCLUDED

In early 2022, after two years of consistent effort, we shipped a brand-new experience for the Mercer Assessment Platform, which is now used by more than 400 universities, colleges, and schools across 100+ countries.
This involved a complete rework of the assessment flow, candidate registration, proctoring, and accessibility tools, while addressing the unique challenges identified through our research.

Chapter 01

Understanding gaps & frustration

user RESEARCH / Personas

Candidate / Test-taker / Participant

Our users go by many names, but they are the people who use the assessment platform to take different types of tests, including competitive entrance exams, university exams, job assessments, and talent development surveys.

Age range: 19 to over 60 years old

user research / conclusion

👍🏽 Strengths of Platform

The platform excels in several areas, providing a strong foundation for an intuitive user experience:

Minimalist & Aesthetic Design

A clean and well-structured interface prevents cognitive overload and enhances readability.

Error Prevention & Recovery

The system effectively prevents errors and provides clear, user-friendly messages to help users recover when issues arise.

Progress Visibility

Features like timers and progress bars help users manage their time efficiently.

user research / conclusion

👎🏽 Challenges & Areas for Improvement

Despite its strengths, the platform has several usability issues that need to be addressed:

Lack of Clear System Feedback

Candidates struggle to understand system status due to insufficient real-time updates on progress, data checks, and response saving.

Inconsistent Navigation & UI Standards

Button placements and interaction flows vary across sections, making navigation unintuitive.

Registration & Personal Data Concerns

Candidates feel uncomfortable providing personal information without understanding why it is required.

Instruction Overload & Readability Issues

Long and dense instructions overwhelm candidates; breaking them into shorter, more visually engaging chunks will improve comprehension.

Limited User Control & Flexibility

Candidates lack clear exit options when performing unintended actions, making the process rigid and frustrating.

Poor Visibility of Remaining Questions

Candidates are unable to track how many questions are left, increasing anxiety.

Inefficient Submission Process

The final submission page lacks clarity on next steps and confirmation messages, leading to uncertainty.

Mobile Usability & Performance Issues

The platform struggles with performance on smaller screens, including slow load times and poor touch interaction design.

Color Contrast & Readability Concerns

Low contrast between text and backgrounds reduces readability, especially for key action items like buttons and instructions.

user research / conclusion

Concluding research

The studies we conducted summarised both strengths and the critical improvement areas to address in our redesign.

Usability challenges like inconsistent navigation, unclear system feedback, and mobile performance concerns hindered an optimal candidate experience.

These gave us a good starting point for figuring out how to solve the raised issues and proceed.

To dig deeper into the research that led to these conclusions, please drop a mail to
Chapter 02

Placing components & Designing flow

Design / User flow

Updated candidate flow

We started by ideating on a new journey with updated artifacts that would help us solve each pain point and create a more seamless, better experience for candidates.

We constrained ourselves to keep the assessment flow as close to the existing one as possible, changing it only where necessary. We then iterated multiple times with the help of product managers to make the flow intuitive and concise, yet contextual.

Design / INFORMATION ARCHITECTURE

Updated Information Architecture

We also came up with a robust IA to understand the content and information we would be placing, depending on which section the candidate is in.

This final Information Architecture, showcasing the data and action items, was developed over multiple iterations during the sketching and discussion phases.

Design / Low-Fidelity

Exploring ideas

We started exploring different ideas and thoughts — some based on the sketches, some on pure whim — just to get a picture of what would work and what wouldn't.
Each design was iterated with product managers to get their views and gather any feedback they had to share.

Here are some snapshots of early iterations of the assessment platform.

Design / HIGH-fidelity

Refining & Iterating

After numerous iterations on section sketches and early designs, we had a good understanding of which elements we were going to place, and how.

After refining the low-fidelity sketches, we moved towards creating high-fidelity prototypes. For visual identity, we used Mercer's branding system for products, and began fleshing out the screens to put together Mercer's Assessment Platform 2.0.

Design / HIGH-fidelity

Bringing assessment platform to life

After creating high-fidelity mockups, we started working on interactive prototypes to mimic the new candidate flow with new visuals.

Originally, this prototype was created in the (now defunct) Adobe XD and later re-created in Figma, with all the features that address the highlighted pain points.

Design / pain-points & Features

Solving pain-points, one feature at a time

With the new designs, we were able to solve numerous critical problems by creating features that address the issues raised in user research.

Onboarding

Assessment Overview
always in sight


Whenever a candidate starts the test, a quick summary of the assessment stays with them, covering the different pieces of information the candidate should know at that point.

We also offer sample assessments that let candidates explore the platform without a timer.

Instructions

More visibility to the Instructions


To keep instructions handy for the candidate, we reduced the text content and emphasised colour illustrations that draw the candidate's attention.

We also added an auto-scrolling carousel to rotate between dynamic instructions.

Instructions

Clear indicators of the journey & The next actions


The flow was restructured into discrete sections, with progress indicators shown as the candidate proceeds.

After the candidate provides the required information in each section, a Call-to-Action (CTA) is highlighted, indicating the next step in the journey.

ASSESSMENT EXPERIENCE

Consolidated instructions,
Connectivity strength


We designed a dedicated space for instructions, which the candidate can trigger to view all available instructions — general or specific — without leaving the assessment flow.
An internet connectivity indicator was also introduced, so candidates can keep a check on their connection strength while taking the assessment.
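As an illustration of how a connectivity indicator like this might classify the connection, here is a minimal sketch using the browser's Network Information API. The `effectiveType` values are the ones the API defines, but the label names, thresholds, and fallback behaviour are our own assumptions, not the platform's actual logic.

```typescript
// Hypothetical sketch: map Network Information API readings to the
// strength label shown in the candidate's header. `online` would come
// from navigator.onLine, `effectiveType` from navigator.connection
// (not supported in every browser, hence the fallback).
type Strength = "strong" | "moderate" | "weak" | "offline";

function connectionStrength(online: boolean, effectiveType?: string): Strength {
  if (!online) return "offline";
  switch (effectiveType) {
    case "4g":      return "strong";
    case "3g":      return "moderate";
    case "2g":
    case "slow-2g": return "weak";
    default:        return "strong"; // API unavailable: assume OK, rely on save errors
  }
}
```

In the browser, the indicator would re-run this on the `online`/`offline` window events and on `navigator.connection`'s `change` event.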

ASSESSMENT NAVIGATION

Navigation redesign and more navigation options


We redesigned the platform's navigation elements without hindering the assessment flow, and introduced multiple navigation options so candidates don't have to rely on one specific method.

This guided us to experiment with certain aspects of the platform and make further advancements as it grows.

assessment navigation

Section overview and Question reporting


We designed a consolidated view of the sections and the candidate's progress inside the assessment, making sections the primary view for candidates.

Candidates can now report any question, giving them more autonomy.

ASSESSMENT SUBMISSION & FEEDBACK

Understand the platform, real-time saving & always-available help


We provided real-time saving of responses so that candidates can work at their convenience.
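As a sketch of how real-time saving might coalesce answers before each save request: the `ResponseStore` class, its `record`/`flush` methods, and the batching policy below are hypothetical, illustrative assumptions rather than the platform's actual implementation.

```typescript
// Hypothetical sketch of a response-saving buffer: each answer change is
// recorded immediately, and pending answers are drained into one batched
// payload per save call, keeping only the latest answer per question.
class ResponseStore {
  private pending = new Map<string, unknown>();

  // Record the candidate's latest answer; overwrites earlier edits.
  record(questionId: string, answer: unknown): void {
    this.pending.set(questionId, answer);
  }

  // Drain pending answers into one batched payload for the save request.
  flush(): { questionId: string; answer: unknown }[] {
    const batch = [...this.pending].map(([questionId, answer]) => ({ questionId, answer }));
    this.pending.clear();
    return batch;
  }
}
```

A timer or network-idle hook would call `flush()` and POST the batch; on failure, the batch can be re-recorded so no response is lost.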

And lastly, help is always appreciated whenever it is needed, so we introduced an always-available chat system during the assessment.

ASSESSMENT SUBMISSION

Question bookmarking & Progress summary


Many candidates seemed to track their assessment progress manually, to keep a check on which questions they had skipped or needed to revisit.

So we created a question-bookmarking feature and an assessment progress summary on the platform.

POST-ASSESSMENT SUBMISSION

Confirmation screen & Feedback


After the assessment, the candidate gets a confirmation and congratulatory message, and if the assessment allows, a report is generated.

A feedback form with a Likert scale was added to fuel future enhancements.

Design / error handling

Designing for roadbumps along the way

The assessment journey is inherently dynamic. Candidates rarely go through an ideal, error-free flow; there is a range of scenarios a candidate might end up in, depending on the stage of the assessment and the information they need.

How might we reduce candidate anxiety during these moments of failure and give them a clear way ahead, with context?

ERROR TYPES

Inline Infobars


Inline infobars are used in any scenario where information needs to stay persistent with the candidate for context.
This ensures the candidate fully understands the type of issue they have encountered and what they need to do to proceed.

ERROR TYPES

Critical: System-oriented


System-oriented issues occur when the system encounters a problem that effectively stops the assessment. These issues can occur at any stage of the flow.
To avoid a dead end, candidates get a clear 'Refresh' CTA that reloads the page, along with a list of potential solutions wherever possible.

The illustrations and copy are light-hearted by design, to reduce user anxiety.

ERROR TYPES

Critical: Candidate-oriented


Candidate-oriented issues occur when the system responds to a candidate's interaction in an unintended way. These can happen in two phases: the pre-assessment check and during the assessment.

Error screens contain context and possible resolutions, highlighted with a CTA.

ERROR TYPES

Toast errors


Toast errors are designed to grab the candidate's immediate attention without disturbing the current interaction. Toasts slide in from the top of the screen and persist for a set duration, so that context is retained and the candidate can take the relevant action.

Chapter 03

Validating new designs with teams

Testing / Usability setting

Internal reviews

To validate whether the new designs solved the pain points, we employed quick usability-testing methods with internal team members to get a holistic overview.


Goal

We wanted to identify usability and design-inconsistency issues in the new designs and generate an actionable list of enhancements before developer handoff.

Evaluation Team
This exercise was performed with internal teams, which included:

  • Product Managers (for business goals)

  • Tech lead (tech review)

  • Design manager (UX issues)

Testing / Usability setting

Data captured & Overall review

After collaborating with available team members for an internal review, we gathered the following data, which told us what we needed to improve.

Some subjective feedback we received:
"clean and modern"
"easy to navigate"
"Clean understanding the flow of this exam"
"…summary is a feature that was much required"
"bookmarking question is a nice touch"

Testing / Final iteration

Assessment Platform 2.0

After validating our designs with internal teams and iterating further, we finalised the designs and proceeded to hand off to the technical teams.

Chapter 04

Reception & Feedback

Launch / feedback

Preparing for product launch

Product managers devised a phase-wise launch plan, which incorporated a soft launch with a small set of clients and limited question support.

Bug squashing
We went on a two-month bug-squashing marathon with the tech teams before the soft launch, clearing out technical debt and making the platform as polished as we could.

Feedback from early users
Candidates who experienced the platform before the official launch shared their feedback; we validated it and made changes for a better experience, such as a walkthrough for questions.
Client requests that came in after experiencing the new assessment platform received similar treatment.

Launch / feedback

Client Testimonials

One of our clients reported that over a million learners across 100+ countries used the new platform for recruitment, learning, and up-skilling, with 87% of the candidates who interacted with the platform rating it 4/5.

With COVID fundamentally altering the hiring process, our clients saw talent recruitment soar to 70% and hiring attrition drop to 30%. And with the platform's newly designed proctoring suite, a client reported a staggering 95% elimination of malpractice during exams.

Launch / feedback

Accessibility Certification

We also received WCAG 2.1 compliance and ADA accessibility certification from external auditors.