Redefining the assessment
experience for the masses

Product Redesign · User Research · Testing
My Role
Responsible for research, conceptualisation, design, user testing & delivery of key modules and feature areas.
The Team
2 designers, 3 product managers & 10+ engineers
Timeline
May 2020 to March 2022
Introducing

Mercer Assessment Platform

The Product

The Mercer Assessment Platform is an online examination system for conducting virtual exams and assessments. It offers a seamless, end-to-end assessment experience by integrating various evaluation methods, such as cognitive tests, behavioral assessments, simulations, and video interviews, to name a few.

It is the core product in the ever-expanding Mercer Assessment Suite.

PRODUCT VISION

The Assessment Platform is the single most crucial product in Mercer's library, as it is where all the use cases of MAS converge.
Given its criticality from both business and user perspectives, it is a no-brainer that this product needs to offer the best experience possible: minimal friction in the journey, contextual assists along the way, and a unique visual identity that stands apart from competitors.

This became the core vision which guided us when we undertook this project.
Every decision we made throughout the project was weighed against this vision.

WHY IS IT NEEDED

The entire assessment suite was to be re-created to achieve essential product goals and enable the assessments of tomorrow to be conducted in a more cohesive and productive way.
Because of the sub-optimal experience it offered and its business criticality, the Assessment Platform became the first product in the suite to be redesigned for the next generation.

Starting point

We started with an expert evaluation to build a list of usability issues, identify the sections we needed to focus on, and learn how we could use that knowledge to optimise the platform and make it novice-friendly and accessible to all.

feedback from users

Through our preliminary research, heuristic evaluation and cognitive walkthroughs, we discovered that the assessment platform was long overdue for an overhaul, not just in visual design but from an assessment-flow perspective as well.
It was a rudimentary, disjointed and overly long process, with numerous sections to go through, non-contextual instructions, no progress summary, and heavy reliance on external software to start an assessment.

WHERE WE CONCLUDED

Early 2022, after two years of consistent effort, we shipped a brand new experience for the Mercer Assessment Platform, which is now used by more than 400 universities, colleges and schools across 100+ countries.
This involved a total rework of the assessment flow, candidate registration, proctoring and accessibility tools, while addressing the unique challenges identified through our research.

But let's keep digging and understand what changes were made to turn this assessment platform into a platform of tomorrow.

Chapter 01

Understanding gaps & frustration

user research

Expert Opinion

When the project was in the discussion phase, we wanted to understand the current usability issues, so we started with heuristic evaluations.

user research / HEURISTIC EVALUATION

Goals, flow, framework and the Team

We started by defining the scope of the evaluation, the user flow to be evaluated and the team doing the evaluation.

Goal
We wanted to identify where the problems lay in terms of usability, efficiency and consistency, and generate an actionable list of items for enhancement.


User flow
We wanted to evaluate the end-to-end assessment flow, which included the following tasks and their sub-tasks:

  • Pre-assessment checks and permission gathering

  • Registration

  • Assessment onboarding

  • Assessment interaction

  • Post-assessment verification


Heuristic Framework
We chose Jakob Nielsen's heuristics framework as it is an adaptable, universally accepted method that can be applied to any interface regardless of product type.

Nielsen's 10 heuristic principles explained with examples


Evaluation Team
This exercise was performed by a team of five usability experts, which included:

  • Product Managers

  • Tech lead

  • Designer

user research / HEURISTIC EVALUATION

Process & Result

Each expert went through the flow and marked their findings against the selected heuristics.

  • Device: Windows Laptop

  • Browser: Google Chrome

  • Resolution: 1920x1080

The criticality of each finding was defined through discussion with product managers.

user research / HEURISTIC EVALUATION

Evaluation rating

Through this exercise, we got a clear picture of the platform as it stood, and the experts also shared their feedback on improvements we could include.

user research / Analysis

Validating heuristics with analytics

We tried to verify the heuristic findings against analytics data but could not, as we discovered that no analytics tool was installed on the platform. We did, however, receive a lengthy file containing subjective feedback data from the previous two years.

So we tried to quantify the available data to see if there were any recurring patterns that aligned with the heuristic evaluation.
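To make this concrete, here is a minimal TypeScript sketch of how such feedback can be quantified: tag each comment against theme keywords and tally the matches. The theme names, keywords and example comments are purely illustrative, not the actual data or tooling we used.

```ts
// Tally how often each theme appears across free-text feedback comments.
type ThemeTally = Record<string, number>;

// Hypothetical theme-to-keyword mapping used for this sketch.
const themeKeywords: Record<string, string[]> = {
  navigation: ["navigate", "back", "lost"],
  instructions: ["instruction", "unclear", "confusing"],
  performance: ["slow", "lag", "loading"],
};

function tallyThemes(comments: string[]): ThemeTally {
  const tally: ThemeTally = {};
  for (const comment of comments) {
    const text = comment.toLowerCase();
    for (const [theme, keywords] of Object.entries(themeKeywords)) {
      // Count each theme at most once per comment.
      if (keywords.some((keyword) => text.includes(keyword))) {
        tally[theme] = (tally[theme] ?? 0) + 1;
      }
    }
  }
  return tally;
}

// Example: logs { performance: 2, navigation: 1 }.
console.log(
  tallyThemes([
    "The page was slow to load",
    "I got lost trying to go back to a question",
    "The timer lagged during the test",
  ]),
);
```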

user research / Analysis

Feedback summary

user research / Analysis

Result

These were the common themes we were able to quantify from the file; some were positive and some were opportunities we should address in the new project.

user research / Cognitive walkthrough

Stepping into the shoes of first-time users

So far, we had evaluated the platform from the perspective of experts, people who already had exposure to the platform.
Next, we stepped into the shoes of someone coming to the platform as a first-time candidate, with no exposure to online assessments, to see what issues they faced and what new insights we could gain from the exercise.

User flow
We re-utilised the user flow defined for the heuristic evaluation, as it was complete and covered all the areas we wanted to focus on.

Evaluator
We asked a product manager from a different team to role-play as our first-time candidate and give us her insights.

Evaluation Criteria
Each step was analyzed using these four key questions:

  • Will the candidate know what to do at this step?

  • Will the candidate see the correct UI element to take action?

  • Will the candidate associate the action with their goal?

  • Will the candidate get feedback that they performed the correct action?

user research / Cognitive walkthrough

Result

After the walkthrough, we condensed the feedback into specific sections rated by severity, and recorded the evaluator's comments for each step.

user research / interviews

Finding qualitative insights

We conducted user interviews to evaluate the usability of the assessment platform. The goal was to validate the heuristic evaluation findings, identify user pain points and gather insights for further improvements.

User flow
Continuing from the cognitive walkthrough, we re-utilised the user flow defined for the heuristic evaluation, as it was complete and covered all the areas we wanted to focus on.

user INTERVIEWS / Personas

Candidate / Test-taker / Participant

Our users go by many names, but they are the people who come to the assessment platform to take different types of tests, including competitive entrance exams, university exams, job assessments, and talent development surveys.

Age range: 19 to over 60 years old

user research / interviews

Interview details

Participants Overview

  • Novice - People with no experience with online platforms.

  • Intermediate - People who have used an online assessment platform at least once in the last six months.

  • Expert - People who are well-versed in using any online assessment platforms or coding arenas.

Interview format:
1-on-1 virtual interviews

Interview duration:
45-50 minutes per session

Interview Questions:
We defined questions for each phase of the assessment flow with the intent of learning as much as we could from each candidate.
You can view the questionnaire via the button below.

user research / user journey

Journey of the candidate

Based on the current platform's user journey flows, we gathered qualitative data describing what users felt at each step of the process.

user research / Interviews

Data captured & Overall ratings

user research / pain-points

With each interview, we came closer to understanding the frustrations and pain points that restrict users from reaching their goals.

Here are the pain points and recommendations, which gave us plenty of opportunities to fold into the platform design.

user research / Interviews

Key findings & Next steps

Users coming to the assessment platform have these goals to complete:

Common Themes in Feedback

  • The interface is clear but feels outdated.

  • Navigation could be more intuitive.

  • Better indicators for unanswered & revisited questions needed.

  • Assessment-ending flow should be improved with clear confirmations.

user research / conclusion

👍🏽 Strengths of the Platform

The platform excels in several areas, providing a strong foundation for an intuitive user experience:

Minimalist & Aesthetic Design

A clean and well-structured interface prevents cognitive overload and enhances readability.

Error Prevention & Recovery

The system effectively prevents errors and provides clear, user-friendly messages to help users recover when issues arise.

Progress Visibility

Features like timers and progress bars help users manage their time efficiently.

user research / conclusion

👎🏽 Challenges & Areas for Improvement

Despite its strengths, the platform has several usability issues that need to be addressed:

Lack of Clear System Feedback

Candidates struggle to understand system status due to insufficient real-time updates on progress, data checks, and response saving.

Inconsistent Navigation & UI Standards

Button placements and interaction flows vary across sections, making navigation unintuitive.

Registration & Personal Data Concerns

Candidates feel uncomfortable providing personal information without understanding why it is required.

Instruction Overload & Readability Issues

Long and dense instructions overwhelm candidates; breaking them into shorter, more visually engaging chunks will improve comprehension.

Limited User Control & Flexibility

Candidates lack clear exit options when performing unintended actions, making the process rigid and frustrating.

Poor Visibility of Remaining Questions

Candidates are unable to track how many questions are left, increasing anxiety.

Inefficient Submission Process

The final submission page lacks clarity on next steps and confirmation messages, leading to uncertainty.

Mobile Usability & Performance Issues

The platform struggles with performance on smaller screens, including slow load times and poor touch interaction design.

Color Contrast & Readability Concerns

Low contrast between text and backgrounds reduces readability, especially for key action items like buttons and instructions.

user research / conclusion

Concluding research

The studies we conducted (heuristic evaluation, cognitive walkthrough, empathy mapping and user interviews) surfaced both strengths and critical improvement areas to address in our redesign.

While the minimal design and pre-filled registration provided a solid foundation on which to build, usability challenges like inconsistent navigation, unclear system feedback and mobile performance concerns hindered an optimal candidate experience.

These gave us a good starting point for figuring out how to solve the issues raised and proceed.

Chapter 02

Placing components & Designing flow

Design / brainstorming

Pain points & possible solutions

Based on the insights we discovered, we assigned a priority to each pain point, and the design team paired up with product managers in a brainstorming activity to figure out possible solutions for each one.

Each pain point was color-coded with the priority marked by product managers, so that we could focus on solving a specific set of problems at a time.

Design / User flow

Updated candidate flow

We started by ideating a new journey with updated artifacts that would help us solve each pain point and create a more seamless experience for candidates.

We constrained ourselves to keep the assessment flow as close to the existing one as possible, changing it only where necessary. We then iterated multiple times with the help of product managers to make the flow intuitive and concise, yet contextual.

Design / INFORMATION ARCHITECTURE

Updated Information Architecture

We also came up with a robust information architecture (IA) to understand which content and information we would be placing, depending on the section the candidate is in.

This final IA, showcasing the data and action items, was developed after multiple iterations during the sketching and discussion phases.

Design / Low-Fidelity

Sketches & Ideas

With a list of features to include, the updated user flow and the information architecture in hand, we moved on to sketching possible designs for the core screens.

We started with paper sketches, iterating through guerrilla testing to ensure we were moving in the right direction.

Design / Low-Fidelity

Exploring ideas

We explored different ideas and directions, some based on the sketches and some on pure whim, just to get a picture of what would work and what wouldn't.
Each design was reviewed with product managers to capture their views and any feedback they had to share.

Here are some snapshots of early iterations of the assessment platform.

Design / HIGH-fidelity

Refining & Iterating

After numerous iterations on each section's sketches and early designs, we had a good understanding of which elements to place and how.

After refining the low-fidelity sketches, we moved on to creating high-fidelity prototypes. For the visual identity, we used Mercer's product branding system, fleshing out the screens and putting together Mercer Assessment Platform 2.0.

Design / HIGH-fidelity

Bringing assessment platform to life

After creating high-fidelity mockups, we started working on interactive prototypes to mimic the new candidate flow with the new visuals.

This prototype was originally created in the (now defunct) Adobe XD and later re-created in Figma, with all the features that addressed the highlighted pain points.

Design / pain-points & Features

Solving pain-points, one feature at a time

With the new designs, we were able to solve numerous critical problems by creating features that address the issues raised in user research.

Onboarding

Assessment Overview
always in sight

Whenever a candidate starts the test, a quick summary of the assessment stays with them, covering the information the candidate should know at that point.

We also offer sample assessments that allow candidates to explore the platform without a timer.

Instructions

More visibility to the Instructions

To keep instructions handy for the candidate, we reduced the text content and emphasised colorful illustrations that draw the candidate's attention.

We also added an auto-scrolling carousel that rotates between dynamic instructions.
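As an illustration of the auto-scrolling behavior, here is a minimal TypeScript sketch, assuming a horizontally scrolling container with one child element per instruction card; the selector, timing and markup are hypothetical, not the shipped implementation.

```ts
// Rotate instruction cards on a fixed interval. Returns a stop function so
// rotation can be halted, e.g. when the candidate wants to read at their pace.
function startInstructionCarousel(
  container: HTMLElement,
  intervalMs = 5000,
): () => void {
  const cards = Array.from(container.children) as HTMLElement[];
  let index = 0;

  const timer = setInterval(() => {
    index = (index + 1) % cards.length;
    // Smoothly bring the next instruction card into view.
    cards[index].scrollIntoView({ behavior: "smooth", inline: "start", block: "nearest" });
  }, intervalMs);

  return () => clearInterval(timer);
}

// Example usage with a hypothetical .instruction-carousel container.
const carousel = document.querySelector<HTMLElement>(".instruction-carousel");
if (carousel) {
  const stop = startInstructionCarousel(carousel);
  // Stop auto-rotation once the candidate hovers to read at their own pace.
  carousel.addEventListener("mouseenter", stop, { once: true });
}
```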

Instructions

Clear indicators of the journey & The next actions

The flow was restructured into discrete sections, with progress indicators shown as the candidate proceeds.

After the candidate provides the required information in each section, a call-to-action (CTA) is highlighted, indicating the next step in the journey.

ASSESSMENT EXPERIENCE

Consolidated instructions,
Connectivity strength

We designed a dedicated space for instructions, which the candidate can trigger to view all available instructions, general or specific, without leaving the assessment flow.
An internet connectivity indicator was also introduced, so that candidates can keep a check on their connection strength while taking the assessment.
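Here is a minimal TypeScript sketch of how such a connectivity indicator can be driven by the browser's standard online/offline events; the badge element and CSS class names are illustrative assumptions. Note that these events only report whether a network connection exists; gauging actual strength would need something like periodic latency checks.

```ts
type ConnectionState = "online" | "offline";

// Report the initial state, then notify on every browser connectivity change.
function watchConnectivity(onChange: (state: ConnectionState) => void): void {
  onChange(navigator.onLine ? "online" : "offline");
  window.addEventListener("online", () => onChange("online"));
  window.addEventListener("offline", () => onChange("offline"));
}

// Example: drive a hypothetical status badge in the assessment header.
watchConnectivity((state) => {
  const badge = document.getElementById("connection-badge");
  if (badge) {
    badge.textContent = state === "online" ? "Connected" : "No connection";
    badge.className = `badge badge--${state}`; // styled via CSS
  }
});
```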

ASSESSMENT NAVIGATION

Navigation redesign and more navigation options

We redesigned the platform's navigation elements without hindering the assessment flow, and introduced multiple navigation options so that candidates don't have to rely on one specific method.

This guided our idea of experimenting with certain aspects of the platform and making further advancements as the platform grows.

assessment navigation

Section overview and Question reporting

We designed a consolidated view of the sections and the candidate's progress within the assessment, making sections the primary view for candidates.

Candidates also have the option to report any question, giving them more autonomy.

ASSESSMENT SUBMISSION & FEEDBACK

Understand the platform, real-time saving and help always available

We provided real-time saving of responses, so that candidates can work at their convenience.

And since help is always appreciated whenever it is needed, we introduced an always-available chat system during the assessment.
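A minimal TypeScript sketch of how the real-time saving described above might work: responses are debounced, so only the latest state is sent shortly after the candidate stops interacting. The endpoint and payload shape are hypothetical.

```ts
interface AnswerPayload {
  questionId: string;
  response: string;
}

// Returns a save function that debounces network writes: rapid edits reset
// the timer, so only the final state within the delay window is persisted.
function createAutosaver(endpoint: string, delayMs = 800) {
  let timer: ReturnType<typeof setTimeout> | undefined;

  return (payload: AnswerPayload): void => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => {
      void fetch(endpoint, {
        method: "PUT",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload),
        keepalive: true, // lets the request finish even if the tab closes
      });
    }, delayMs);
  };
}

// Example: call on every answer change; only the last edit hits the network.
const saveAnswer = createAutosaver("/api/assessment/answers");
saveAnswer({ questionId: "q-12", response: "Option B" });
```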

ASSESSMENT SUBMISSION

Question bookmarking & Progress summary

Many candidates seem to track their assessment progress manually, keeping a check on which questions they have skipped or need to revisit.

So we created a question bookmarking feature and an assessment progress summary.
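As an illustration, a minimal TypeScript sketch of the per-question state that could power both bookmarking and the progress summary; the status names and example data are hypothetical.

```ts
type QuestionStatus = "unseen" | "answered" | "skipped";

interface QuestionState {
  status: QuestionStatus;
  bookmarked: boolean;
}

// Roll per-question state up into the counts shown in the summary panel.
function summarize(states: Map<string, QuestionState>) {
  const summary = { answered: 0, skipped: 0, unseen: 0, bookmarked: 0 };
  for (const state of states.values()) {
    summary[state.status] += 1;
    if (state.bookmarked) summary.bookmarked += 1;
  }
  return summary;
}

// Example: logs { answered: 1, skipped: 1, unseen: 1, bookmarked: 1 }.
const states = new Map<string, QuestionState>([
  ["q1", { status: "answered", bookmarked: false }],
  ["q2", { status: "skipped", bookmarked: true }],
  ["q3", { status: "unseen", bookmarked: false }],
]);
console.log(summarize(states));
```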

POST-ASSESSMENT SUBMISSION

Confirmation screen & Feedback

After the assessment, the candidate gets a confirmation and congratulatory message, and if the assessment allows it, a report is generated.

A feedback form with a Likert scale was added to fuel future enhancements.

Design / error handling

Designing for road bumps along the way

The assessment journey is dynamic by nature. Rarely does a candidate follow an ideal, error-free flow; there is a range of scenarios a candidate might end up in, depending on the stage of the assessment and the information they need.

How might we reduce candidates' anxiety during these moments of failure and give them a clear way forward with context?

ERROR TYPES

Inline Infobars

Inline infobars are used in any scenario where information needs to persist alongside the candidate for context.
This ensures the candidate fully understands the type of issue they have encountered and what they need to do to proceed.

ERROR TYPES

Critical: System-oriented

System-oriented issues occur when the system encounters a problem that effectively stops the assessment. These issues can occur at any stage of the assessment flow.
To avoid a dead end, candidates get a clear 'Refresh' CTA that reloads the page, along with a list of potential solutions wherever possible.

The illustrations and copy are light-hearted by design, to reduce user anxiety.

ERROR TYPES

Critical: Candidate-oriented

Candidate-oriented issues occur when the system responds to a candidate's interaction in an unintended way. These can happen in two phases: the pre-assessment check and during the assessment.

Error screens contain context and possible resolutions, highlighted with a CTA.

ERROR TYPES

Toast errors

Toast errors are designed to grab the candidate's immediate attention without disturbing the current interaction. Toasts slide in from the top of the screen and persist for a set duration, so that context is retained and the candidate can take the appropriate action.
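A minimal sketch of this toast pattern in TypeScript: the toast is announced to assistive technology, stays on screen for a fixed duration, then removes itself without interrupting the candidate. Class names, timing and the message are illustrative assumptions.

```ts
function showToast(message: string, durationMs = 5000): void {
  const toast = document.createElement("div");
  toast.className = "toast toast--error"; // hypothetical styling hooks
  toast.setAttribute("role", "alert"); // announced by screen readers
  toast.textContent = message;
  document.body.appendChild(toast);

  // Persist long enough to read, then dismiss without stealing focus.
  setTimeout(() => toast.remove(), durationMs);
}

// Example: a non-blocking warning shown during the assessment.
showToast("Your response could not be saved. Retrying now.");
```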

Chapter 03

Validating new designs with teams

Testing / Usability setting

Internal reviews

To validate whether the new designs actually solved the pain points, we employed quick usability testing methods with internal team members to get a holistic overview.


Goal

We wanted to identify usability and design-consistency issues in the new designs and generate an actionable list of items for enhancement before developer handoff.


User flow
We wanted to evaluate the end-to-end assessment flow, which included the following tasks and their sub-tasks:

  • Pre-assessment checks and permission gathering

  • Registration

  • Assessment onboarding

  • Assessment interaction

  • Post-assessment verification

Evaluation Team
This exercise was performed with internal teams, which included:

  • Product Managers (for business goals)

  • Tech lead (tech review)

  • Design manager (UX issues)


Questions to ask

The entire prototype was evaluated based on the following questions:

  • Does the platform look visually refreshing?

  • Does the candidate have appropriate context about the assessment?

  • Can candidate start the assessment and follow the process effortlessly?

  • Is the next step clear on every screen?

  • Are the actions intuitive?

  • Are buttons, fonts, and colors consistent?

  • Is spacing, alignment, and contrast visually balanced?

  • Does the system respond correctly with error messages & feedback states?

  • Can a candidate answer the questions easily?

  • Does the candidate get correct feedback for the actions they perform?

  • Does the assessment summary page make it easy to revisit flagged questions?

  • Does the assessment ending screen clearly confirm completion?

Testing / Usability setting

Data captured & Overall review

After collaborating with the available team members, we gathered the following data, which told us what we needed to improve.

Some subjective feedback we received:
"clean and modern"
"easy to navigate"
"Clean understanding the flow of this exam"
"…summary is a feature that was much required"
"bookmarking question is a nice touch"

Testing / Final iteration

Assessment Platform 2.0

After validating our designs with internal teams and several further design iterations, we finalised the designs and proceeded to hand off to the technical teams.

Chapter 04

Reception & Feedback

Launch / feedback

Preparing for product launch

Product managers devised a phase-wise launch plan, which incorporated a soft launch with a limited set of clients and limited question support.

Bug squashing
We went on a two-month bug-squashing marathon with the tech teams before the soft launch, clearing the platform of technical debt and making it as polished as we could.

Feedback from early users
Candidates who had the pleasure of experiencing the platform before the official launch shared their feedback; we validated it and made changes for a better experience, like a walkthrough for questions.
Client requests that came in after clients experienced the new assessment platform received the same treatment.

Launch / feedback

Client Testimonials

One of our clients reported that over a million learners across 100+ countries used the new platform for recruitment, learning and up-skilling, with 87% of the candidates who interacted with the platform rating it 4/5.

With COVID fundamentally altering the hiring process, clients saw talent recruitment soar to 70% and hiring attrition drop to 30%, and with the newly designed proctoring suite for the platform, a client reported a staggering 95% elimination of malpractice during exams.

Launch / feedback

Accessibility Certification

We also received WCAG 2.1 compliance and ADA accessibility certification from external auditors.