
AGENT EVALUATION FORMS

CalabrioONE's evaluation forms get an efficiency makeover that also saves customers money


DETAILS

  • Employer: Calabrio Inc., over 500 employees, in hyper-growth.

  • Project: Redesign the customer interaction evaluation form in CalabrioONE

  • Role: UX design and research, including interaction and visual design

  • Activity: As part of a larger redesign initiative, I collaborated with the product owner, front-end developer, and engineers to improve the usability and efficiency of agent evaluation forms in CalabrioONE.


IMPACT

Hundreds - and sometimes thousands - of evaluations are completed weekly by supervisors to evaluate agent performance in contact centres. Small efficiency enhancements made to the evaluation forms added up to thousands of dollars in annual savings for Calabrio customers. Beyond the dollar savings, the enhancements reduce fatigue and frustration for the thousands of evaluators using CalabrioONE software.


Improvements in efficiency we made include:

  • Replaced multi-click dropdown menus with single-click buttons

  • Replaced confusing scoring scheme with a clear, concise design

  • Replaced disjointed easy-to-overlook commenting method with inline commenting

  • Replaced manual notation of video timestamp with automatic notation

A chart compares a 1-click button response to a 2-click dropdown menu response for an evaluation.
Even one extra click and a small interruption in cognitive processing can, over long repetition, cause user frustration and errors of impatience. Multiplied many times over, these small costs can add up to thousands of dollars annually for a company.


OVERVIEW

Evaluating agents' performance during customer interactions (typically phone calls) is a critical practice at contact centres, and evaluation is typically done using an online form.


With CalabrioONE's Quality Management SaaS web app, customers can create their own evaluation forms, then use powerful data analysis tools to gain strategic insights into workforce management and business performance. This redesign project focuses on the user experience of filling out the evaluation forms - a task that a supervisor or manager at a call centre often performs many times in a single week.


RESEARCH

A chart showing 6 screenshots of parts of the system related to evaluations

1. Map the ecosystem

I mapped the overall evaluation ecosystem to examine data inter-dependencies. Evaluations are not just completed, they are also created, assigned, sorted, and consolidated for analysis within CalabrioONE.

2. Evaluate usability of as-is solution

I did a quick expertise-based usability assessment to evaluate the as-is solution against UX/UI best practice, annotating screen captures with top findings to use as a communication tool in discussions with the team.


3. User journey map + user interviews

A user journey map helped pinpoint problem areas for users doing typical evaluation workflows. A handful of user interviews confirmed the usability issues, but also provided some surprising insights. For instance, evaluators often use the "Populate Default Answers" button, which pre-fills responses with answers the evaluator has pre-selected, purely to save time.

Top finding: Research showed that task efficiency was a top priority for evaluators.

The user journey map also helped clarify how the Call Inspector interface (in which the evaluation form appears, alongside other data related to the call) influences the user's awareness of changes to the evaluation, like recently added comments.


DESIGN FOR EFFICIENCY: THE PAY-OFF


Replace inefficient 'scoring' dropdown menus

Two examples of dropdown menus
Typical dropdown menus in the old evaluation form.

The vast majority of evaluations - which customers build to their own specifications - include 30+ multiple choice responses (5-point scale and Yes/No). In the existing version of the software, dropdown menus automatically appear for all Scale and Yes/No responses.


The redesign replaces all dropdown menus with 1-click buttons.


This might seem like a very small change, but the increase in efficiency gained by changing this one detail is remarkable. Consider:


Switching from dropdown menus to one-click buttons saves up to 2 minutes per evaluation. At a modest 10 evaluations per session, the math works out as follows:


2 minutes per evaluation x 10 evaluations per session = 20 minutes

🔻

20 minutes per session x 65 sessions per year = 1,300 minutes

🔻

1,300 minutes at a wage of $35/hr ≈ $758 per year per evaluator

🔻

$758 x 5 evaluators ≈ $3,792/year

🔻

$3,792 x 5 years = $18,960


Over 5 years, a medium-sized call centre could lose $18,960 to a response control with poor task efficiency.
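The arithmetic above can be captured in a quick back-of-envelope model. This is only an illustrative sketch using the same assumptions stated above (2 minutes saved per evaluation, 10 evaluations per session, 65 sessions per year, a $35/hr wage, and 5 evaluators); the function name and parameters are mine, not part of CalabrioONE:

```python
# Back-of-envelope model of evaluator time savings, using the
# assumptions from the case study above. Figures differ from the
# rounded totals above only in the final cents.
def annual_savings(min_saved_per_eval=2, evals_per_session=10,
                   sessions_per_year=65, hourly_wage=35.0):
    """Dollar value of time saved per evaluator per year."""
    minutes_per_year = min_saved_per_eval * evals_per_session * sessions_per_year
    return minutes_per_year / 60 * hourly_wage

per_evaluator = annual_savings()   # ≈ $758.33 per evaluator per year
team = per_evaluator * 5           # ≈ $3,791.67 for 5 evaluators
five_years = team * 5              # ≈ $18,958 over 5 years (before rounding)

print(round(per_evaluator), round(team), round(five_years))
```

Plugging in a customer's own evaluation volume and wages makes the same case with their numbers.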



Commenting

Commenting was also inefficient to do and to review in the existing evaluation forms. Many managers/supervisors did not even know they could leave comments, because the unlabeled icon-button was tiny, difficult to find, and hard to interpret.

Comment interface, old version.

A screenshot taken from my usability review/assessment of the existing Evaluation Form shows:


On the form (see A, left):

  • The "Add Comments" button is small, hard to find, easy to misinterpret as "Edit form".

  • Responses to most questions require the user to select from a dropdown menu.

In the "Comments" pop-up box (see B, left):

  • It isn't clear that the commenter must choose from 3 "types" of comment: general form, section specific, question specific.

  • To connect a comment to a precise point in the audio recording, the commenter must manually enter a time in the body of the comment.

  • A character countdown to 3,000 characters is included for some reason. (No customers I talked to required such lengthy comments).


RESULT

An audio energy bar on the left has a location indicator that matches up with a small letter "C" (for comment), and also matches a red highlight on a text comment in the adjacent Evaluation panel.
In the new evaluation forms it is much easier to see how a comment ("A") relates to a segment of recorded audio ("B"): When the user hovers over the small C icon, a tool tip appears. Selecting the icon causes the playback indicator to jump to that position, and the related comment appears in the evaluation panel, highlighted in red.


The new evaluation forms feature many small usability and efficiency improvements.

  • Helpful context. The Evaluation form now resides inside the "Call Inspector" interface by default. Panels in the inspector provide useful information about all aspects of the call - helpful to evaluators. A red highlight (see A, above) synchronizes evaluation comments to their timestamp position within the call (see B, above).

  • Options for display. The Evaluation panel can be optionally "torn off" for wider format view in a separate browser window.

  • More efficient way to select a response. Likert Scale and Y/N responses are now one-click; no more dropdowns.


Enhanced commenting

  • Comments are now in-line. No need for popup boxes to make or view a comment; comments are added and viewed in the form itself, directly below the related question.

  • Automatic timestamp. The commenter can optionally include a timestamp, automatically generated from the playback indicator's position in the audio file.

  • Greater visibility. Every comment with a related audio timestamp appears in the Audio panel as a tiny letter "C". Hovering over the C generates a tooltip with comment details. This is useful if the Evaluation panel is closed, or displayed in a separate browser window.

  • Less clutter. A text field limit remains (technical requirement), but the unnecessary character countdown has been removed. Instead, an alert appears if more than 2,000 characters have been entered (a rare event).
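The automatic timestamp in the list above could be derived roughly like this. This is an illustrative sketch only; the function name and display format are my assumptions, not CalabrioONE's actual implementation:

```python
# Illustrative sketch only: converts a playback position in seconds
# (e.g. the audio indicator's current location) into the mm:ss or
# h:mm:ss text an automatically generated comment timestamp might show.
# The function name and format are assumptions, not CalabrioONE's API.
def format_timestamp(position_seconds: float) -> str:
    total = int(position_seconds)
    hours, remainder = divmod(total, 3600)
    minutes, seconds = divmod(remainder, 60)
    if hours:
        return f"{hours}:{minutes:02d}:{seconds:02d}"
    return f"{minutes:02d}:{seconds:02d}"

print(format_timestamp(75))     # → 01:15
print(format_timestamp(3725))   # → 1:02:05
```

Because the value is generated from the player state rather than typed by hand, it can never drift out of sync with the audio the way a manually noted time could.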
