Kristiana Druseiko

Transforming legacy fintech software into a modern SaaS solution

Before

After

Modernised UI

Improved layouts

Overhauled error handling

My role

UX Designer, responsible for:

  • Stakeholder alignment
  • Existing product audit
  • User research
  • Data synthesis
  • Concept wireframes
  • Improving feedback & error handling
  • Prototyping and user testing
  • Creating and managing a design system
  • Developer handoff

Project team

  • Design Team (2 UX Designers)
  • Business Analyst
  • Development Team (2 Front-End and 4 Back-End Developers)
  • Quality Assurance Team (4 QA Engineers)
  • Project Manager

Duration

June 2021 – July 2024

Overview

Redress Manager is an internal tool used by financial institutions to manage customer compensation (redress) cases, typically in response to complaints, regulatory breaches, or service errors.

Problem

The software’s interface design had remained largely unchanged since its initial release in 1995. While it continued to serve its purpose, the technology had become outdated, limiting opportunities to improve its functionality, usability and accessibility.

The company set out to modernise the product by turning it into a SaaS solution.

Objectives

My primary objective was to modernise the product’s UI, making it visually appealing, responsive, and scalable.

I also used this opportunity to uncover and address existing usability issues through a product audit and user research, ensuring the redesign improved user and business outcomes.

Impact

Improved interface usability

Usability testing confirmed that the redesign reduced cognitive load in core workflows, improving learnability and task efficiency.

Overhauled error handling

I identified and addressed a major source of support tickets and user frustration. This was expected to reduce pressure on internal training and support teams after launch.

Built a scalable design system

I helped establish a scalable, token-driven design system that enabled faster delivery and more efficient collaboration between designers and developers.

The UX process

Discovery

Stakeholder alignment

I spoke with key stakeholders to understand the goals for the product transformation and align the UX approach with broader business objectives.

We set the following goals:

Modernise the interface and overall experience to meet current usability standards and user expectations.

Ensure the new product is scalable and adaptable for delivery as a SaaS solution.

Ease pressure on the internal support team by improving usability and streamlining key workflows.

Keep changes to existing functionality to a minimum to avoid disrupting current users and to reduce rework costs.

Existing product audit

I conducted a detailed review of the legacy Redress Manager interface to understand its structure, workflows, and usability gaps.

Key activities included:

  • Analysing content hierarchy, terminology, and information grouping
  • Mapping key user flows
  • Evaluating the interface against usability heuristics
  • Reviewing feedback and error handling patterns

Annotated screenshot example

User research activities

My research focused on:

  • Identifying usability issues experienced by case workers
  • Validating and challenging internal assumptions and findings from the existing product audit

I set out to answer the following question:

What usability issues do case workers experience, and where are the biggest opportunities for improvement?

Methods I used

Direct access to external users was limited, so I used the research methods below to ensure design decisions were grounded in real user needs.

🎙️ Internal user interviews
To gain proxy insights, I interviewed colleagues from internal teams who work with external users:
  • Support staff to understand where users struggle
  • Trainers to uncover challenges faced by new users
  • Internal employees who use Redress Manager as part of their work
👥 Shadowing a training session

I had the opportunity to observe an on-site training session with a group of external users, which helped me identify challenges new users face and spot usability issues in action.

💬 Support ticket review
I analysed anonymised support tickets handled by our support team to identify common user pain points and see how often they occurred.
📊 In-product survey
I proposed adding a short, optional feedback pop-up in the existing software to quickly gather real user feedback from a larger audience. I collected 67 responses over the research period.

Key findings & insights:

🧠 Overwhelming interface increased cognitive load

Users found the system visually dense and cluttered, making it appear more complex than it was and difficult to learn. Survey respondents often requested a cleaner, more modern interface.

🎓 High dependency on trainers and support

All trainees regularly got stuck during the onboarding session and required assistance to progress, even for repeat tasks.

⚠️ Poor error handling was the main usability issue

Errors and warnings appeared in pop-ups that disappeared once closed, and rarely explained what went wrong or how to fix it. Users often struggled to correct mistakes and believed the system had produced incorrect results.

Key insight: 42% of reviewed support tickets referenced error-related issues

🔘 Unlabelled and inconsistent controls caused confusion

Many buttons were unlabelled, reused identical icons for different actions, or behaved inconsistently across screens. Users relied on memorisation, the user manual, or asking for help, and trainees frequently misclicked due to unexpected UI behaviour.

Affinity mapping the insights

I grouped all the uncovered usability issues into themes using affinity mapping to identify recurring patterns, prioritise user pain points, and help inform design decisions.

Ideation

Wireframing new screen layouts

The legacy interface was visually dense and overwhelming for users. Using insights from the research, I redesigned layouts to make information easier to scan and reduce cognitive load.

Improving feedback & error handling

My key improvements included:

  • Introducing a clear message hierarchy: normal and critical dialogs, errors, warnings, success.
  • Rewriting system messages to make them clearer, more concise, and actionable.
  • Replacing warning and error pop-ups with inline field messages supported by a summary.
  • Suggesting adding dynamic validation so errors and warnings appear in real time.
  • Adding tab-level error indicators to show where issues are located across multi-step workflows.

New message hierarchy

Before

All system feedback looked identical, so users couldn’t distinguish between warnings, errors, routine actions and critical steps.

After

Each message type has a clear, consistent visual style, reducing confusion and preventing mistakes.

Inline warnings and errors

Before

Warnings and errors appeared only in pop-ups and disappeared once closed.

After

Warnings and errors now appear next to the relevant fields, supported by a persistent summary.

Tab error indicators

Before

Users didn’t know which calculation tabs contained warnings or errors and had to check each one manually.

After

Tabs now display error/warning badges, guiding users directly to the issue.

Delivery

Prototyping & user testing

I added interactivity to the wireframes in Figma and prepared a set of test scenarios. Participants were shown both existing and redesigned interfaces and asked to complete a set of tasks.

The goal was to validate that the redesigned UI improved:
  • Findability: locating information and completing tasks more efficiently
  • Error recovery: identifying, understanding, and fixing errors more easily

Who I tested with

Access to external users wasn’t possible at the time, so I ran the tests with internal employees to gather proxy insights:

  • 6 users new to the system to approximate the new user and onboarding experience
  • 3 experienced users to confirm that improvements don’t slow down or frustrate expert users

How I tested

To reduce order bias, I counterbalanced the test by showing half of the participants the existing design first and the redesigned version second, while the other half saw them in the opposite order.

🔍 Findability and first-click accuracy

Users were shown multiple screens and asked to locate features and complete tasks (e.g. “Where would you save your work?”).

I measured:

  • Time to locate features and complete tasks
  • First-click accuracy, noting where users clicked if incorrect
  • Confidence level, noting any hesitation or verbal uncertainty

⚠️ Error recovery

Users were shown two screens containing errors and asked:
“This page contains errors. Can you locate them, and how would you fix them?”

I measured:

  • Time to locate all errors
  • Whether users could explain how to fix errors, assessing clarity of the messages

Key findings & insights:

🎓 Improved learnability for new users
  • Most new users couldn’t locate errors under period tabs or large tables on the existing interface.
  • Unlabelled page actions also made it hard for them to complete tasks.
  • The redesign enabled new users to independently find and resolve all errors, and to complete tasks more efficiently and confidently.

👤 No disruption to experienced users
  • Experienced users completed tasks and located and fixed errors slightly faster on the redesigned UI than on the old one, showing that the new interface didn’t disrupt their workflow.

Creating a design system

To ensure consistency, scalability, and a modern look for the new SaaS platform, I helped create a design system that covered foundations, components, patterns, and design tokens.

The system was documented in Figma, providing reusable elements and clear guidelines for designers and developers.

Foundations

First, I defined the visual principles that form the system’s backbone, including typography, colours, spacing, shadows, icons, and layout grids.

Components and patterns

I created reusable components and patterns in Figma and prepared detailed documentation to support the developer team during implementation.

Design tokens

I collaborated with the development team to implement design tokens that made the system scalable, consistent, and easier to maintain across future releases.

We used a three-tier token structure (primitive, semantic, and component-mapped) and added a fourth “responsive” tier to group values that would change across breakpoints (mobile, tablet, desktop) in future releases.
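To illustrate how such a tiered structure works, here is a minimal sketch in TypeScript. The names and values are hypothetical examples, not the project’s actual tokens:

```typescript
// Illustrative three-tier token structure. All names and values here are
// hypothetical, not the project's actual tokens.

// Tier 1: primitives — raw values with no meaning attached
const primitive = {
  blue600: "#1a56db",
  red600: "#dc2626",
  space4: "16px",
};

// Tier 2: semantic — primitives given a purpose
const semantic = {
  colorActionPrimary: primitive.blue600,
  colorFeedbackError: primitive.red600,
  spacingInputPadding: primitive.space4,
};

// Tier 3: component-mapped — semantic tokens bound to specific components
const buttonPrimary = {
  background: semantic.colorActionPrimary,
  paddingX: semantic.spacingInputPadding,
};

// Updating primitive.blue600 and regenerating the theme cascades the change
// through the semantic layer to every component that references it.
console.log(buttonPrimary.background); // "#1a56db"
```

The benefit of this layering is that a rebrand or theme change only touches the primitive layer, while a change in a component’s purpose (say, a different error colour) only touches the semantic mapping.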

Applying Figma variables

I used Figma variables to structure colour, typography, border and spacing tokens and apply them to components. This meant that any future visual updates could be made globally with a single change.

Developer handoff

Screens were delivered in phases that matched our development milestones. For each phase, I presented designs and documentation to the development team. We discussed technical constraints, I answered their questions, and we resolved edge cases.
Screens and components were iterated as needed until stakeholders approved them, ensuring the designs were ready for implementation.

Project outcomes

The redesign wasn’t shipped before I left the company, so I didn’t have access to post-launch metrics. However, research showed a clear opportunity for impact.

💬 Expected impact: reducing pressure on the internal support team

Error-related issues accounted for 42% of reviewed support tickets. Interviews and onboarding observations showed that users struggled to understand, locate and resolve errors.

The redesigned error handling addressed the root causes identified in user research. Usability testing on prototypes showed that new users could independently find and resolve all errors. This was expected to reduce confusion and lower the need for trainer and support team assistance.

If I were involved after the launch, I would measure success by a reduction in error-related support tickets and fewer trainer interventions during onboarding sessions.

🧠 Expected impact: improved task efficiency and learnability

User research showed that visual clutter and inconsistent, unlabelled controls made the system harder to learn and navigate, increasing reliance on training and support.

I redesigned the screen layouts to reduce clutter, clarify structure, and make actions easier to find and understand. Usability testing confirmed that users could locate information and complete tasks more efficiently, and that the changes didn’t disrupt experienced users.

If I were involved after the launch, I would continue validating the redesign by testing complete end-to-end workflows in the fully functional product. This would provide more realistic efficiency and learnability measurements.

🛠️ Impact: built a scalable design system

I helped establish a design system to modernise the UI, ensure consistency, and support future scalability. It included foundations, design tokens, reusable components and patterns, with detailed documentation to streamline developer handoff and accelerate delivery.