Noyo

Moving a third-party tool to our own platform—and improving it

TL;DR

Noyo's operations team receives files from benefits-administration platforms and manually reviews them for enrollment updates and potential issues. The team was struggling with these reviews because the existing tool organized updates by "type of update", which obscured related issues and hid the narrative reviewers needed to make decisions confidently and quickly. I led the redesign of the review tool using a research-driven approach: discovery interviews and observation, prototype creation, and usability testing that compared the old and new tools to validate our solution. I designed the new tool to organize updates by member, grouping families and their related updates together. Anecdotally, the operations team reported faster file turnaround times, fewer questions for their team leads, and easier onboarding for new Ops members.

My role

Design lead

Interaction design, research, visual design, prototyping, usability testing

Type

Web app

Context

Noyo is an insure-tech platform that manages employee enrollment data between benefits-administration companies and insurance companies, primarily through API solutions but also through non-API ones.

The Noyo operations team needs to manually review files that are sent in (almost daily) by benefits-administration companies that are not connected via API to Noyo.

The operations team reviews the updates to check for potential issues (such as duplicate enrollments or accidental removals) and needs to make decisions quickly and with confidence.

The operations team reported problems with the existing application: it was slow to use and had a steep learning curve.

The tool, before the redesign

Discovery & Observation

Key questions

How could we reduce the time needed for a review?

Why was the learning curve so steep? Why did newer reviewers struggle with the tool?

What types of decisions can be automated? Which can't?

What things about the tool really bugged the reviewers?

I created a plan to interview and observe members of the Ops team.

I interviewed each team member about their experiences with the tool, then asked them to review three to four files while I observed.

After all sessions were done, I synthesized my findings to present and discuss with my team.

Observation findings: Problems to solve

Problem 1: Missing narrative

Organized by "type" of update, which obscured the narrative of related updates and made it difficult for reviewers to confidently and quickly make decisions. I also grouped updates according to a hierarchy - updates that could only be approved within another update (such as adding enrollments for a new member) were grouped underneath that top update.

Problem 2: Not enough context

Lacked important context, such as existing enrollments, so reviewers had to keep multiple apps open to find the information they needed to make a decision.

Problem 3: Progress lost

Did not save progress, so reviewers who lost their place had to start over from the beginning.

Problem 4: Not enough guidance in-app

New reviewers frequently pinged their managers with questions about rules, how to make decisions, and other related topics.

Problem 5: Third-party app

Being built in a third-party app meant that our engineers could not easily maintain or update the tool, customization was limited, and it struggled with large file sizes.

Usability testing

I met with users individually and asked them to review a file in our new tool and the same file in the existing tool (alternating which tool came first across sessions) while I observed.

I also asked them questions to get a sense of how they felt using the new tool.

We opted to test using the actual tool, instead of clickable Figma wireframes, because it allowed us to quickly test real files and compare the experience directly with the existing tool.

Key questions

Did organizing the updates by member create a clear narrative? Did that narrative make reviewing the file less complex?

Were we missing any functionality?

How easy was it to use for newer Ops members?

Which tool made file reviews faster?

Were we displaying enough contextual information?

Could users track their progress in this tool as intended?

Was the information more scannable?

Were there any friction points in this UI?

Usability testing results

Problem 1: Missing narrative — Solved

Organizing by member & family created a narrative that allowed reviewers to easily scan and make decisions about updates.

Problem 2: Not enough context — Solved

Adding more contextual information meant that reviewers did not have to keep multiple apps open to find the existing information they needed to make decisions. Participants loved seeing the side-by-side of "current" and "new" enrollment data.
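
As an illustration of how a side-by-side view like that might work under the hood, this small sketch computes which fields differ between a member's current record and the proposed one so the UI can highlight them. The flat record shape is an assumption for the example:

```typescript
type EnrollmentRecord = Record<string, string | number | null>;

interface FieldDiff {
  field: string;
  current: string | number | null;
  proposed: string | number | null;
}

// Collect every field whose value differs between the current and
// proposed records, including fields present in only one of them.
function diffEnrollment(
  current: EnrollmentRecord,
  proposed: EnrollmentRecord
): FieldDiff[] {
  const fields = new Set([...Object.keys(current), ...Object.keys(proposed)]);
  const diffs: FieldDiff[] = [];
  for (const field of fields) {
    const before = current[field] ?? null;
    const after = proposed[field] ?? null;
    if (before !== after) {
      diffs.push({ field, current: before, proposed: after });
    }
  }
  return diffs;
}
```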

Problem 3: Progress lost — Solved

Our tool saved their progress, which meant no one had to restart their review if they lost connection or lost track of their place.
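
A rough sketch of the kind of progress saving this implies: persist each decision as it's made, keyed by file, so a review can resume after a disconnect. The localStorage backend and all names here are placeholders; a real tool would presumably persist to a server.

```typescript
type Decision = "approved" | "rejected" | "flagged";

interface ReviewProgress {
  fileId: string;
  decisions: Record<string, Decision>; // update id -> decision
  lastViewedUpdateId?: string;
}

function loadProgress(fileId: string): ReviewProgress {
  const raw = localStorage.getItem(`review:${fileId}`);
  return raw ? (JSON.parse(raw) as ReviewProgress) : { fileId, decisions: {} };
}

// Persist immediately after every decision so nothing is lost
// if the reviewer disconnects or closes the tab.
function recordDecision(fileId: string, updateId: string, decision: Decision): void {
  const progress = loadProgress(fileId);
  progress.decisions[updateId] = decision;
  progress.lastViewedUpdateId = updateId;
  localStorage.setItem(`review:${fileId}`, JSON.stringify(progress));
}
```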

Problem 4: Not enough guidance in-app — Solved

We added tips and hints throughout the UI to answer common questions and provide guidance about how to make decisions.

Problem 5: Third-party app — Solved

Because the tool was built and owned by our own engineers, they could maintain and customize it easily, and it handled large files without issue.

Finding: Bulk actions were needed

Reviewers were making decisions faster, but were slowed down by repetitive clicking. So, we added the ability to take bulk actions.
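
The logic behind a bulk action can stay very small: apply one decision to every selected update at once. A sketch, again with hypothetical names:

```typescript
type Decision = "approved" | "rejected" | "flagged";

// Apply a single decision to every selected update, returning a new
// decision map so UI state updates immutably.
function applyBulkDecision(
  decisions: Record<string, Decision>,
  selectedUpdateIds: string[],
  decision: Decision
): Record<string, Decision> {
  const next = { ...decisions };
  for (const id of selectedUpdateIds) {
    next[id] = decision;
  }
  return next;
}
```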

Solution

A custom tool that uses member narratives to guide the user through file reviews faster and with less complexity.

An example of an enrollment update: dental and vision coverage being created for a member who already has dental and vision.

An example of a demographic update alongside updates to existing enrollments.

An example of the tips we added to the UI to help guide reviewers: an alert about a potentially invalid SSN.
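
For a sense of what a hint like that might check, here is a sketch of a basic SSN plausibility test built on the SSA's structural rules (no 000, 666, or 900-999 area number; no 00 group; no 0000 serial). The function name is hypothetical, and the real tool's exact rules are an assumption:

```typescript
// Flag SSNs that violate the SSA's structural constraints.
function ssnLooksInvalid(ssn: string): boolean {
  const digits = ssn.replace(/-/g, "");
  if (!/^\d{9}$/.test(digits)) return true; // must be exactly nine digits

  const area = Number(digits.slice(0, 3));
  const group = digits.slice(3, 5);
  const serial = digits.slice(5);

  if (area === 0 || area === 666 || area >= 900) return true;
  if (group === "00") return true;
  if (serial === "0000") return true;
  return false;
}
```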

Impact

After switching over to our new tool, the operations team anecdotally reported:

  • Faster turnaround times.

  • Fewer questions for their team leads.

  • Less frustration.

  • A gentler learning curve for new reviewers.

My takeaways

Be clear and focused about what you want feedback on, especially with leadership.

Early in the process, I held some design reviews with leadership without a focused purpose, so we got slowed down by minor UI discussions at a time when we should have been discussing higher-level things: the problem itself, how we wanted the tool to affect our business goals, and what our desired success outcomes for this new tool would be.

Book a call

Let's chat to see if I'm a good fit for your team or project!
