Noyo

Moving a third-party tool to our own platform—and improving it

Overview

Noyo’s operations team used a third-party tool to review incoming files for data updates. The tool was slow, painful to use, and difficult to maintain. We built our own tool that enabled the team to review files faster and more easily.

My role

Product designer

Interaction design, research, visual design, prototyping, usability testing

2023

Problem

The file-review tool our Ops team used was built in a third-party app; it was slow to use, difficult to maintain, and had a steep learning curve.

The Ops team audits enrollment data updates from benefits-administration customers for issues such as accidentally duplicated members, accidentally removed members, or incorrectly ingested SSNs.

Solution & Impact

After switching over to our new tool, the operations team anecdotally reported faster turnaround times, fewer questions for their team leads, and less frustration.

Our tool centered on the narrative of enrollments, essentially creating a story for each employee. We prioritized showing relevant existing data so that users would not have to search outside the tool for it.

Process

Discovery: Observing the Ops team

I created a plan to interview and observe members of the Ops team.

I outlined my questions and created a script around them, then recruited members of the Ops team to meet with me. I interviewed each user about their experience with the tool, then asked them to review 3-4 files while I observed.

I recorded each session so I could review it later and share it with my team.

After all sessions were done, I synthesized my findings to present and discuss with my team.

My questions

How could we reduce the time needed for a review?

Why was the learning curve so steep? Why did newer reviewers struggle with the tool?

What types of decisions can be automated? Which can't?

What aspects of the tool really bugged the reviewers?

My takeaways

Organizing updates by “type” obscured the relationships between them, hiding the narrative.

No way to track progress.

If a reviewer got distracted or lost their internet connection, they had to start all over again.

Not enough relevant data shown.

Users often had to hunt down a member’s current information to compare it with the incoming updates, frequently keeping multiple tabs open at once.

The data wasn’t scannable.

Users had to scroll horizontally through the small data tables (on the right) to find contextual information.

Large file sizes slowed down the tool.

The tool needed to be geared to human review.

Most decisions couldn't be automated yet, and some could never be automated.

The tool, before the redesign

Prototype validation

I created a testing plan to validate our prototype and identify what we needed to iterate on. I met with users individually and asked them to review a file in our tool, then the same file in the existing tool (in alternating order), while I observed. I also asked questions to get a sense of how they felt using the new tool, and I recorded each session.

Note: We opted to test our prototype by building out the application instead of using clickable Figma wireframes, because that allowed us to quickly test real files and compare the experience with the existing tool.

My research questions

  • Did organizing the updates by member create a clear narrative? Did that narrative make reviewing the file less complex?

  • Were we missing any functionality?

  • Which tool made file reviews faster?

  • How easy was it to use for newer Ops members?

  • Were we displaying enough contextual information?

  • Could users track their progress in this tool as intended?

  • Was the information more scannable?

  • Were there any friction points in this UI?

Key results from testing

Organizing updates by family created a narrative of the updates in a way the existing tool did not.

  • Participants could quickly scan them and tell us the story of the updates; they could not do that in the existing tool.

  • Because of this, they had a much easier time making decisions about which updates to accept and which ones to reject.

We were displaying the right amount of contextual information.

Many participants commented, unprompted, that they loved seeing the side-by-side comparisons of “current” and “new” enrollment data. They no longer had to hunt down that information in the old tool’s overcrowded tables, or in other apps when they couldn’t find it there.

Our tool was much faster for reviews that involved many deliberate decisions, and slower for reviews where users simply needed to “skip” most updates.

We realized we needed to add a bulk decision-making option to account for this.

Our tool felt faster, even when it wasn’t.

When asked which review was faster, most participants said the newer tool, even when that objectively wasn’t true.

Our tool made it easier for newer Ops team members to review files.

Because they could read the narrative for each set of updates, and because just the right amount of relevant existing enrollment data was shown, they were able to complete reviews with much more confidence and far fewer questions for their managers.

Participants were able to track their progress in the tool.

Screenshots of solution

This view shows updates for a member: adding dental and vision enrollments.

Note the current enrollments displayed underneath; this lets the user know that the new enrollments might be duplicates.

My takeaways

Be clear and focused about what you want feedback on, especially with leadership.

Early in the process, I held design reviews with leadership that lacked a focused purpose, so we got slowed down by minor UI discussions at a time when we should have been discussing higher-level questions: the problem itself, how we wanted the tool to affect our business goals, and what our desired success outcomes for the new tool would be.

In a tool like this, where users need to open every row to make their decisions, maybe there’s a different way to display the information.

We ended up defaulting the tool to “open all rows”, because that cut down on the number of clicks users needed to make. But if I had to do this project over again, I would probably opt for a UI that didn’t use collapsible rows at all.