The initial problem was the number of steps a user had to take in order to complete a retinal screening.
[Image: the user workflow showing each step required for a retinal screening]
An existing solution, the AutoUploader, aimed to simplify the highlighted steps, but users weren't happy with it.
[GIF: the existing AutoUploader in use]
Semi-structured interviews
to gather user perspectives, experiences, and expectations to identify potential areas of growth
Contextual inquiry
to gain insight on real-world usage and observe users in their natural environments
Heuristic evaluation
to assess the existing solution against recognized usability principles to gain an objective, structured perspective
Moderated usability testing
to gather user feedback and assess their ability to perform intended tasks with the product
Using Google Meet, I met with several users from each of the key user groups shown above (n=11). Here's what the interviews revealed:
Training staff are overwhelmed with training and support asks
Staff report that the program is difficult to use, so much of their workload ends up centered on troubleshooting for clinics.
Management has to deal with unhappy clients
Many of their partnered clinics have reported that the program doesn't work and has unclear workflows.
Retinal photographers struggle to remember how to use the product after being trained
Photographers, the main users of this program, have trouble remembering the workflows that they were trained on and feel that they need to constantly reach out to EyePACS for help.
What users say isn't always what users do, so I considered contextual inquiry an important step in this project's research process. My goal was to understand how my key users were interacting with the AutoUploader in its real context and in what ways it was not meeting expectations. My two key user groups of focus were EyePACS training staff and retinal photographers at partnered clinics (n=6). The contextual inquiry revealed:
Lots of stalling between steps
There were no clear calls to action (CTAs) for each step in the AutoUploader, leaving users confused about what they were supposed to do next.
No support available
When they were stuck on a step, there was no easy way to troubleshoot or reach out to support to get help. Their only option was to email EyePACS support and wait for a response.
Not enough feedback
Users often asked, "Did it work?" after completing certain steps; there wasn't enough feedback for them to know whether they were progressing correctly.
After gaining insights from the interviews and contextual inquiries, I conducted a heuristic evaluation, assessing the existing platform against standard usability principles to understand the product independently of individual users' reports. This helped further inform my design requirements and constraints.
At this point, I had gathered insights on the problem in three different ways: (1) semi-structured interviews, (2) contextual inquiry, and (3) a heuristic evaluation. Below are my takeaways and how I planned to act on these findings:
After discussing with EyePACS management and stakeholders, we agreed that it was best to completely re-design the product instead of simply fixing the existing solution. This meant that I would need to outline new user flows. Given the user flows, the management team wanted it to be designed as a dashboard.
I gathered feedback from 7 users, focusing on what they liked, what they disliked, and what they wanted to see. Most of the feedback was directed at the homepage, the patient list page, and the overall workflow.
As part of the usability testing, I asked my users (n=5, consisting of EyePACS training staff and retinal photographers) to complete the following tasks while speaking their thought process out loud. The following are the tasks and their success rates:
Upload a patient case
3 out of 5 users were able to complete this task (60% success rate)
View the patient list and edit a patient's information
5 out of 5 users were able to complete this task (100% success rate)
While task #2 had a 100% success rate, users left additional feedback for the final iteration:
Task #1 had a 60% success rate, and I found the following user pain point:
When presenting the mid-fidelity prototype to a senior stakeholder for feedback, we ran into a disagreement on one part of the design, and I had to defend my design decision. The question was whether the landing page for the dashboard should be the "Dashboard" (Home) tab or the "Upload" tab. I had intentionally designed it so that "Dashboard" would appear upon launch.
After further discussion, we agreed to minimize clicks for the user and set "Upload" as the landing page to prioritize efficient clinical workflows.
7 minutes
original time to screen a patient
5 minutes
new time to screen a patient
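Put together, that's about 2 minutes saved per patient: (7 - 5) / 7 ≈ 0.29, or roughly a 29% reduction in screening time.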
Defending your design decisions
When discussing designs with stakeholders, disagreements are bound to happen. In those cases, I strongly believe in at least explaining my stance first, even if it doesn't end the way I had originally planned. Voicing my thought process, hearing feedback, and learning from it is important for my growth as a designer.
Involving developers early on
One of my biggest regrets for this project was not involving the developer early on to discuss feature feasibility. It solidified the idea that design isn't possible solely through the hands of the designer; it's imperative that we work cross-functionally to ensure all moving parts are accounted for.
Designer responsibility
I discovered that many user pain points were linked to the product's functionality, which falls under the developer's responsibility. Although I can't directly influence that aspect, as a designer I can collaborate with the developer to incorporate all the necessary workflows into the design, aiming to prevent as many issues as possible from arising on my end.
Be ready for change
This project started with no goal other than to improve the existing product. Since the team was new to UX, they had a lot of suggestions once they had seen the low-fi and mid-fi prototypes. It was important that I stayed open to those suggestions and was willing to make the necessary changes to the original user flows to ensure that the product met business goals.